Re: [Wikidata-l] Suggestions for improvements of Wikidata

2015-04-13 Thread Cristian Consonni
2015-04-13 14:54 GMT+02:00 Cristian Consonni :
> With this premise, I think that Romaine's proposal for a game is
> absolutely doable and a good idea.

To clarify: I agree with Gerard that the best indication for "country"
for the "Battle of Stalingrad" is the USSR. I am simply saying that a
game should keep it simple (so in this case, either the system is able
to infer the USSR as a possibility to present to the user, or the user
should (be instructed to) say "not sure").

I am much less convinced about the "citizenship" violations. Even
though I believe that citizenship is a concept introduced with the
modern nation-state, for a variety of reasons it is nonetheless
applied to people who lived before that state (at least in its modern
form) was established.

For example, Galileo Galilei is reported as an error, but all the
biggest Wikipedias (and some others that I am able to read) state that
Galileo Galilei was Italian (the Catalan Wikipedia says that he was
Tuscan in the article, but categorizes him under "Físics italians"
(Italian physicists) and "Astrònoms italians" (Italian astronomers)).
On the other hand, the use of the name "Italia" to indicate at least a
portion of present-day Italy goes far back in history, with mentions
in documents from at least 42 BC (and this is probably the same for
most of Europe).

C

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Suggestions for improvements of Wikidata

2015-04-13 Thread Cristian Consonni
2015-04-13 18:46 GMT+02:00 Magnus Manske :
> So I present:
> https://tools.wmflabs.org/wikidata-todo/wrong_nationality.html

All links to Wikidata are missing the "/wiki/" part.

C



Re: [Wikidata-l] Suggestions for improvements of Wikidata

2015-04-13 Thread Cristian Consonni
2015-04-13 14:00 GMT+02:00 Gerard Meijssen :
> The point is very much that the battle WAS in the USSR. It is not "not
> applicable" it is one of the most important battles in the second world war.
> My point is that we should not forget this. The battle of Uhud was not in
> Saudi Arabia either...

OK, but I think that having a system that, for example, cross-checks
dates and presents "USSR" as a possibility would be much more
complicated to build.

I think that the Wikidata Game (or a similar game-like system) cannot
address every complicated scenario, and thus there will always be some
cases that should be handled by editing Wikidata directly.

I was following Magnus here, in the post where he introduces the
Wikidata Game[1]:
«So what’s the approach here? I feel the crucial issue for
gamification is breaking complicated processes down into simple
actions, which themselves are just manifest decisions – “A”, “B”, or
“I don’t want to decide this now!”.

[...]

Of course, this simplification misses a lot of “fine-tuning” – what if
you are asked to decide the gender of an item that has been
accidentally tagged as “person”? What if the gender of this person is
something other than “male” or “female”? Handling all these special
cases would, of course, be possible – but it would destroy the
simplicity of the three-button interface. The games always leave you a
“way out” – when in doubt, skip the decision. Someone else will take
care of it, eventually, probably on Wikidata proper.»
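The three-option model quoted above can be sketched as a tiny dispatcher. This is a hypothetical illustration, not the Wikidata Game's actual code; the task and claim shapes are made up:

```python
# Sketch of the "A / B / skip" decision model quoted above.
# The task/claim dictionaries are hypothetical, not the game's real API.

def decide(task, choice):
    """Resolve one game task into an action."""
    if choice == "A":
        return {"action": "add_claim", "value": task["option_a"]}
    if choice == "B":
        return {"action": "add_claim", "value": task["option_b"]}
    # Anything else means "I don't want to decide this now!" -
    # leave the case for manual handling on Wikidata proper.
    return {"action": "skip", "value": None}

task = {"item": "Q42", "option_a": "male", "option_b": "female"}
print(decide(task, "A"))
print(decide(task, "maybe"))
```

The point of the sketch is that every special case falls through to the skip branch, which is what keeps the interface at three buttons.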

With this premise, I think that Romaine's proposal for a game is
absolutely doable and a good idea.

C


[1] http://magnusmanske.de/wordpress/?p=203



Re: [Wikidata-l] Suggestions for improvements of Wikidata

2015-04-13 Thread Cristian Consonni
2015-04-09 8:29 GMT+02:00 Gerard Meijssen :
> Because the battle of Stalingrad as a battle was not fought by modern day
> Russia, it was fought by the USSR and Nazi Germany. Associating the battle
> of Stalingrad with modern day Russia is wrong on so many levels. At the time
> it was Stalingrad, hence the name. It will never be the battle of Wolgograd.

I believe that you should have a "Not applicable" button to click for
these cases.

C



Re: [Wikidata-l] Preliminary SPARQL endpoint for Wikidata

2015-04-08 Thread Cristian Consonni
2015-03-23 16:37 GMT+01:00 Markus Krötzsch :
> Brilliant, we should set up a page with a list of SPARQL endpoits for
> Wikidata! For production usage, it is great to have a variety to chose from.

strong +1

Also, would you mind if the examples you shared on this list were
reused in other projects? I specifically have in mind using them to
provide example queries on Wikidata-LDF[1].

Thank you,

Cristian

[1] http://wikidataldf.com/



Re: [Wikidata-l] Kian: The first neural network to serve Wikidata

2015-03-14 Thread Cristian Consonni
2015-03-08 12:56 GMT+01:00 Ricordisamoa :
> Sounds promising! It'd be good to have the code publicly viewable.


+1

Cristian



Re: [Wikidata-l] OpenStreetMap + Wikidata

2015-03-10 Thread Cristian Consonni
Hi Amir,

2015-03-10 14:31 GMT+01:00 Amir E. Aharoni :
> [ Aude and Christian Consonni, this should especially interest you. ]

:-)

Luca already posted the link that summarizes my idea about Wikidata and OSM.

2015-03-10 17:38 GMT+01:00 Luca Martinelli :
> https://meta.wikimedia.org/wiki/Grants:IdeaLab/OSMdata:_a_Wikidata-like_editor_for_OpenStreetMap

I just want to add a little bit of context: this proposal stemmed
from what I saw as one of the major advantages of Wikidata over, say,
DBpedia, i.e. its integration with the other Wikimedia projects
through MediaWiki.

I remember that when Wikidata was launched in Washington, one key
element behind the idea of using MediaWiki to build a data repository
- which looked a little crazy to me back then (perhaps more than a
little), and in part still does - was that MediaWiki would provide
users with the same environment that they were (and are) used to when
editing the projects. Even keeping the same structure of the site was
considered a plus.

In the same sense I wondered: "what would happen if we did the same
thing with OSM?" Would this facilitate the integration of OSM data and
Wikipedia data in the same way that Wikidata is facilitating the
integration of data among all the Wikimedia projects?

I don't know if I have misunderstood or overestimated this "unified
environment" factor, or if this idea was just born out of a period
when, at Wikimedia Italia, we were in a "Let's use Wikibase
everywhere!" mood (we have a project where we are using Wikibase
outside of Wikidata: the EAGLE project[1]. It is going very well and
we are very happy about it).

The next idea was that it should be synchronized with OSM, and then I
realized that the net effect would be that this system would provide a
new editor for OSM data, perhaps more specialized and focused in
particular on tags.

And that's basically it.

C
[1] http://www.eagle-network.eu/wiki/index.php/Main_Page



Re: [Wikidata-l] Mapillary property

2015-03-02 Thread Cristian Consonni
Hi Jo,

2015-03-02 6:32 GMT+01:00 Jo :
> I tried to add a Mapillary picture as an image. Didn't work. Since the
> licenses agree I could copy the image to Commons, but that is too much
> effort and one loses the ability to scroll through the rest of the sequence.
>
> Besides, when I'd upload pictures of artwork, they would be removed shortly
> thereafter due to lack of FOP in Belgium. So Commons is useless for most of
> my purposes (1000s of mapping pictures and specific pictures)
>
> So, can we have a dedicated Mapillary property? I also want to use it as a
> source to prove that Q19368861 and Q19368857 are buried at Q2744459.

You can propose a new property here:
https://www.wikidata.org/wiki/Wikidata:Property_proposal

(btw, I am also a user of Mapillary)

Ciao,

Cristian



Re: [Wikidata-l] Wikidata meetup at 31C3

2014-12-28 Thread Cristian Consonni
On 28 Dec 2014 at 19:55, "Lydia Pintscher" wrote:
>
> Hey folks :)
>
> If you're at 31C3 in Hamburg at the moment: There will be a Wikidata
> meetup tomorrow. Say hi to some other cool Wikidata folks.
>
http://events.ccc.de/congress/2014/wiki/Session:Wikidata:_The_free_knowledge_base
>
>
> Cheers
> Lydia, who would love to be there now

(OT: +1, Lydia)

C


[Wikidata-l] Public datasets available via Amazon

2014-11-28 Thread Cristian Consonni
Hi,

I have just noticed that at this address:
https://aws.amazon.com/datasets

several public datasets are available (including Wikipedia's traffic stats).

Maybe you can find something stimulating in there.

Cristian



[Wikidata-l] Wikidata wins the ODI Open Data Award

2014-11-04 Thread Cristian Consonni
Saw this now on Twitter:

Wikidata wins the #opendata award in the publisher category of @UKODI
#ODISummit #Wikipedia /cc @CristianCantoro http://t.co/1ejEUUnP7j
(https://twitter.com/napo/status/529721791326208000?s=03)

Yay!
C


Re: [Wikidata-l] Another birthday gift: SPARQL queries in the browser over Wikidata RDF dumps using Linked Data Fragments

2014-11-04 Thread Cristian Consonni
Hi Markus,

2014-11-01 0:29 GMT+01:00 Markus Krötzsch :
> Nice. We are running the RDF generation on a shared cloud environment and I
> am not sure we can really use a lot of RAM there. Do you have any guess how
> much RAM you needed to get this done?

I didn't take any stats (my bad), but I would say that for the
combined dump, starting from the compressed (gz) file, it took around
50 GB. I don't have time to re-run this experiment now, but next time
I will take some measurements.
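For what it's worth, one way to take such a measurement next time, using only the Python standard library: the `resource` module reports the peak resident set size of the current process (a sketch; it would have to run inside, or wrap, the conversion process itself):

```python
import resource
import sys

def peak_rss_kb():
    """Peak resident set size of this process, in kilobytes.

    ru_maxrss is reported in kilobytes on Linux but in bytes on macOS,
    hence the platform check.
    """
    raw = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    return raw // 1024 if sys.platform == "darwin" else raw

# ... run the conversion here, then:
print("peak RSS so far: ~%d kB" % peak_rss_kb())
```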

Cristian



Re: [Wikidata-l] Another birthday gift: SPARQL queries in the browser over Wikidata RDF dumps using Linked Data Fragments

2014-10-31 Thread Cristian Consonni
2014-10-30 22:40 GMT+01:00 Cristian Consonni :
> Ok, now I have managed to add the Wikidata statements dump too.

And I have added a combined wikidata.hdt dump of all of the above.

2014-10-31 10:25 GMT+01:00 Ruben Verborgh :
> Maybe some nuance: creating HDT exports is not *that* hard.
>
> First, on a technical level, it's simply:
> rdf2hdt -f turtle triples.ttl triples.hdt
> so that's not really difficult ;-)

Yes, I agree.
I mean, I am not an expert in the field - this should be clear by now
:P - and I was able to do it.
(By "not an expert in the field" I mean that I had never heard of HDT
or LDF before six days ago.)

It should be noted that while converting the statements and terms
dumps I got some "Unicode range" errors, which result in ignored
triples (i.e. triples not inserted in the HDT files). I am unable to
say whether this is a problem with the dumps or with hdt-lib.

C



Re: [Wikidata-l] Another birthday gift: SPARQL queries in the browser over Wikidata RDF dumps using Linked Data Fragments

2014-10-30 Thread Cristian Consonni
2014-10-30 19:41 GMT+01:00 Cristian Consonni :
> 2014-10-30 18:05 GMT+01:00 Markus Krötzsch :
>> Awesome :-) Small note: I just got a "Bad Gateway" when trying
>> http://data.wikidataldf.com/ but it now seems to work.
>
> I was restarting the server; in fact, I have now also uploaded the
> wikidata-terms dump.
> (the wikidata-statements file is not cooperating though :( )

Ok, now I have managed to add the Wikidata statements dump too.

If somebody would like to add some example SPARQL queries, that would be awesome.

>> It also seems that some of your post answers the question from my previous
>> email. That sounds as if it is pretty hard to create HDT exports (not much
>> surprise there). Maybe it would be nice to at least reuse the work: could we
>> re-publish your HDT dumps after you created them?
>
> yes, sure, here they are:
> http://wikidataldf.com/download/

I should add: yes, it is pretty hard to create the HDT files, since
the process requires an awful lot of RAM, and I don't know whether I
will be able to produce them in the future.

C



Re: [Wikidata-l] Wikidata RDF

2014-10-30 Thread Cristian Consonni
2014-10-30 17:34 GMT+01:00 Markus Krötzsch :
> On 30.10.2014 11:49, Cristian Consonni wrote:
>>
>> 2014-10-29 22:59 GMT+01:00 Lydia Pintscher :
>>>
>>> Help with this would be awesome and totally welcome. The tracking bug
>>> is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143
>>
>>
>> Speaking of totally awesome (aehm :D):
>> * see: http://wikidataldf.com
>> * see this other thread:
>> https://lists.wikimedia.org/pipermail/wikidata-l/2014-October/004920.html
>>
>> (If I can ask, having the RDF dumps in HDT format [again, see the
>> other thread] would be really helpful)
>
>
> We are using OpenRDF. Can it do HDT? If yes, this would be easy to do. If
> no, it would be easier to use a standalone tool to transform our dumps. We
> could still do this. Do you have any recommendation what we could use there
> (i.e., a memory-efficient command-line conversion script for N3 -> HDT)?

It seems that OpenRDF does not support HDT creation (see [1]).
I have been using the rdf2hdt tool, obtained by compiling the devel
branch of the hdt-cpp library[2], which is developed by the group
proposing the standard implementation to the W3C.

C

[1] https://openrdf.atlassian.net/browse/SES-1874
[2] https://github.com/rdfhdt/hdt-cpp/tree/devel
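For reference, the conversion step can be scripted as a thin wrapper around the rdf2hdt binary. This is a sketch: it assumes rdf2hdt from hdt-cpp is built and on PATH, and that the `-f` flag names the input serialization, as in Ruben's earlier `rdf2hdt -f turtle` example:

```python
import shlex

def rdf2hdt_cmd(src, dst, input_format="ntriples"):
    """Build the rdf2hdt command line for an RDF -> HDT conversion.

    Assumes the flag usage shown on the list (-f <input format>);
    check `rdf2hdt --help` for the version you compiled.
    """
    return ["rdf2hdt", "-f", input_format, src, dst]

cmd = rdf2hdt_cmd("wikidata-sitelinks.nt", "wikidata-sitelinks.hdt")
print(shlex.join(cmd))
# To actually run it (long-running and memory-hungry for full dumps):
# subprocess.run(cmd, check=True)
```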



Re: [Wikidata-l] Another birthday gift: SPARQL queries in the browser over Wikidata RDF dumps using Linked Data Fragments

2014-10-30 Thread Cristian Consonni
2014-10-30 18:05 GMT+01:00 Markus Krötzsch :
> Hi Christian,
>
> Awesome :-) Small note: I just got a "Bad Gateway" when trying
> http://data.wikidataldf.com/ but it now seems to work.

I was restarting the server; in fact, I have now also uploaded the
wikidata-terms dump.
(The wikidata-statements file is not cooperating though :( )

> It also seems that some of your post answers the question from my previous
> email. That sounds as if it is pretty hard to create HDT exports (not much
> surprise there). Maybe it would be nice to at least reuse the work: could we
> re-publish your HDT dumps after you created them?

yes, sure, here they are:
http://wikidataldf.com/download/

C



Re: [Wikidata-l] Wikidata RDF

2014-10-30 Thread Cristian Consonni
2014-10-29 22:59 GMT+01:00 Lydia Pintscher :
> Help with this would be awesome and totally welcome. The tracking bug
> is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143

Speaking of totally awesome (aehm :D):
* see: http://wikidataldf.com
* see this other thread:
https://lists.wikimedia.org/pipermail/wikidata-l/2014-October/004920.html

(If I can ask, having the RDF dumps in HDT format [again, see the
other thread] would be really helpful)

C



[Wikidata-l] Another birthday gift: SPARQL queries in the browser over Wikidata RDF dumps using Linked Data Fragments

2014-10-30 Thread Cristian Consonni
Dear all,

I wanted to join in and give my birthday present to Wikidata (I am a
little bit late, though!)
(Also, honestly, I didn't remember it was Wikidata's birthday, but it
is a nice occasion :P)

Here it is:
http://wikidataldf.com

What is LDF?
LDF stands for Linked Data Fragments, a new approach to querying RDF
datasets that sits midway between offering a SPARQL endpoint and
making the whole dump available for download.

More formally LDF is «a publishing method [for RDF datasets] that
allows efficient offloading of query execution from servers to clients
through a lightweight partitioning strategy. It enables servers to
maintain availability rates as high as any regular HTTP server,
allowing querying to scale reliably to much larger numbers of
clients»[1].

This system was devised by Ruben Verborgh, Miel Vander Sande and
Pieter Colpaert at Multimedia Lab (Ghent University) in Ghent,
Belgium. You can read more about it at http://linkeddatafragments.org/
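A Triple Pattern Fragments server is queried one triple pattern at a time over plain HTTP. A sketch of how a client builds such a fragment request (the `subject`/`predicate`/`object` query-parameter names follow the common TPF server convention; the endpoint URL is illustrative):

```python
from urllib.parse import urlencode

def fragment_url(endpoint, subject=None, predicate=None, obj=None):
    """Build a Triple Pattern Fragments request URL for one pattern.

    Unbound positions are simply omitted; the server returns the
    matching triples page by page, plus hypermedia controls.
    """
    params = {"subject": subject, "predicate": predicate, "object": obj}
    query = urlencode({k: v for k, v in params.items() if v})
    return f"{endpoint}?{query}" if query else endpoint

url = fragment_url(
    "http://data.wikidataldf.com/wikidata",
    predicate="http://www.w3.org/2000/01/rdf-schema#label",
)
print(url)
```

A SPARQL client then decomposes a full query into many such single-pattern requests, which is why the server load stays low.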

What is Wikidata LDF?
Using the software by Verborgh et al. I have set up the website
http://wikidataldf.com which contains:
* an interface to browse the RDF data and query it using the Triple
Pattern Fragments client
* a web client where you can compose and execute SPARQL queries

This is not, strictly speaking, a SPARQL endpoint (not all of the
SPARQL standard is implemented and it is slower, but it should be more
reliable; if you are interested in the details, please do read more at
the link above).

The data are, for the moment, limited to the sitelinks dump, but I am
working on adding the other dumps. I have taken the Wikidata RDF dumps
as of October 13th, 2014[2].

To use them I had to convert them to HDT format[3a][3b], using the
hdt-cpp library[3c] (devel branch), which is taking quite a lot of
resources and computing time for the whole dumps; that's why I haven't
published the rest yet ^_^.

DBpedia also has this[4]:
http://fragments.dbpedia.org/

All the software used is available under the MIT license in the LDF
repos on GitHub[5a], and the (two-page) website is also available
here[5b].

I would like to thank Ruben for his feedback and his presentation
about LDF at SpazioDati in Trento, Italy (here are the slides[6]).

All this said, happy birthday Wikidata.

Cristian

[1] http://linkeddatafragments.org/publications/ldow2014.pdf
[2] https://tools.wmflabs.org/wikidata-exports/rdf/exports/
[3a] http://www.rdfhdt.org/
[3b] http://www.w3.org/Submission/HDT-Implementation/
[3c] https://github.com/rdfhdt/hdt-cpp
[4] http://sourceforge.net/p/dbpedia/mailman/message/32982329/
[5a] see the Browser.js, Server.js and Client.js repos in
https://github.com/LinkedDataFragments
[5b] https://github.com/CristianCantoro/wikidataldf
[6] http://www.slideshare.net/RubenVerborgh/querying-datasets-on-the-web-with-high-availability



[Wikidata-l] Content negotiation for Wikidata entity export in RDF/N-Triple

2014-10-10 Thread Cristian Consonni
Hi all,

I have pointed out to a list of Italian developers/people interested
in (Linked) Open Data (known as Spaghetti Open Data[*]) that Wikidata
items can be exported as RDF/N-Triples:
https://www.wikidata.org/wiki/Special:EntityData/Q1.rdf
https://www.wikidata.org/wiki/Special:EntityData/Q1.nt

I have been asked if content negotiation is supported.

I have no idea of what to answer ^_^.
Any help is appreciated :)
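One way to find out empirically: content negotiation boils down to sending an `Accept` header on the extension-less URL and checking what `Content-Type` comes back. A sketch that only builds the request (actually answering the question requires sending it over the network, which I cannot vouch for here):

```python
from urllib.request import Request

def entitydata_request(qid, accept="text/turtle"):
    """Build a request for a Wikidata entity that asks for an RDF
    syntax via the Accept header instead of a file-extension suffix.

    The Accept media type is whatever you want to test; whether the
    server honours it is exactly the open question.
    """
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{qid}"
    return Request(url, headers={"Accept": accept})

req = entitydata_request("Q1", accept="application/rdf+xml")
print(req.full_url)
print(req.get_header("Accept"))
# urlopen(req).headers["Content-Type"] would then reveal whether the
# server negotiated the requested format.
```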

C
[*] the list: https://groups.google.com/forum/#!forum/spaghettiopendata
and the website: http://www.spaghettiopendata.org/



[Wikidata-l] Fwd: [OSM-talk] Adding Wikidata tags to 70k items automatically

2014-08-27 Thread Cristian Consonni
This may be of interest here.

Cristian


-- Forwarded message --
From: Edward Betts 
Date: 2014-08-27 18:47 GMT+02:00
Subject: [OSM-talk] Adding Wikidata tags to 70k items automatically
To: t...@openstreetmap.org


I've written some code to match items in Wikidata with items in OSM. Currently
I have found 70,849 unique matches, where there is a one-to-one mapping
between OSM and Wikidata objects.

I'd like to annotate these 70k objects in OSM with a Wikidata tag
automatically.

For example:

Way: Piper's Orchard (43246411)
http://www.openstreetmap.org/way/43246411

And on Wikidata: https://www.wikidata.org/wiki/Q7197307

I would like to add wikidata=Q7197307 to "Piper's Orchard".

The code to find the matches is here:

https://github.com/edwardbetts/osm-wikidata

Matching criteria:

https://github.com/EdwardBetts/osm-wikidata/blob/master/entity_types.json

The results are here:

http://edwardbetts.com/osm-wikidata/

The best approach is probably to update 100 items with wikidata tags, then
we can check them to make sure the edit looks good. If everything is fine I
can go ahead and load the other 70k.
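The one-to-one restriction can be sketched like this (hypothetical data shapes; Edward's actual matching code is in the linked repository):

```python
from collections import Counter

def one_to_one(pairs):
    """Keep only (osm_id, wikidata_id) pairs where both ids appear
    exactly once across all candidate matches, i.e. the mapping
    between OSM and Wikidata objects is one-to-one."""
    osm_counts = Counter(o for o, _ in pairs)
    wd_counts = Counter(w for _, w in pairs)
    return [(o, w) for o, w in pairs
            if osm_counts[o] == 1 and wd_counts[w] == 1]

candidates = [
    ("way/43246411", "Q7197307"),          # unique on both sides: kept
    ("way/1", "Q100"), ("way/2", "Q100"),  # ambiguous Wikidata side: dropped
]
print(one_to_one(candidates))
```

Ambiguous candidates are simply dropped rather than guessed at, which is what makes the automatic edit safe.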

Does anybody have a strong preference that the edits are split up by region,
or loaded in batches?

Any objections?

I've read https://wiki.openstreetmap.org/wiki/Mechanical_Edit_Policy - if
there are no major objections I'll go ahead and create
https://wiki.openstreetmap.org/wiki/Mechanical_Edits/edward

See also:

http://wiki.openstreetmap.org/wiki/Proposed_features/Wikidata
http://wiki.openstreetmap.org/wiki/Wikidata
http://wiki.openstreetmap.org/wiki/Key:wikidata

Edward.

___
talk mailing list
t...@openstreetmap.org
https://lists.openstreetmap.org/listinfo/talk



Re: [Wikidata-l] [Wikitech-l] Request for comments: How to deal with open datasets?

2014-05-15 Thread Cristian Consonni
2014-05-15 11:25 GMT+02:00 David Cuenca :
> During the Zürich Hackathon I met several people that looked for solutions
> about how to integrate external open datasets into our projects (mainly
> Wikipedia, Wikidata). Since Wikidata is not the right tool to manage them
> (reasons explained in the RFC as discussed during the Wikidata session), I
> have felt convenient to centralize the discussion about potential
> requirements, needs, and how to approach this new changing landscape that
> didn't exist a few years ago.
>
> You will find more details here
> https://meta.wikimedia.org/wiki/Requests_for_comment/How_to_deal_with_open_datasets
>
> Your comments, thoughts and ideas are appreciated!

Thanks for the pointer. "How can I put this open data on Wikidata?" is
a question that I have been asked many times; this page was needed.

Ciao,

C



Re: [Wikidata-l] A proposal for a panel that might interest you

2014-03-30 Thread Cristian Consonni
2014-03-30 12:43 GMT+02:00 Luca Martinelli :
>> Gerard, if this is the first project outside of wmf concerning wikidata,
>> what is the first project including wmf?
>
> Well, actually we are talking about Wikibase, which is the extension that
> Wikidata uses, not Wikidata itself. This project is allegedly the first
> non-WMF to use Wikibase, since the first (and only) WMF project is Wikidata.
>
> Then again, we all know Wikibase stemmed from a WM-DE idea... :)

Adam Shorland, from the Wikidata development team, gave a presentation
at the "Spaghetti Open Data" conference in Bologna, Italy on Friday
(his participation was sponsored by WM-IT), and we had this funny
moment:
* Question from the audience: "Are there examples of Wikibase being
used outside Wikidata?"
* Adam: "Not to my knowledge."
* Me: "Actually yes, EAGLE..."

We (as "Wikimedia Italia") did not advertise the project very much :)

C



Re: [Wikidata-l] [cultural-partners] Wikidata and GLAM

2014-03-19 Thread Cristian Consonni
On 20 Mar 2014 at 01:01, "Sarah Stierch" wrote:
> Cool idea on the accession number.
>
> I'd like to get a simple (cough, "dummies guide") on how to use Wikidata
in general (it's gone through epic changes since I first fiddled with it
upon launch), and perhaps a workshop on how we can use this for GLAM
professionals - meaning I'd like to see a chance for us to be presenting
at conferences on how to use this tool.
>
> Thanks Gerard for taking up this much needed task!

Sarah has already covered my ideas. I would only add: would it be
possible to have a video recording of your presentation, so that we
can share it with interested people and GLAMs?

Thank you.

C


Re: [Wikidata-l] date for Wikiquote getting language links via Wikidata

2014-03-07 Thread Cristian Consonni
2014-03-08 0:46 GMT+01:00 Luca Martinelli :
> Yes! This is an awesome news!!! :))) I have already updated the
> relative page on Wikidata. :)

You meant to say "related" there =P.

(Just trolling, Luca; you guys are awesome)

Ciao,

C



Re: [Wikidata-l] Wikisource is here!

2014-01-14 Thread Cristian Consonni
2014/1/14 Lydia Pintscher :
> Hey everyone :)
>
> We just enabled language links for Wikisource via Wikidata. Please
> welcome the next sister project in our round.

A [not really] small step for a wiki, a giant leap for libraries :P

/me is happy

Cristian



Re: [Wikidata-l] [Wikitech-l] Italian Wikipedia complements search results with Wikidata based functionality

2013-12-03 Thread Cristian Consonni
2013/12/3 Magnus Manske :
> I like it! Can we use the Wikidata color schema, according to the new,
> improved trademark policy (R) (TM)?

((... usual big IANAL disclaimer here ...))

I think so. On Wikimedia sites, "You may use and remix the Wikimedia
marks on the Wikimedia sites as you please"[1]; see also this FAQ[2],
which mentions tools.

Cristian

[1] https://meta.wikimedia.org/wiki/Trademark_policy#policy-onwmsites
[2] https://meta.wikimedia.org/wiki/Trademark_policy#FAQ-withoutpermission



Re: [Wikidata-l] [Wikitech-l] Italian Wikipedia complements search results with Wikidata based functionality

2013-12-03 Thread Cristian Consonni
Hi all,

I want to propose a logo for Reasonator (to replace the "R" now in use):
https://commons.wikimedia.org/wiki/File:Reasonator_logo_proposal.png

Also, I think that a more verbose mouseover description, such as "Show
the properties of this item", would be helpful.

Ciao,

Cristian



Re: [Wikidata-l] Wikidata-Freebase mappings published under CC0

2013-11-11 Thread Cristian Consonni
2013/11/11 Denny Vrandečić :
> as you know, I have recently started a new job. My mission to get more free
> data out to the world has not changed due to that, though.

=))
I am very happy to hear this.

Also, the mapping is awesome.

2013/11/11 Klein,Max :
> I regretted writing what I did after thinking about it over lunch, since it
> is not "Assume good faith" towards Google. Maybe one of the reasons that I
> was sensitive to it was because I'm representing VIAF in Wikidata, which is
> kind of the same as Freebase in Wikidata, and I wouldn't want people
> assuming bad faith about VIAF.
>
> Thanks for being clear and open about your work, its a real inspiration.
> With apologies,

Yours too, Max.

Thank you both for your very good work.

Cristian



Re: [Wikidata-l] Counting sitelinks - period

2013-09-24 Thread Cristian Consonni
2013/9/24 Luca Martinelli :
> 2013/9/24 Magnus Manske :
>> https://tools.wmflabs.org/magnustools/static_data/items_per_site.20130924.tab
>>
>> SQL query used:
>> select ips_site_id,count(*) from wb_items_per_site group by ips_site_id
>>
>> For a list of all items with these links, now that might be a little long to
>> put in an attachment...
>
> Well, this is a beginning. :) Thank you very much, I'm bothering you a
> lot in these days. :)

Quick question: the file says that 1,276,758 Wikidata items link to
it.wiki, but it.wiki has "only" 1,066,230 articles. So are there many
(~210k) Wikidata items pointing to non-article pages? Or maybe there
is some double counting (linking)?
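The ~210k figure is just the difference between the two counts:

```python
items_linking_itwiki = 1276758  # Wikidata items with an it.wiki sitelink
itwiki_articles = 1066230       # it.wiki article count at the time

# The surplus is the number of sitelinks that cannot correspond
# one-to-one to articles: non-article pages or double counting.
surplus = items_linking_itwiki - itwiki_articles
print(surplus)  # 210528
```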

C



Re: [Wikidata-l] Make Commons a wikidata client

2013-08-12 Thread Cristian Consonni
2013/8/11 Jane Darnell :
> Hmm, I am not quite sure how to see this. Places and people yes: It
> would be nice to have the geo coordinates on Wikidata and for the
> artist and writers

I am not sure I get what geocoordinates mean for people.

>  I also agree for the book and the artwork templates.
> But how could you possibly move all of the Commons copyright logic? As
> far as I know, it's really quite a small group of people who even
> understand how all that stuff works on Commons and can untangle those
> template categories and delete/keep workflows... if you open Wikidata
> to keeping the data on copyrighted materials, like books and artworks,
> is that metadata OK to move and manage there?

I think this was not the sense of Maarten's proposal.

Cristian



Re: [Wikidata-l] Make Commons a wikidata client

2013-08-10 Thread Cristian Consonni
2013/8/10 Maarten Dammers :
> Small change, lot's of benefits!

Strong +1

C



Re: [Wikidata-l] Hello from Adam!

2013-07-01 Thread Cristian Consonni
2013/7/1 Adam Shorland :
> Hi! I'm Adam. I have just started working at WMDE on Wikidata which is
> contributing towards my placement year at University in the UK.
>
> I will be around for at least the next 6 months which I am sure will be
> great! I'm going to be working on lots of bits and pieces which I hope will
> keep everyone happy including usability testing, bug fixing and triage,
> analysis of usage patterns, api stuff and communication (among others) !

Welcome Adam,

I'm sure you will really enjoy your work with the Wikidata team.

Cristian



Re: [Wikidata-l] Geoccordinates are live

2013-06-12 Thread Cristian Consonni
2013/6/12 Kolossos :
> Hey,
> the question is now how we can merge coordinates from all languages to
> Wikidata. I would propose to use the coordinate from the longest article to
> have a good chance for using the most accurate one. Thats the way I use in
> Wikipedia-World[1]. After an update we could also use this database for an
> import.

> Worst case would be that everyone use a bot and we would have a great
> bot-war.

I think it should be possible to just import them as data with
different sources.
If a coordinate pair is the same across multiple Wikipedias, then you
have more sources; see, for example, the property occupation:politician
here[1].
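The import-with-sources idea can be sketched like this (hypothetical data shapes; identical coordinate pairs across language editions collapse into one statement, and each source wiki becomes a reference on it):

```python
from collections import defaultdict

def merge_coordinates(per_wiki_coords):
    """Group identical (lat, lon) pairs taken from different wikis.

    Each distinct pair becomes one candidate statement; the wikis it
    came from become its "imported from" references.
    """
    merged = defaultdict(list)
    for wiki, coord in per_wiki_coords.items():
        merged[coord].append(wiki)
    return dict(merged)

coords = {
    "enwiki": (48.70, 44.52),
    "dewiki": (48.70, 44.52),
    "itwiki": (48.71, 44.51),
}
for coord, sources in merge_coordinates(coords).items():
    print(coord, "<-", sorted(sources))
```

A pair confirmed by several wikis simply ends up with more references, without any bot having to pick a "winner".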

Ciao,

Cristian

[1] http://www.wikidata.org/wiki/Q76
