Re: [Wikidata-l] Wikidata won another award! \o/ - Land der Ideen
Hey, Congratulations to everyone at Wikimedia Germany! Also congratulations to those who came up with the idea itself! :) Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Developer at Wikimedia Germany ~=[,,_,,]:3 ___ Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Re: [Wikidata-l] Empty references
Hey, Last week a change to ignore empty references in lists of references was made: https://github.com/wmde/WikibaseDataModel/pull/399 This means that no new empty references can be added, and that existing ones will no longer be visible in most places. The one exception I can think of is dumps, since they use JSON generated before the change was made. Making a single edit to an entity that has an empty reference will fix it there as well, though. Note that this change has not yet been deployed to Wikidata.org. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Developer at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Kian: The first neural network to serve Wikidata
Hey, Yay, neural nets are definitely fun! Am I right in understanding that this is a piece of software you created for the specific purpose of doing tasks in Wikidata? Congratulations on this bold step towards the Singularity :-) Don't worry, it'll be some time before AI can actually ingest Wikidata, see https://dl.dropboxusercontent.com/u/7313450/entropy/aitraining.png Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Wikidata read API etiquette
Hey, And to answer your second question: Maximum number of values is 50 (500 for bots) (from https://www.wikidata.org/w/api.php?action=help&modules=wbgetentities) That seems a bit much to me, considering an entity can easily be over 1MB in size. Won't something die by the time you get to half a GB? Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
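The 50-entity cap (500 for bots) means bulk fetches have to be batched. Here is a minimal sketch of how a client might chunk IDs and build wbgetentities request URLs; the endpoint and API parameters are the real ones, but the helper function itself is illustrative:

```python
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def wbgetentities_urls(entity_ids, batch_size=50):
    """Split a list of entity IDs into batches respecting the
    50-ids-per-request limit, and build one request URL per batch."""
    urls = []
    for i in range(0, len(entity_ids), batch_size):
        batch = entity_ids[i:i + batch_size]
        params = {
            "action": "wbgetentities",
            "ids": "|".join(batch),  # ids are pipe-separated
            "format": "json",
        }
        urls.append(API + "?" + urlencode(params))
    return urls

urls = wbgetentities_urls([f"Q{n}" for n in range(1, 121)])
# 120 IDs -> 3 requests (50 + 50 + 20)
```

Fetching batches sequentially (rather than hammering the API in parallel) is also the polite thing to do, given how large individual entities can be.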
Re: [Wikidata-l] Empty references
Hey, I think the serializer should just drop empty references This seems like the wrong place to do that. I suggest doing this in the model itself, very similar to what we are doing with empty alias groups in lists of alias groups: https://github.com/wmde/WikibaseDataModel/blob/5c78e35ee1a382d3b132ddfd760772b94d600daa/src/Term/AliasGroupList.php#L114-L127 Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
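The AliasGroupList approach linked above can be sketched abstractly: the collection itself refuses to store empty entries, so no serializer or view ever sees them. A toy Python illustration of the idea, not the actual Wikibase DataModel API:

```python
class ReferenceList:
    """Toy sketch of a reference list that silently skips empty
    references on addition, analogous to how AliasGroupList skips
    empty alias groups in the Wikibase DataModel."""

    def __init__(self):
        self._references = []

    def add(self, reference):
        # An "empty" reference (one with no snaks) is simply not
        # stored, so it can never be serialized or rendered.
        if not reference:
            return
        self._references.append(reference)

    def __len__(self):
        return len(self._references)

refs = ReferenceList()
refs.add([])                   # empty reference: dropped
refs.add([("P143", "Q328")])   # non-empty reference: kept
```

Enforcing the invariant in the model means every consumer (serializers, the UI, dumps) benefits, instead of patching each output path separately.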
Re: [Wikidata-l] Platypus, a speaking interface for Wikidata
Hey, Very cool! Had a quick look at the code, and am surprised by how clean it is. Cleaner than most MediaWiki / Wikibase code it seems. Well done. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate ~=[,,_,,]:3
Re: [Wikidata-l] Call for development openness
Hey, As Lydia mentioned, we obviously do not actively discourage outside contributions, and will gladly listen to suggestions on how we can do better. In fact, we are actively taking steps to make it easier for developers not already part of the community to start contributing. For instance, we created a website about our software itself [0], which lists the MediaWiki extensions and the different libraries [1] we created. For most of our libraries, you can just clone the code and run composer install. And then you're all set. You can make changes, run the tests and submit them back. That is perhaps a different workflow than what you as a MediaWiki developer are used to, though quite a bit simpler. Furthermore, we've been quite progressive in adopting practices and tools from the wider PHP community. I definitely do not disagree with you that some things could, and should, be improved. Like you, I'd like to see the Wikibase git repository and the naming of the extensions aligned more, since it indeed is confusing. Increased API stability, especially of the JavaScript one, is something else on my wish-list, amongst a lot of other things. There are always reasons why things are the way they are now and why they have not improved yet. So I suggest looking at specific pain points and seeing how things can be improved there. This will get us much further than looking at the general state, concluding people do not want third party contributions, and then protesting against that. [0] http://wikiba.se/ [1] http://wikiba.se/components/ Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Various questions
Hey, I was looking through the configuration trying to debug my issues from my last email and noticed the list of blacklisted IDs. They appear to be numbers with special meaning. I was curious about two things: why are they blacklisted, and what is the meaning of the remaining number? * 1: I imagine that this just refers to #1 * 23: Probably refers to the 23 enigma * 42: Life, the universe and everything * 1337: leet * 9001: ISO 9001, which deals with quality assurance * 31337: Elite I guess we probably ought to delete those default values. They were added as something easter-egg like in the Wikidata project, and might well get in the way for third party users. This is also not the list of actual IDs that got blacklisted on Wikidata.org, which was a bit more extensive, and for instance had Q2013, the year in which Wikidata launched. I submitted a removal of these blacklisted IDs from the default config in https://gerrit.wikimedia.org/r/#/c/172504/ The only number that left me lost was 720101010. I couldn't figure this one out. 720101010 is 1337 for trolololo :) Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
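For illustration, enforcing such a blacklist when allocating new entity IDs could look like the following sketch. The function name and mechanism are hypothetical; the actual Wikibase setting and ID-allocation code may work differently:

```python
# The default blacklisted numeric IDs discussed above (easter eggs).
BLACKLISTED_IDS = {1, 23, 42, 1337, 9001, 31337, 720101010}

def next_available_id(candidate, blacklist=BLACKLISTED_IDS):
    """Hypothetical allocator step: skip forward past any
    blacklisted numeric IDs until a free one is found."""
    while candidate in blacklist:
        candidate += 1
    return candidate

next_available_id(42)   # -> 43 (42 is reserved as an easter egg)
next_available_id(100)  # -> 100 (not blacklisted, used as-is)
```

This also shows why shipping the list as a default is awkward for third parties: their item numbering silently skips IDs for reasons that only make sense on Wikidata.org.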
Re: [Wikidata-l] Querying Arbitrary Wikibase Installations
Hey Derric, There is no stable release of the Wikibase Query functionality yet, and even in the development branch, the feature set falls far short of what I guess you want to use. One thing that is not entirely clear to me is whether you want to query data from Wikidata.org, or from your own Wikibase Repository instance. In any case, have you considered Semantic MediaWiki for your use case? http://semantic-mediawiki.org/ Any help would be appreciated, it looks like Wikibase documentation for reusers is not terribly great yet. Indeed. We have essentially no user documentation on the features of the software that are not specific to Wikidata.org. Then again, we also do not have stable releases or usable release notes for our main applications (Wikibase Repository and Wikibase Client) suitable for third party users. I hope we can improve these things in the near future, though one has to take into account that third party users are not amongst the first priorities of the main developer of the Wikibase software, the Wikidata team. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Querying Arbitrary Wikibase Installations
Hey, I just need to be able to do something like: SELECT `qid` FROM Wikibase WHERE `Instance Of` = 'Elephant'; Using that list and the Wikibase API, I can do processing on my own of the returned data and take it from there. That is not possible until simple query functionality is finished. I knew complex queries were not done yet, but I thought simple ones might be. It's of no worry though. I can run a nightly job to iterate through all of the items to build an index of them that is good enough for what I need. While the MediaWiki extension that will expose this, Wikibase Query, has no releases yet, the component it is based on, Wikibase QueryEngine, does have a release already. And that release allows you to do the very basic query in your example. So if you are fine with specifying the query and getting the results in PHP land, you can already go ahead and use QueryEngine. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
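Until the query functionality lands, the "nightly index" approach amounts to filtering fetched item JSON client-side. A sketch, assuming the standard Wikibase claim JSON layout; P31 is the real "instance of" property, while the elephant item ID used below is illustrative:

```python
def items_with_instance_of(items, target_qid):
    """Return the IDs of items having a P31 (instance of) claim
    whose value is target_qid. Assumes well-formed value snaks."""
    matches = []
    for qid, data in items.items():
        for claim in data.get("claims", {}).get("P31", []):
            value = (claim.get("mainsnak", {})
                          .get("datavalue", {})
                          .get("value", {}))
            if value.get("id") == target_qid:
                matches.append(qid)
    return matches

# Two toy items: one an "instance of" the target, one without claims.
items = {
    "Q100": {"claims": {"P31": [
        {"mainsnak": {"datavalue": {"value": {"id": "Q7378"}}}}]}},
    "Q200": {"claims": {}},
}
items_with_instance_of(items, "Q7378")  # -> ["Q100"]
```

This is essentially the SELECT-WHERE from the quoted example, just executed in the bot instead of on the server.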
Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)
Hey, Does this mean we can also shoot a TODO list in the direction of Google? :) Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] How can I increase the throughput of ProteinBoxBot?
Hey, Currently each entity creation and subsequent claims are unique API calls. So using wbeditentity will probably result in an improvement. Thanks for the suggestion. I second that suggestion. It should definitely not take 2 weeks or more to add a mere 50k items. In case your bot is PHP, or could easily do some PHP, then [0] and [1] are probably of use to you. [0] https://github.com/wmde/WikibaseDataModel [1] https://github.com/wmde/WikibaseDataModelSerialization Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
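To make the batching suggestion concrete: a single wbeditentity call can create an item together with its claims in one request, instead of one createclaim call per statement. A sketch of such a payload; the label and property values here are illustrative, and the exact JSON shape should be checked against the wbeditentity documentation:

```python
import json

# Illustrative item data: one English label plus one string-valued
# claim. P351 and the value "5649" are example placeholders.
data = {
    "labels": {"en": {"language": "en", "value": "Reelin"}},
    "claims": [{
        "mainsnak": {
            "snaktype": "value",
            "property": "P351",
            "datavalue": {"value": "5649", "type": "string"},
        },
        "type": "statement",
        "rank": "normal",
    }],
}

# The whole item, claims included, goes out as one API request.
payload = {
    "action": "wbeditentity",
    "new": "item",
    "data": json.dumps(data),
    "format": "json",
}
```

One round trip per item instead of one per claim is where the throughput win comes from.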
Re: [Wikidata-l] Classes and Properties browser update
Hey, \o/ Where are the source code and issue tracker for this? Probably good if those were linked from the tool. If you load this in Firefox, it spends several seconds loading, after which one gets the 'use another browser' error. It would be nice if this was shown before the rest was loaded. Of course it'd be much nicer if the biggest free browser could also be supported. http://tools.wmflabs.org/wikidata-exports/miga/#_item=1204 That first shows population. When then clicking on the link, you see the data type is quantity, not string. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Wikibase now has a home
Hey, Just wondering - why isn't this part of mediawiki.org? Most of the Wikibase components have nothing to do with MediaWiki. Having it on MediaWiki.org would make the whole software appear to be MediaWiki specific, and thus scare non-MediaWiki users away. Furthermore, it's suboptimal to have a collection of pages about a single topic on a wiki about something else, since the main navigation menus point to unrelated content. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Wikibase now has a home
Hey, Good to know, up till now I thought wikibase was still a mediawiki extension :s It is. But it consists of several components by now, some of which are independent of MediaWiki. I don't agree with that use of the terminology. The website defines it as follows: Wikibase is a collection of applications and libraries for creating, managing and sharing structured data. This includes Wikibase Repository and Wikibase Client, both of which are MediaWiki extensions. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
[Wikidata-l] Wikibase now has a home
Hey, Wikibase, the software created for the Wikidata project, now has its own website: http://wikiba.se/ Among the goals of the website are promoting third party usage and outlining the various components and applications that make up the Wikibase software. It is not meant to hold the per-component documentation, such as installation instructions for the Wikibase Repository extension; those remain where they are, and are simply referenced where applicable. It's still a work in progress, and you are invited to edit the site by sending a pull request. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] [Wikidata] weekly summary
Hey, But you should check out Q908238 anyway ;-) Apparently you can find it via Google Images like this: https://www.google.com/search?q=Q908238&tbm=isch Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Installing Wikibase extension
Hey, I don't know about this particular error, though I can say Wikibase is no longer compatible with MediaWiki 1.23. Unfortunately, retaining compatibility with a stable MediaWiki version, and third party support in general, does not appear to be high on the priority list of the Wikidata team. So I can only recommend against third parties using the software in serious contexts for now. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] Wikidata query feature: status and plans
Hey Yury, We are indeed planning to use the Ask query language for Wikidata. People will be able to define queries on dedicated query pages that contain a query entity. These query entities will represent things such as "the cities with the highest population in Europe". People will then be able to access the results for those queries via the web API, and be able to embed different views on them into wiki pages. These views will be much like SMW result formats, and we might indeed be able to share code between the two projects for that. This functionality is still some way off though. We still need to do a lot of work, such as creating a nice visual query builder. To already get something out to the users, we plan to enable simpler queries via the web API in the near future. Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] weekly summary #108
Hey, Anything to add? Please share! :) You forgot the part where we made big improvements to the DataModel component :) I wrote a blog post about some of that: http://www.bn2vs.com/blog/2014/04/30/wikibase-datamodel-entity-v2/ Cheers -- Jeroen De Dauw - http://www.bn2vs.com Software craftsmanship advocate Evil software architect at Wikimedia Germany ~=[,,_,,]:3
Re: [Wikidata-l] claims Datatypes inconsistency suspicion
Hey, The situation with commonsMedia is a bit bad because it should be a URL rather than a string. What I do in wda is effectively a type conversion from string to URI in this particular case. Maybe we can fix this somehow in the future when URIs are supported as a value datatype. Ok, this makes me somewhat concerned. We do have an IriValue DV [0], which we've had for nearly a year. It is indeed not used for commonsMedia, not sure why. What concerns me is that we are now introducing a url data type, which will also just use the string DV, rather than the IRI DV. I'm not very happy with this, though it is what most of the team wants. If there is a problem with this approach, it should be outlined _soon_, since this is something not far from deployment if I understand it correctly. [0] https://github.com/wikimedia/mediawiki-extensions-DataValues/blob/master/DataValues/src/IriValue.php Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3
Re: [Wikidata-l] Browser search plugins for Wikidata
Hey, Hmm, I tried installing this plugin for Chrome, but I just get the message This web page has not been found Good catch. Searching for the entered term was indeed broken; I did not realize this needed different handling than the selection of one of the suggestions. An update with a fix for this and some other goodies [0] is now available in the Chrome web store. [0] https://github.com/JeroenDeDauw/WikidataChromeSearch#release-notes Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3
[Wikidata-l] Browser search plugins for Wikidata
(Apologies if you are getting this twice - I accidentally sent this to wikidata-tech earlier on.) Hey all, Today I had a day off and decided to play around with Firefox plugins and Chrome extensions. The result is one of each being created, both for searching against Wikidata. They are both very simple, though hopefully helpful to you. Firefox plugin: https://addons.mozilla.org/en-US/firefox/addon/wikidata-search/ Chrome extension: https://chrome.google.com/webstore/detail/wikidata-search/ingjkjibhnkhomomlmlabndfmiaejkpn A more verbose version of this email: http://www.bn2vs.com/blog/2013/07/12/wikidata-search-plugins/ Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. ~=[,,_,,]:3
Re: [Wikidata-l] data inclusion by lables is now available on Wikipedias and time datatype is getting closer
Property labels are unique, so we do not have to deal with disambiguation. Sent from my HTC One X. On 23 May, 2013 8:40 AM, Gerard Meijssen gerard.meijs...@gmail.com wrote: Hoi, How will it deal with disambiguation? There are plenty of reasonable properties that can have different and multiple meanings. Also, using properties means that an infobox or whatever cannot be reused on a different wiki. Thanks, Gerard On 22 May 2013 23:58, Lydia Pintscher lydia.pintsc...@wikimedia.de wrote: Heya folks :) I just wanted to let you know that it is now possible to include data from Wikidata using the property's label (not just the ID). So in essence this means that you can for example use {{#property:continent}} instead of {{#property:P30}} if you don't want to remember the ID there. Both of these would return Europe if used in the article about Spain, for example. In related good news: The time datatype has progressed very well and we expect to be able to deploy it next week. However, it'd be awesome if you could help test it on the demo system (http://wikidata-test-repo.wikimedia.de/wiki) to make sure we have not missed any major issues. Help with translations on translatewiki.net would also be most excellent. Cheers Lydia -- Lydia Pintscher - http://about.me/lydia.pintscher Community Communications for Technical Projects Wikimedia Deutschland e.V. Obentrautstr. 72 10963 Berlin www.wikimedia.de Wikimedia Deutschland - Society for the Promotion of Free Knowledge. Registered in the register of associations at the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by the tax office for corporations I Berlin, tax number 27/681/51985.
Re: [Wikidata-l] Wikidata queries
Hey, Comments on our errors and requests for clarifications are more than welcome. What is the difference between the 'sort' and 'order' QueryOptions? 'sort' defines how to bring the result items into a linear order, and 'order' whether to start at the front or the end of that order. Sorry for causing some confusion here - this is my fault, as I incorrectly added the order option to the document. I just reverted this. There will just be a sort field in the query options, though this field is itself complex (an array of sort expressions, where each expression specifies something to sort by and a sort order). For those who are interested, we already have a preliminary implementation of the query language: https://github.com/wikimedia/mediawiki-extensions-Ask/tree/master/includes/Ask/Language Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
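The "array of sort expressions" idea can be illustrated with a stable multi-key sort: later expressions act as tie-breakers for earlier ones. The field names ("by", "order") are illustrative, not the exact Ask serialization:

```python
def apply_sort(rows, sort_expressions):
    """Apply a list of sort expressions as one stable multi-key
    sort. Sorting by the last expression first, then re-sorting by
    earlier ones, exploits sort stability to preserve tie-breaks."""
    for expr in reversed(sort_expressions):
        rows = sorted(rows, key=lambda r: r[expr["by"]],
                      reverse=(expr["order"] == "descending"))
    return rows

rows = [{"pop": 5, "name": "b"}, {"pop": 5, "name": "a"},
        {"pop": 9, "name": "c"}]
sorted_rows = apply_sort(rows, [{"by": "pop", "order": "descending"},
                                {"by": "name", "order": "ascending"}])
# highest population first; population ties broken by name
```

This is why a single flat 'order' option was redundant: each sort expression carries its own direction.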
Re: [Wikidata-l] Which API is to get wikidata.org content in real-time
Hey, There are some ids like dewiki, enwiki etc, which I guess can be interpreted as corresponding to languages de, en respectively. But is there a reliable map from these *wiki ids to the language code? And some are even using a 3-letter prefix, e.g. gotwiki, xmfwiki. You cannot infer the language from the site identifier. enwiki is a site identifier. The software allows having multiple sites for the same language. For instance, you could have an entity that is also described on the English Wiktionary. Or an entity described on a third party website as well, such as a movie on IMDb. Unfortunately it looks like we are not yet providing an actual API for accessing this information. Are the APIs above (action=query&prop=revisions and action=query&list=recentchanges) the supported way to retrieve wikidata.org in realtime? I suspect this is your best bet for now. We have a mechanism for change propagation to mirrors, though right now the only implementation on top of this that we have is WMF specific. Volunteers and third parties can however create their own implementation suitable for non-WMF use. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
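A sketch of how a mirror might build its recentchanges poll request, resuming from a stored timestamp so no changes are missed between polls. The parameter names are the standard MediaWiki API ones; the helper itself is illustrative:

```python
from urllib.parse import urlencode

def recentchanges_url(since_timestamp):
    """Build a recentchanges query URL that returns changes from
    oldest to newest starting at since_timestamp, so a mirror can
    store the last seen timestamp and resume from it next poll."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcstart": since_timestamp,
        "rcdir": "newer",   # oldest first, so resuming is safe
        "rclimit": "500",
        "format": "json",
    }
    return "https://www.wikidata.org/w/api.php?" + urlencode(params)

url = recentchanges_url("2013-06-01T00:00:00Z")
```

Each changed page title can then be fed to prop=revisions to fetch the new entity content.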
Re: [Wikidata-l] script to acquire the Wikidata ID
Hey, $store = Wikibase\StoreFactory::getStore(); $numericId = $store->newSiteLinkCache()->getItemIdForLink( $globalSiteId, $pageTitle ); Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
[Wikidata-l] Call for participation - SMWCon Spring 2013
* We apologize if you received multiple copies of this Call for Papers. Please feel free to distribute it to those who might be interested. * This is an announcement for the Semantic MediaWiki Conference 'SMWCon Spring 2013'. WHERE The Spring 2013 SMWCon will be held at the Interactive Telecommunications Program (ITP), a department of New York University, in New York City, in the United States. ITP is an incredibly creative place, located at 721 Broadway, between Washington Pl. and Waverly Pl., in New York's Greenwich Village neighborhood. WHEN March 20-22, 2013. CONFERENCE WEBSITE: http://semantic-mediawiki.org/wiki/SMWCon_Spring_2013 This will be the 10th edition of the Semantic MediaWiki Conference! SMWCon is open to everyone interested in collaborative knowledge creation using semantic wikis. The event brings together developers, users, and organizations from the Semantic MediaWiki community around the world. In what is becoming a tradition, we will spend the first day with tutorials and a workshop for those who want to learn more about Semantic MediaWiki and build on it. The second and third days of the conference will include talks from the developers and users of SMW. HOW TO PARTICIPATE We have started to form the program of the conference. Since this is a community driven event, please visit the following wiki page to get news about the program, add your name to the attendees list and register your talk: http://semantic-mediawiki.org/wiki/SMWCon_Spring_2013 It does not have to be formal and/or complete; however, you're welcome to share more information about your talk, like a short abstract, and even a desired length. We are looking for use cases, updates on existing projects, lightning talks, or even demos. IMPORTANT DATES February 20 - deadline to submit talk proposals (the earlier the better) March 20 - the conference itself See you in New York!
Regards, Laurent Alquier, Program Chair ___ Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Re: [Wikidata-l] Social service directory project
Hey, Depending on the specifics of your project maybe Semantic MediaWiki is a better fit though. I recommend you have a look at it as well. Agree. Definitely sounds like SMW (https://semantic-mediawiki.org) is what you are looking for. Later it might or might not make sense for you to switch to using Wikibase, the Wikidata software, or even have your data in Wikidata itself. Having it in a wiki already in a structured format is a good start either way. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
Re: [Wikidata-l] Update to time and space model
Hey, Why use Q2 (earth) as the globe, and not Q215848 (WGS84)? That would be a lot clearer, I think. Since WGS84 implies Earth, this works for Earth. Is such an implication always present though? What if I want to describe a location on some random planet - I suspect you'd have to specify some system and the globe. If this is the case, we should not omit the globe. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
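The "do not omit the globe" point can be illustrated with a value object where the globe is always present, merely defaulting to Earth (Q2). The class and field names are illustrative, not the actual DataValues implementation:

```python
from dataclasses import dataclass

@dataclass
class GlobeCoordinate:
    """Sketch of a coordinate value that always carries its globe,
    so coordinates on Earth and on other bodies stay distinguishable
    even when the reference system alone would be ambiguous."""
    latitude: float
    longitude: float
    globe: str = "Q2"  # Earth by default, but never absent

berlin = GlobeCoordinate(52.52, 13.405)
# Olympus Mons: same numeric ranges, entirely different globe (Mars).
olympus_mons = GlobeCoordinate(18.65, -133.8, globe="Q111")
```

With the globe explicit, two values with identical latitude/longitude but different globes compare as different, which is exactly the safety the email argues for.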
Re: [Wikidata-l] Update to time and space model
Hey, For every globe we would always need a geodesic system. My concern is not with having the geodesic system field; that is fine. My concern is with not having a globe field. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
Re: [Wikidata-l] Suggestions
Hey, It would be great to have keys into other databases from Wikidata. I'd be happy to contribute Freebase IDs to matching Wikidata concepts. However, I'm wondering if it really makes sense to have a separate property for every type of ID. Shouldn't it be modeled more like interwiki links, so that each concept has many foreign keys, each with an associated data source? That's how we've modeled it in Freebase and it scales quite well. If I'm not mistaken this is what we plan to do. The current system we're using for the language links can certainly handle it, as it's not WP specific. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
[Wikidata-l] Storage of claims and statements
Hey, We have had a createclaim API module for a while now, and although we have not deployed it on wikidata.org, some of you might have played around with it on local installs. Yesterday, changes were merged that vastly improve how claims are stored. The current code no longer supports the old format, and you might get errors when trying to load entities that have claims in the old format attached. If this causes any problems, you can delete these entities. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
Re: [Wikidata-l] wikidata.org is live (with some caveats)
Hey, why does wbgetentity not exist in wikidata.org's API? In the version of the code currently deployed, the module is still named wbgetitems. This will change to wbgetentities on the next update of our deployed code though. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
Re: [Wikidata-l] wikidata.org is live (with some caveats)
Hey, Phase 2 isn't started yet. Work on phase 2 has definitely started already, it just is not deployed on wikidata.org yet. Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
[Wikidata-l] Possible config breaking change
Hey, In this commit [0] I moved the content model constants back to the repo after they somehow found their way into lib. At the point the constants got moved to lib, the requirements for the settings loosened slightly, so they become stricter again now, which can potentially cause breakage. The change in requirements is that the repo settings (especially those using the content model constants) must be placed after repo instead of after lib in LocalSettings. [0] https://gerrit.wikimedia.org/r/#/c/26651/ Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil.
Re: [Wikidata-l] Complete Datamodel in WON
Hey, As explained in the text, the aliases are not distinguished from other property values in the data model right now. This was the status of the discussion when we last talked about this, but we can also re-introduce aliases as a special field (I see why this would be useful). Daniel had an argument against this, saying that many other property values could also work as aliases in certain domains (e.g. binomial names of biological species). So the special status of the alias in the data model was questioned. Right, that makes sense to implement at some point if there really is demand for this. This is rather harder to implement than what we're currently doing, and is blocked by phase 2 stuff and probably phase 3 stuff, while we want to have it in phase 1 already. A while back we also had a related discussion where Daniel took the position that we should also not have special labels and descriptions. The conclusion of that was that we will have them, but that we will make them accessible via the same interface as regular properties (at least for read ops). if two items have the same description, can one of them use an alias that is the title of the other? Good question. Right now this is not enforced. Then again, right now aliases are not used anywhere for lookups except in the fulltext search thing, where this restriction is not really relevant. Denny, Daniel, any thoughts on this? This is also based on a preliminary decision made a while back: the idea was that properties, while not having Wikipedia articles, will still need unique string identifiers that can be used in wikitext (e.g. queries) where one does not want to address properties by ID or by label+description pairs. This seems odd to me - are you sure the term TitleRecord is being used consistently throughout the data model and this thread? I'm using it as GlobalSiteId + PageName.
I do agree you would probably not want to put label and description in wikitext, and that just the label might or might not be sufficient, even if they are unique per language. If you need an id that really is always unique you can just use the p12345 thing. Since most of the editing of these will happen via GUIs (right?) this seems to be quite acceptable. Or does anybody see a better approach? In any case, why would you resort to GlobalSiteId PageName rather then label description? What makes it so odd is that the GlobalSiteId PageName is meant to indicate equivalence of items across sites, which is rather different then using it to identify properties in wikitext. It seems that a property could at best have a list of PropertyValueSnaks (no auxiliary Snaks, no references, no statement rank). Why not have a list of claims? Cheers -- Jeroen De Dauw http://www.bn2vs.com Don't panic. Don't be evil. -- ___ Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l
[Wikidata-l] Complete Datamodel in WON
Hey,

I have some questions/remarks on the Complete Datamodel in WON section Markus wrote up yesterday: https://meta.wikimedia.org/wiki/Wikidata/Data_model#Complete_Datamodel_in_WON

There are several things we have not modelled yet, which I'm currently not going to comment on. For those we did already implement or thought about implementing, there are a few things that do not match what's written in this section.

> SiteLanguageCode

All occurrences of this should be replaced by GlobalSiteIdentifier, as it's NOT a language code.

> ItemDescription := 'ItemDescription(' Item {TitleRecord} [MultilingualTextValue] [MultilingualTextValue] {Statement} ')'

This is missing the aliases stuff, which would be { UserLanguageCode String }.

> GeoCoordinatesValue := 'GeoCoordinatesValue(' decimal decimal decimal ')'

Altitude is probably something we will not have in many cases, so I think it ought to be optional. Another optional argument would be the globe to which the coordinates belong. Different globes have different ways of measuring coordinates, so a specific set of coordinates that is valid on one globe might mean something else on another, and simply be invalid on a third.

> PropertyDescription := 'PropertyDescription(' Property {TitleRecord} [MultilingualTextValue] [MultilingualTextValue] ')'

Right now the Property interface is very similar to the Item one, except that Item has what we're calling sitelinks (in the WON discussed here they are called TitleRecords) and Property does not. That's the first difference. Copy-paste error? Or am I misinterpreting the notation? As with Item, it's missing the aliases; that's the second difference. Although we have not implemented this yet, the Entity interface implies that it contains a list of statements. I added this after some discussion with Denny. As a result, both Items and Properties have a list of statements. That's the third difference with the WON stuff. Here the question is whether non-Item Entities should have statements or not. There is no consensus on this within the team yet, so I will start a new thread about it so we can discuss it in more detail.

Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
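For what it's worth, the optional altitude and globe suggested above could look roughly like this. This is a minimal Python sketch; the class and field names are illustrative, not the actual Wikibase implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class GeoCoordinatesValue:
    """Hypothetical sketch: latitude/longitude required, altitude and globe optional."""
    latitude: float
    longitude: float
    altitude: Optional[float] = None  # often unknown, so optional
    globe: Optional[str] = None       # coordinates only mean something relative to a globe

# Altitude known, globe left unspecified:
everest = GeoCoordinatesValue(27.9881, 86.9250, altitude=8848.0)

# Neither altitude nor globe given; both default to None:
london = GeoCoordinatesValue(51.5074, -0.1278)
```

Making both fields default to None keeps the common case (plain latitude/longitude) cheap while still letting a consumer reject coordinates whose globe it does not understand.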
[Wikidata-l] Entities, statements and claims
Hey,

There is some disagreement regarding the interface to access statements in various entities.

== Current implementation ==

This is not fully implemented yet, but it's what the interface implies will be done. Entities contain a list of statements. In other words, all items, properties and queries can have statements.

Obvious objection: statements for properties and queries should not have associated rank and references. They could just have a list of claims.

Original reason to go with this approach anyway: the alternative is to put statement handling in Item and add claim handling in either both Property and Query or in a common base. This would result in duplication and loss of a common interface for the Entities.

== New proposal ==

I think we can accommodate all the concerns listed above as follows:

* All Entities provide access to a list of claims.
* Properties and Queries contain a list of claims.
* Items contain a list of statements, to which they provide access both as a list of statements and as a list of claims.

The apparent list of claims would correspond to a filter on rank=primary followed by a map from each statement to its claim on the list of statements.

Any objections to implementing it like that?

Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
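To make the filter-and-map in the proposal concrete, here is a minimal Python sketch. All names (Claim, Statement, the rank strings) are assumptions for illustration, not the actual Wikibase code.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Claim:
    """A property/value pair, without rank or references."""
    property_id: str
    value: object

@dataclass
class Statement:
    """A claim plus the extras only Items need: rank and references."""
    claim: Claim
    rank: str = "normal"
    references: List[object] = field(default_factory=list)

class Property:
    """Non-Item entity: stores claims directly."""
    def __init__(self) -> None:
        self.claims: List[Claim] = []

    def get_claims(self) -> List[Claim]:
        return self.claims

class Item:
    """Stores statements, but exposes them as claims via the common interface."""
    def __init__(self) -> None:
        self.statements: List[Statement] = []

    def get_claims(self) -> List[Claim]:
        # Filter on rank=primary, then map each statement to its claim.
        return [s.claim for s in self.statements if s.rank == "primary"]

item = Item()
item.statements.append(Statement(Claim("p17", "some value"), rank="primary"))
item.statements.append(Statement(Claim("p18", "other value"), rank="normal"))

prop = Property()
prop.claims.append(Claim("p31", "x"))
```

With this shape, code that only cares about claims can call get_claims() on any entity, while statement-level details (rank, references) stay Item-only.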
Re: [Wikidata-l] SMW and Wikidata
Hey,

As an SMW developer I am hoping that Wikidata will make people more aware of the advantages of structuring data in their wikis, and thus end up with more users of SMW. Similarly I'm hoping that more developers will get interested, and that we'll have people working on both projects, exchanging ideas and code.

> SMW subjectively seems to be encountering quality control issues lately

The latest release ought to be quite stable, and definitely not less stable than earlier releases. The quality control could be a lot better though, but as it is, we simply do not have the resources for this. Most SMW development is done either by people creating additional features for their own or their clients' use cases, or by volunteers. If you want to help increase the quality and can throw developer time or other resources at it, then I'm more than willing to point out the places where we could use help.

Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Re: [Wikidata-l] list for bug emails
Hey,

> Wikidata
> WikidataClient
> WikidataRepo

Although the project is called Wikidata, the software is called Wikibase. So we should have Wikibase and Wikibase client for the extensions, and Wikidata for the project, although I'm not sure we really need the latter.

Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Re: [Wikidata-l] Fwd: [Wiki-research-l] Wikidata opinion piece in The Atlantic
Hey,

I've been following the usage of Wikidata on Twitter, and for the last week or so, more than half the tweets have been pointing to this article. Apparently people like to criticize :)

Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--