[Wikidata] Action plan for improving the data import process

2017-12-13 Thread Navino Evans
Hi all,

Following on from a session I co-hosted at WikidataCon, we've put together
a project page aimed at creating an action plan for improving the data
import process:

Data import processes map (outlines what we have already, and what is still needed)
Discussion page (discussion points and suggestions)

The main objective is to use this area to build a picture of what already
exists and what's missing in the data import process. This can then be used
to create actionable tasks (which we're proposing be managed on
Phabricator).

If you have anything to add please go ahead and edit the project page
and/or join the discussion.

We want to get as many community members as possible involved in these
early stages of planning, so please spread the word to anyone you think
will be interested in the data import process! :)


Many thanks,

Nav



-- 

*nav...@histropedia.com *

@NavinoEvans 

-

   www.histropedia.com

Twitter | Facebook | Google+

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Grant application for extending work with UNESCO data

2017-04-05 Thread Navino Evans
Hi all,


If you have a moment spare, please take a look at this WMF grant
application. Feedback and endorsements will be greatly appreciated!

The application is to extend John Cummings' brilliant work making UNESCO
content available on Wikimedia projects, and includes a portion allocated
to me to continue the work on importing data from UNESCO and partner
agencies into Wikidata. It also covers improvements to the workflow and
documentation for the data import process, building on our previous work
getting the following pages together:

Wikidata:Data Import Hub
Wikidata:Data donation
Wikidata:Data Import Guide
Wikidata:Partnerships and data imports



As well as the data import work described above, the main goals are:

   1. UNESCO’s publication workflows incorporate sharing open license
   content on Wikimedia projects.
   2. Support other Intergovernmental Organisations and the wider public to
   share content on Wikimedia projects.
   3. Support Wikimedia contributors to easily discover and use UNESCO
   content and the documentation produced.



Many thanks!


Nav
-- 

*nav...@histropedia.com *

@NavinoEvans 
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Label gaps on Wikidata - (SPARQL help needed. SERVICE wikibase:label)

2017-02-25 Thread Navino Evans
On 24 February 2017 at 22:00, Rick Labs  wrote:

> Nav,
>
> YES!!! that's it! Your SPARQL works perfectly, exactly what I wanted.
>
> Thanks very much. Just had to learn how to get the CSV into Excel as
> UTF-8, not hard. Can finally see what objects people want immediately below
> "Organizations", worldwide. (yes, what's evolved is pretty darn "chaotic")
> Very much appreciated.
>
> Rick


Excellent!! Very happy to help. Best of luck cleaning up the chaos :)


-- 

*nav...@histropedia.com *

@NavinoEvans 

-

   www.histropedia.com

Twitter | Facebook | Google+

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Label gaps on Wikidata - (SPARQL help needed. SERVICE wikibase:label)

2017-02-24 Thread Navino Evans
Hi Rick,

Is this what you're after? http://tinyurl.com/z7ru9yr

Once you run the query there is a download drop-down menu, just above the
query results on the right hand side of the screen - it has a range of
options including CSV.
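Since the tinyurl link above doesn't show the query inline, here is a sketch of the kind of query described in this thread — all subclasses of organization (Q43229), kept even when they have no label, with each label's language code and an English description where one exists. This is a reconstruction based on the surrounding discussion, not necessarily the exact query behind the link:

```sparql
# All subclasses of "organization" (Q43229); items with no label at all
# still appear, and ?lang carries the language code of each label.
SELECT ?item ?label (LANG(?label) AS ?lang) ?itemDescription
WHERE {
  ?item wdt:P279 wd:Q43229 .
  OPTIONAL { ?item rdfs:label ?label . }
  OPTIONAL { ?item schema:description ?itemDescription .
             FILTER(LANG(?itemDescription) = "en") }
}
ORDER BY ASC(LCASE(?label))
```

Running it at query.wikidata.org and using the download menu gives the CSV described above.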

Hope that helps!

Nav




On 24 February 2017 at 02:25, Rick Labs  wrote:

> Thanks Stas & especially Kingsley for the example:
>
> # All subclasses of a class example
> # here all subclasses of P279 Organization (Q43229)
> SELECT ?item ?label ?itemDescription ?itemAltLabel
> WHERE
> {
>  ?item wdt:P279 wd:Q43229;
>rdfs:label ?label .
>  # SERVICE wikibase:label { bd:serviceParam wikibase:language
> "en,de,fr,ja,cn,ru,es,sv,pl,nl,sl,ca,it" }
>  FILTER (LANG(?label) = "en")
> }
> ORDER BY ASC(LCASE(?label))
>
> When I pull the FILTER line out of above I have almost what I need - "the
> universe" of all sub classes of organization (regardless of language).  I
> want all subclasses in the output, not just those available currently with
> an English label.
>
> In the table output, is it possible to get: a column for language code,
> and get the description to show up  (if available for that row)? That would
> be very helpful prior to my manual operations.
>
> Can I easily export the results table to CSV or Excel?  I can filter and
> sort easily from there provided I have the hooks.
>
> Thanks very much!
>
> Rick
>
> .
>
>
>
>
>
> On 2/23/2017 1:22 PM, Kingsley Idehen wrote:
>
> On 2/23/17 12:59 PM, Stas Malyshev wrote:
>
> Hi!
>
> On 2/23/17 7:20 AM, Thad Guidry wrote:
>
> In Freebase we had a parameter %lang=all
>
> Does the SPARQL label service have something similar ?
>
> Not as such, but you don't need it if you want all the labels, just do:
>
> ?item rdfs:label ?label
>
> and you'd get all labels. No need to invoke service for that, the
> service is for when you have specific set of languages you're interested
> in.
>
>
> Yep.
>
> Example at: http://tinyurl.com/h2sbvhd
>
> --
> Regards,
>
> Kingsley Idehen   
> Founder & CEO
> OpenLink Software   (Home Page: http://www.openlinksw.com)
>
> Weblogs (Blogs):
> Legacy Blog: http://www.openlinksw.com/blog/~kidehen/
> Blogspot Blog: http://kidehen.blogspot.com
> Medium Blog: https://medium.com/@kidehen
>
> Profile Pages:
> Pinterest: https://www.pinterest.com/kidehen/
> Quora: https://www.quora.com/profile/Kingsley-Uyi-Idehen
> Twitter: https://twitter.com/kidehen
> Google+: https://plus.google.com/+KingsleyIdehen/about
> LinkedIn: http://www.linkedin.com/in/kidehen
>
> Web Identities (WebID):
> Personal: http://kingsley.idehen.net/dataspace/person/kidehen#this
> : 
> http://id.myopenlink.net/DAV/home/KingsleyUyiIdehen/Public/kingsley.ttl#this
>
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 

*nav...@histropedia.com *

@NavinoEvans 

-

   www.histropedia.com

Twitter | Facebook | Google+

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] New features for Wikidata Query Timeline tool

2016-12-13 Thread Navino Evans
Hi all,

We’ve just released some cool new features for the Histropedia Wikidata
Query Timeline  tool.

Here's a quick overview of what's new, along with some examples (more
examples available in the drop-down menu on the query input page):

*1. Vertical spacing controls + auto-fit mode*
A new control panel on the timeline for controlling the space between rows
of events.
This includes an 'auto' mode that keeps everything visible as you zoom and
scroll (set to ‘on’ by default for small timelines).
*Example: *


   - Heritage structures in London – should load in auto mode. Try zooming
   in to an area with your mousewheel to see it auto adjust. You can click
   on the arrows in the new panel (right of screen) to adjust the spacing
   manually.



*2. 'Colour scale' colour coding*
This is created automatically if you choose to colour code by a variable
that returns a number (e.g. population, height, etc).
*Examples:*
*Click the droplet icon to see the colour code key*


   - Discovery of the chemical elements – colour coded by atomic number
   (shows that heavy elements are discovered much later than light ones)
   - Things located within 20km of the Statue of Liberty – colour coded by
   distance from the Statue of Liberty (light colours are closest)



*3. Multiple filters*
You can now have multiple filter options on a timeline (previously only
allowed a single filter option)
*Examples:*
*Click on the filter icon to see the available filters.*


   - The Louvre Collections – filter by creator, genre, movement, material
   used and room
   - People born on this day – filter by gender, occupation, education,
   cause of death and ethnic group



*4. Automatic detection of timeline data from the SPARQL query*
Just paste a SPARQL query into the input box on the query input page and
click 'generate timeline'. Any query that has the timeline view available
on the Wikidata Query Service should work automatically.
You can then optionally use the 'map variables' section to add *date
precision*, *colour codes *and *filters*, or to override anything that
automatic detection got wrong.



*5. New help popups *
There are now lots of new instructions on the query input page (just click
on the little help icons). This includes info on how to add colour codes
and filters to a timeline.



Let me know if you have any feedback or suggestions!


Cheers :)


Navino

-- 

*nav...@histropedia.com *

@NavinoEvans 
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Data import hub, data preparation instructions and import workflow for muggles

2016-11-25 Thread Navino Evans
Many thanks for the info Marco :)

I'll get in touch for an API when I get some time to try that out.

Best,

Navino

On 22 November 2016 at 20:28, Marco Fossati <foss...@spaziodati.eu> wrote:

> Hi Navino,
>
> Currently, there is an (undocumented and untested) API endpoint accepting
> POST requests as QuickStatements datasets:
> https://github.com/Wikidata/primarysources/tree/master/backend#import-statements
> If you want to try it, feel free to privately ping me for an API token.
>
> As a side note, the primary sources tool is undergoing a Wikimedia
> Foundation grant renewal request to give it a radical uplift:
> https://meta.wikimedia.org/wiki/Grants:IEG/StrepHit:_Wikidata_Statements_Validation_via_References/Renewal
>
> Best,
>
> Marco
>
> On 11/22/16 14:00, Navino Evans wrote:
>
>> Thanks Marco!
>>
>> Do you know if there's a system in place yet for adding new data to the
>> Primary Sources Tool?  I thought it was still only covering Freebase
>> data at the moment, but it should be in the import guide for sure if it
>> can be used for new data sets already.
>>
>> Cheers,
>>
>> Navino
>>
>>
>>
>>
>> On 22 November 2016 at 09:43, Marco Fossati <foss...@spaziodati.eu
>> <mailto:foss...@spaziodati.eu>> wrote:
>>
>> Hi John, Navino,
>>
>> the primary sources tool uses the QuickStatements syntax for
>> large-scale non-curated dataset imports, see:
>> https://www.wikidata.org/wiki/Wikidata:Data_donation#3._Work_with_the_Wikidata_community_to_import_the_data
>> https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
>>
>> Best,
>>
>> Marco
>>
>> On 11/21/16 21:39, Navino Evans wrote:
>>
>> I've just added some more to the page in the previously 'coming
>> soon' Self import
>> <https://www.wikidata.org/wiki/User:John_Cummings/wikidataimport_guide#Option_2:_Self_import>
>> section,
>> as it seemed like this is actually the place where
>> QuickStatements and
>> mix'n'match should come in.
>> I've tried to keep the details out, and just give a guide to
>> choosing
>> which tool/approach to use for a particular situation. The
>> mechanics of
>> using the tools etc should all be on other pages I presume.
>>
>> Cheers,
>>
>> Nav
>>
>>
>> On 21 November 2016 at 16:24, john cummings
>> <mrjohncummi...@gmail.com <mailto:mrjohncummi...@gmail.com>
>> <mailto:mrjohncummi...@gmail.com
>> <mailto:mrjohncummi...@gmail.com>>> wrote:
>>
>> Hi Magnus
>>
>> I've avoided mentioning those for now as I know you are
>> working on
>> new tools, also I'm not very good at using them so wouldn't
>> write
>> good instructions :) I hope that once this is somewhere
>> 'proper'
>> that others with more knowledge can add this information in.
>>
>> My main idea with this is to break up the steps so people can
>> collaborate on importing datasets and also learn skills
>> along the
>> workflow over time rather than having to learn everything in
>> one go.
>>
>> Thanks
>>
>> John
>>
>> On 21 November 2016 at 17:11, Magnus Manske
>> <magnusman...@googlemail.com
>> <mailto:magnusman...@googlemail.com>
>> <mailto:magnusman...@googlemail.com
>> <mailto:magnusman...@googlemail.com>>>
>> wrote:
>>
>> There are other options to consider:
>> * Curated import/sync via mix'n'match
>> * Batch-based import via QuickStatements (also see
>> rewrite plans
>>     at
>> https://www.wikidata.org/wiki/User:Magnus_Manske/quick_statements2

Re: [Wikidata] Data import hub, data preparation instructions and import workflow for muggles

2016-11-22 Thread Navino Evans
Thanks Marco!

Do you know if there's a system in place yet for adding new data to the
Primary Sources Tool?  I thought it was still only covering Freebase data
at the moment, but it should be in the import guide for sure if it can be
used for new data sets already.
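As background for readers following this thread, QuickStatements batches are tab-separated commands, one command per line. A minimal sketch of the general shape (the statement on the last line is illustrative; check the tool's documentation for the exact syntax it accepts):

```text
CREATE
LAST	Len	"Example organization"
LAST	P31	Q43229
Q42	P19	Q350	S143	Q328
```

The first three lines create a new item, give it an English label (`Len`), and add an instance-of (P31) statement; the last line adds a place-of-birth (P19) statement to an existing item with an "imported from" (S143) reference. Fields are separated by tabs.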

Cheers,

Navino




On 22 November 2016 at 09:43, Marco Fossati <foss...@spaziodati.eu> wrote:

> Hi John, Navino,
>
> the primary sources tool uses the QuickStatements syntax for large-scale
> non-curated dataset imports, see:
> https://www.wikidata.org/wiki/Wikidata:Data_donation#3._Work_with_the_Wikidata_community_to_import_the_data
> https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
>
> Best,
>
> Marco
>
> On 11/21/16 21:39, Navino Evans wrote:
>
>> I've just added some more to the page in the previously 'coming
>> soon' Self import
>> <https://www.wikidata.org/wiki/User:John_Cummings/wikidataimport_guide#Option_2:_Self_import> section,
>> as it seemed like this is actually the place where QuickStatements and
>> mix'n'match should come in.
>> I've tried to keep the details out, and just give a guide to choosing
>> which tool/approach to use for a particular situation. The mechanics of
>> using the tools etc should all be on other pages I presume.
>>
>> Cheers,
>>
>> Nav
>>
>>
>> On 21 November 2016 at 16:24, john cummings <mrjohncummi...@gmail.com
>> <mailto:mrjohncummi...@gmail.com>> wrote:
>>
>> Hi Magnus
>>
>> I've avoided mentioning those for now as I know you are working on
>> new tools, also I'm not very good at using them so wouldn't write
>> good instructions :) I hope that once this is somewhere 'proper'
>> that others with more knowledge can add this information in.
>>
>> My main idea with this is to break up the steps so people can
>> collaborate on importing datasets and also learn skills along the
>> workflow over time rather than having to learn everything in one go.
>>
>> Thanks
>>
>> John
>>
>> On 21 November 2016 at 17:11, Magnus Manske
>> <magnusman...@googlemail.com <mailto:magnusman...@googlemail.com>>
>> wrote:
>>
>> There are other options to consider:
>> * Curated import/sync via mix'n'match
>> * Batch-based import via QuickStatements (also see rewrite plans
>> at https://www.wikidata.org/wiki/User:Magnus_Manske/quick_statements2 )
>>
>> On Mon, Nov 21, 2016 at 3:11 PM john cummings
>> <mrjohncummi...@gmail.com <mailto:mrjohncummi...@gmail.com>>
>> wrote:
>>
>> Dear all
>>
>>
>> Myself and Navino Evans have been working on a bare bone as
>> possible workflow and instructions for making importing data
>> into Wikidata available to muggles like me. We have written
>> instructions up to the point where people would make a
>> request on the 'bot requests' page to import the data into
>> Wikidata.
>>
>>
>> Please take a look and share your thoughts
>>
>>
>> https://www.wikidata.org/wiki/User:John_Cummings/Dataimporthub
>>
>> https://www.wikidata.org/wiki/User:John_Cummings/wikidataimport_guide
>>
>>
>> Thanks very much
>>
>>
>> John
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> <mailto:Wikidata@lists.wikimedia.org>
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>> <https://lists.wikimedia.org/mailman/listinfo/wikidata>
>>
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org <mailto:Wikidata@lists.wikimedia.org
>> >
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>> <https://lists.wikimedia.org/mailman/listinfo/wikidata>
>>
>>
>>
>> ___
>> Wikidata mailing list
>> 

Re: [Wikidata] Data import hub, data preparation instructions and import workflow for muggles

2016-11-21 Thread Navino Evans
I've just added some more to the page in the previously 'coming soon' Self
import
<https://www.wikidata.org/wiki/User:John_Cummings/wikidataimport_guide#Option_2:_Self_import>
section,
as it seemed like this is actually the place where QuickStatements and
mix'n'match should come in.
I've tried to keep the details out, and just give a guide to choosing which
tool/approach to use for a particular situation. The mechanics of using the
tools etc should all be on other pages I presume.

Cheers,

Nav


On 21 November 2016 at 16:24, john cummings <mrjohncummi...@gmail.com>
wrote:

> Hi Magnus
>
> I've avoided mentioning those for now as I know you are working on new
> tools, also I'm not very good at using them so wouldn't write good
> instructions :) I hope that once this is somewhere 'proper' that others
> with more knowledge can add this information in.
>
> My main idea with this is to break up the steps so people can collaborate
> on importing datasets and also learn skills along the workflow over time
> rather than having to learn everything in one go.
>
> Thanks
>
> John
>
> On 21 November 2016 at 17:11, Magnus Manske <magnusman...@googlemail.com>
> wrote:
>
>> There are other options to consider:
>> * Curated import/sync via mix'n'match
>> * Batch-based import via QuickStatements (also see rewrite plans at
>> https://www.wikidata.org/wiki/User:Magnus_Manske/quick_statements2 )
>>
>> On Mon, Nov 21, 2016 at 3:11 PM john cummings <mrjohncummi...@gmail.com>
>> wrote:
>>
>>> Dear all
>>>
>>>
>>> Myself and Navino Evans have been working on a bare bone as possible
>>> workflow and instructions for making importing data into Wikidata available
>>> to muggles like me. We have written instructions up to the point where
>>> people would make a request on the 'bot requests' page to import the data
>>> into Wikidata.
>>>
>>>
>>> Please take a look and share your thoughts
>>>
>>>
>>> https://www.wikidata.org/wiki/User:John_Cummings/Dataimporthub
>>>
>>> https://www.wikidata.org/wiki/User:John_Cummings/wikidataimport_guide
>>>
>>>
>>> Thanks very much
>>>
>>>
>>> John
>>> ___
>>> Wikidata mailing list
>>> Wikidata@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>>
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 

*nav...@histropedia.com <nav...@histropedia.com>*

@NavinoEvans <https://twitter.com/NavinoEvans>

-

   www.histropedia.com

Twitter <https://twitter.com/Histropedia> | Facebook
<https://www.facebook.com/Histropedia> | Google+
<https://plus.google.com/+Histropedia>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Render sparql queries using the Histropedia timeline engine

2016-08-11 Thread Navino Evans
Yay, fun indeed!
I can see all of the BCE dates are out by one on the timeline, will get
that fixed.

Thanks a lot for updating the JSON spec and filling me in on the details,
that's cleared a few things up :-)
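For anyone hitting the same off-by-one, the two year-numbering conventions Daniel describes in the quoted message can be sketched as follows (a minimal illustration, not code from either project):

```python
def bce_from_json(year: int) -> str:
    """Wikidata JSON (traditional numbering): -44 means 44 BCE, no year zero."""
    return f"{-year} BCE" if year < 0 else f"{year} CE"

def bce_from_rdf(year: int) -> str:
    """RDF / XSD 1.1 (astronomical numbering): 0 means 1 BCE, -43 means 44 BCE."""
    return f"{1 - year} BCE" if year <= 0 else f"{year} CE"

print(bce_from_json(-44))  # 44 BCE
print(bce_from_rdf(-43))   # 44 BCE
print(bce_from_rdf(0))     # 1 BCE
```

The same historical year is therefore represented by numbers that differ by one depending on whether the value came from the JSON dump or from the SPARQL/RDF endpoint.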

On 11 Aug 2016 12:40, "Daniel Kinzler" <daniel.kinz...@wikimedia.de> wrote:

> Hi Navino!
>
> Thank you for your awesome work!
>
> Since this has caused some confusion again recently, I want to caution you
> about
> a major gotcha regarding dates in RDF and JSON: they use different
> conventions
> to represent years BCE. I just updated our JSON spec to reflect that
> reality,
> see <https://www.mediawiki.org/wiki/Wikibase/DataModel/JSON#time>.
>
> There is a lot of confusion about this issue throughout the linked data
> web,
> since the convention changed between XSL 1.0 (which uses -0044 to
> represent 44
> BCE, and -0001 to represent 1 BCE) and XSL 1.1 (which uses -0043 to
> represent 44
> BCE, and + to represent 1 BCE). Our JSON uses the traditional
> numbering (1
> BCE is -0001), while RDF uses the astronomical numbering (1 BCE is +).
>
> Yay, fun.
>
> Am 10.08.2016 um 21:49 schrieb Navino Evans:
> > Hi all,
> >
> >
> >
> > At long last, we’re delighted to announce you can now render sparql
> queries
> > using the Histropedia timeline engine \o/
> >
> >
> > Histropedia WikidataQuery Viewer
> > <http://histropedia.com/showcase/wikidata-viewer.html>
>
>
> --
> Daniel Kinzler
> Senior Software Developer
>
> Wikimedia Deutschland
> Gesellschaft zur Förderung Freien Wissens e.V.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Render sparql queries using the Histropedia timeline engine

2016-08-10 Thread Navino Evans
Cheers Dan! :-)

I actually still need to sort out a bug reporting system, but I've got that
one noted down for now.

Will add a link to the app soon for a better way to report things.

Best,

Navino

On 10 Aug 2016 21:36, "Dan Garry" <dga...@wikimedia.org> wrote:

Neat! It's always exciting to see awesome things like this built on top of
the Wikidata Query Service.

One small piece of feedback: the timelines look quite blurry
<https://i.imgur.com/k56aeFD.png> on my computer. Should I file this as a
bug report somewhere? :-)

Thanks, and keep up the great work!

Dan

On 10 August 2016 at 12:49, Navino Evans <nav...@histropedia.com> wrote:

> Hi all,
>
>
>
> At long last, we’re delighted to announce you can now render sparql
> queries using the Histropedia timeline engine \o/
>
>
> Histropedia WikidataQuery Viewer
> <http://histropedia.com/showcase/wikidata-viewer.html>
>
>
>
> Unlike the main Histropedia site this tool renders timelines with data
> directly from live Wikidata queries. It lets you map query variables to
> values used to render the timeline. A few notable extra features compared
> with the built in timeline view on the Wikidata query service:
>
> *Precision* - You can render each event according to the precision of the
> date (as long as you add date precision to your query). It will default to
> day precision if you leave this out.
>
> *Rank *– The events on the timeline have a rank defined by the order of
> your sparql query results. You can also choose a query variable to use for
> rank, but it’s not really needed if you use ORDER BY in your query to
> control the order of results. Higher ranked events are placed more
> prominently on the timeline.
>
> *URL* – You can choose whichever URL you like from your query results,
> which will be opened in a new tab when you double click on an event on the
> timeline.
>
> *Automatic colour code / filter* – You can choose any variable in your
> sparql query to use for colour coding and filtering. From what I could tell
> from the preview, this seems to be the same as the new map layers feature
> that is close to launch on the Wikidata Query service (which looks awesome
> by the way!)
>
> Also similar to the ‘group by property’ feature on Magnus’ Listeria tool,
> but using an arbitrary variable from the sparql results instead of a
> Wikidata property.
>
>
> *Some cool examples:*
>
> Note: click on the droplet icon (top right) to see the colour code key and
> filter options
>
>
>- Discoveries about planetary systems, colour coded by type of object
><http://tinyurl.com/zlqupz9> (only items with an image and discoverer)
>- Whose birthday is today? colour coded by country of citizenship
><http://tinyurl.com/hla7nqb>
>- Oil paintings at the Louvre, colour coded by creator
><http://tinyurl.com/zu7cygv>
>- Descendants of Alfred the Great, colour coded by religion, in
>Japanese <http://tinyurl.com/h75utbg> – Note: select ‘no value’ in the
>filter panel for a fun edit list of people missing religion statement
>:)
>
> More examples on a dropdown list from the query input page
> <http://www.histropedia.com/showcase/wikidata-viewer.html> in the tool.
>
>
>
>
> The tool has been created by myself and fellow Histropedia co-founder Sean
> using our newly released JavaScript library. We are only just learning to
> code, and it’s a very early stage app so please let me know if anything
> breaks!
>
>
> You can find more info on the JS library (called HistropediaJS) on this
> announcement from the Histropedia mailing list
> <https://groups.google.com/forum/?utm_medium=email_source=footer#!topic/histropedia-i/5_9_nBqvMx0>
>
>
>
>
> Cheers!
>
>
>
> Navino
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 
Dan Garry
Lead Product Manager, Discovery
Wikimedia Foundation

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Render sparql queries using the Histropedia timeline engine

2016-08-10 Thread Navino Evans
Hi all,



At long last, we’re delighted to announce you can now render sparql queries
using the Histropedia timeline engine \o/


Histropedia WikidataQuery Viewer




Unlike the main Histropedia site this tool renders timelines with data
directly from live Wikidata queries. It lets you map query variables to
values used to render the timeline. A few notable extra features compared
with the built in timeline view on the Wikidata query service:

*Precision* - You can render each event according to the precision of the
date (as long as you add date precision to your query). It will default to
day precision if you leave this out.
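For example, a query can expose each date's stated precision via the p:/psv: path in the Wikibase RDF model (a sketch with arbitrary variable names; precision 9 = year, 10 = month, 11 = day):

```sparql
# People with their date of birth and its stated precision.
SELECT ?person ?personLabel ?dob ?precision WHERE {
  ?person wdt:P31 wd:Q5 ;
          p:P569/psv:P569 ?dobNode .
  ?dobNode wikibase:timeValue ?dob ;
           wikibase:timePrecision ?precision .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}
LIMIT 50
```

Mapping the ?precision variable in the tool then lets each event render at year, month or day granularity.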

*Rank *– The events on the timeline have a rank defined by the order of
your sparql query results. You can also choose a query variable to use for
rank, but it’s not really needed if you use ORDER BY in your query to
control the order of results. Higher ranked events are placed more
prominently on the timeline.

*URL* – You can choose whichever URL you like from your query results,
which will be opened in a new tab when you double click on an event on the
timeline.

*Automatic colour code / filter* – You can choose any variable in your
sparql query to use for colour coding and filtering. From what I could tell
from the preview, this seems to be the same as the new map layers feature
that is close to launch on the Wikidata Query service (which looks awesome
by the way!)

Also similar to the ‘group by property’ feature on Magnus’ Listeria tool,
but using an arbitrary variable from the sparql results instead of a
Wikidata property.


*Some cool examples:*

Note: click on the droplet icon (top right) to see the colour code key and
filter options


   - Discoveries about planetary systems, colour coded by type of object
   (only items with an image and discoverer)
   - Whose birthday is today? colour coded by country of citizenship
   - Oil paintings at the Louvre, colour coded by creator
   - Descendants of Alfred the Great, colour coded by religion, in Japanese
   – Note: select 'no value' in the filter panel for a fun edit list of
   people missing a religion statement :)

More examples on a dropdown list from the query input page
 in the tool.




The tool has been created by myself and fellow Histropedia co-founder Sean
using our newly released JavaScript library. We are only just learning to
code, and it’s a very early stage app so please let me know if anything
breaks!


You can find more info on the JS library (called HistropediaJS) on this
announcement from the Histropedia mailing list





Cheers!



Navino
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Wikidata helpers in Edinburgh for Repository Fringe 2016 ?

2016-06-01 Thread Navino Evans
Hi all,

Just posting this on the off chance of finding some Wikidatans based in or
around Edinburgh who would be free to help out in a practical Wikidata
editing session on 1st August?

Here's a link to the event: http://rfringe16.blogs.edina.ac.uk/

The session is basically an introduction to Wikidata, demo of some of the
coolest things about Wikidata, then practical session to teach everyone how
to edit.


We're expecting an audience of up to 64 people, and currently have two
people with Wikidata knowledge, so a couple more would be very handy for
the practical part.

Cheers :)

Nav



-- 
___

The Timeline of Everything

www.histropedia.com

Twitter | Facebook | Google+ | LinkedIn

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Navino Evans
So much exciting news lately! many congratulations! :)

On 9 September 2015 at 22:29, David Cuenca Tudela  wrote:

> Just created "area" after two years of waiting! yay! congratulations! \o/
>
> On Wed, Sep 9, 2015 at 11:08 PM, Lydia Pintscher <
> lydia.pintsc...@wikimedia.de> wrote:
>
>> On Wed, Sep 9, 2015 at 9:49 PM, Lydia Pintscher
>>  wrote:
>> > Hey everyone :)
>> >
>> > As promised we just enabled support for quantities with units on
>> > Wikidata. So from now on you'll be able to store fancy things like the
>> > height of a mountain or the boiling point of an element.
>> >
>> > Quite a few properties have been waiting on unit support before they
>> > are created. I assume they will be created in the next hours and then
>> > you can go ahead and add all of the measurements.
>>
>> For anyone who is curious: Here is the list of properties already
>> created since unit support is available:
>>
>> https://www.wikidata.org/w/index.php?title=Special:ListProperties/quantity=50=103
>> and here is the list of properties that were waiting on unit support:
>> https://www.wikidata.org/wiki/Wikidata:Property_proposal/Pending/2
>> Those should change over the next hours/days.
>>
>>
>> Cheers
>> Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher
>> Product Manager for Wikidata
>>
>> Wikimedia Deutschland e.V.
>> Tempelhofer Ufer 23-24
>> 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>
>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>
>
>
> --
> Etiamsi omnes, ego non
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 
___

The Timeline of Everything

www.histropedia.com

Twitter | Facebook | Google+ | LinkedIn

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Upcoming short presentation in London

2015-06-22 Thread Navino Evans
Hi all,

I will be doing a short presentation about Wikidata at an ISKO conference
in London Knowledge Organization – making a difference
http://www.iskouk.org/content/knowledge-organization-making-difference
(13th - 14th July)

The title of the talk is Wikidata: the potential of structured data to
enable applications such as Histropedia

Unfortunately it's a bit misleading as I've changed the focus of the talk,
but missed the boat for changing the title. It's actually going to be
entirely about Wikidata, with a brief mention of a range of third party
uses (the description in the programme will reflect this though).

The organisers have asked that I spread the word about the conference, so
please do let anyone know who may be interested in attending :)

Best wishes,


Navino
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Upcoming short presentation in London

2015-06-22 Thread Navino Evans
Thanks Gerard! will do :)

Best wishes,

Nav

On 22 June 2015 at 20:38, Gerard Meijssen gerard.meijs...@gmail.com wrote:

 Hoi,
 Have fun :) and do the good job :) you have been doing :)
 Thanks,
  GerardM

 On 22 June 2015 at 14:07, Navino Evans nav...@histropedia.com wrote:

 Hi all,

 I will be doing a short presentation about Wikidata at an ISKO conference
 in London Knowledge Organization – making a difference
 http://www.iskouk.org/content/knowledge-organization-making-difference
 (13th - 14th July)

 The title of the talk is Wikidata: the potential of structured data to
 enable applications such as Histropedia

 Unfortunately it's a bit misleading as I've changed the focus of the
 talk, but missed the boat for changing the title. It's actually going to be
 entirely about Wikidata, with a brief mention of a range of third party
 uses (the description in the programme will reflect this though).

 The organisers have asked that I spread the word about the conference, so
 please do let anyone know who may be interested in attending :)

 Best wishes,


 Navino

 ___
 Wikidata mailing list
 Wikidata@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata



 ___
 Wikidata mailing list
 Wikidata@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata




-- 
___

The Timeline of Everything

www.histropedia.com

Twitter: https://twitter.com/Histropedia
Facebook: https://www.facebook.com/Histropedia
Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
LinkedIn: http://www.linkedin.com/company/histropedia-ltd
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata-l] Query generator spreadsheet

2015-03-18 Thread Navino Evans
Thanks Thomas, that looks really handy. I definitely think we need more
template based solutions like this.

Best,

Navino
On 16 Mar 2015 20:34, Thomas Douillard thomas.douill...@gmail.com wrote:

 Looks cool, see also wrote the beginning of a query generator in mediawiki
 templates : https://www.wikidata.org/wiki/Template:WDQ

 2015-03-09 22:03 GMT+01:00 Navino Evans nav...@histropedia.com:

 Hi all,

 We've been using WDQ queries a lot recently to update timelines in the
 Histropedia directory and, while trying to speed up the process, I ended up
 creating a very crude query generator tool over the weekend. After getting
 a bit carried away with it, it seemed worth sharing as it could actually be
 a handy tool for Wikidata editors, or anyone else interested in
 experimenting with queries.

 I have no real coding experience, so it's just made in a humble
 spreadsheet! You can choose the input values for a query and the links to
 Histropedia, AutoList and WDQ will update automatically. The idea is
 obviously that this sort of tool should be written in Javascript by someone
 who actually knows what they're doing ;) but they do work and can easily be
 customised by adding more Wikidata items to the available options.

 I've made 3 different types of query generator so far, which I'll publish
 shortly as free templates on Google Sheets:

 1) Date range query generator
 https://docs.google.com/spreadsheets/d/1A8wyqVc5USJ_T8ncFfYIsmJzfy8LIdmH2QLgeyegRAM/edit?usp=sharing


 2) People finder query generator
 https://docs.google.com/spreadsheets/d/16-x7NGyHTAtEJBa5uF9ZYBhNj6BhhCJrtDaKCcJIFuE/edit?usp=sharing


 3) Family tree query generator
 https://docs.google.com/spreadsheets/d/1dNTBoi-QbI0t-t0Ba4Ex5-sx7Q03qRecE27bapQGpAw/edit?usp=sharing


 They are publicly editable spreadsheets so feel free to make a copy if
 someone is using the one you land on. And certainly don't hold back if
 anyone wishes to make any improvements.

 I hope someone finds it useful/fun to mess around with these, but I
 thought it was a nice story to share in itself - that a non-coder can sit
 down and create something genuinely useful by tapping into the power of
 Wikidata, and of course Magnus' amazing tools! :)

 Cheers,

 Navino

 --
 ___

 Histropedia

 The Timeline for all of History
 www.histropedia.com

 Follow us on:
 Twitter: https://twitter.com/Histropedia
 Facebook: https://www.facebook.com/Histropedia
 Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
 LinkedIn: http://www.linkedin.com/company/histropedia-ltd


 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l


___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Query generator spreadsheet

2015-03-10 Thread Navino Evans
Cheers Fabian, Magnus!

Just had a quick look at the import-json function - looks pretty awesome
:-)  I probably won't be able to resist using it to make a mark-II mock up
at some point.

Thanks for the heads up about the upcoming Wikidata graph database, I was
aware that was in the pipeline. We'll probably crack on with trying to make
a couple of these query builders properly anyway, and make modifications as
required in the future (after learning some SPARQL by the sounds of it!).

On 10 March 2015 at 09:39, Magnus Manske magnusman...@googlemail.com
wrote:

 Nice! One could even display the results in the spreadsheet, using this:
 http://blog.fastfedora.com/projects/import-json
 (haven't tried, though)

 Note that there is a proper Wikidata graph database being developed by
 WMF. Once that reaches production, WDQ will likely become a wrapper around
 that database (for backwards compatibility). SPARQL (shudder) seems to be
 the future...

 On Tue, Mar 10, 2015 at 9:14 AM Fabian Tompsett 
 fabian.tomps...@wikimedia.org.uk wrote:

 Great work, Navino!

 I agree, using google docs for the prototype is a really good way of
 getting something started

 all the best

 Fabian Tompsett,
 Volunteer Support Organiser,
 Wikimedia UK,
 Address: 56-64 Leonard St,
 Shoreditch,
 London EC2A 4LT
 Phone: 020 7065 0990
 Mobile: 07840 455 746


 Wikimedia UK is a Company Limited by Guarantee registered in England and
 Wales, Registered No. 6741827. Registered Charity No.1144513. Registered
 Office 4th Floor, Development House, 56-64 Leonard Street, London EC2A
 4LT. United Kingdom. Wikimedia UK is the UK chapter of a global
 Wikimedia movement. The Wikimedia projects are run by the Wikimedia
 Foundation (who operate Wikipedia, amongst other projects).

 Visit http://www.wikimedia.org.uk/ and @wikimediauk

 On 9 March 2015 at 21:42, Navino Evans nav...@histropedia.com wrote:

 Many thanks Markus :-)

 The plan for us now is to get our developer to start making some of
 these 'query modules', and publish them as open source. They would be
 use-able as a stand-alone tool, but more importantly be embeddable and
 easily customisable in layout etc so that third parties could use them
 easily. The hope going forward would be that we could eventually create an
 open source, searchable library of such query modules, each one specialised
 enough that they can be very easily understood by a human putting in the
 data.

 Navino


 On 9 March 2015 at 21:32, Markus Krötzsch mar...@semantic-mediawiki.org
  wrote:

 Awesome work :-). I love your use of Google Docs as a UI prototyping
 tool. We could really use a few more special-purpose querying tools.

 Markus

 On 09.03.2015 22:03, Navino Evans wrote:

 Hi all,

 We've been using WDQ queries a lot recently to update timelines in the
 Histropedia directory and, while trying to speed up the process, I
 ended
 up creating a very crude query generator tool over the weekend. After
 getting a bit carried away with it, it seemed worth sharing as it could
 actually be a handy tool for Wikidata editors, or anyone else
 interested
 in experimenting with queries.

 I have no real coding experience, so it's just made in a humble
 spreadsheet! You can choose the input values for a query and the links
 to Histropedia, AutoList and WDQ will update automatically. The idea is
 obviously that this sort of tool should be written in Javascript by
 someone who actually knows what they're doing ;) but they do work and
 can easily be customised by adding more Wikidata items to the available
 options.

 I've made 3 different types of query generator so far, which I'll
 publish shortly as free templates on Google Sheets:

 1) Date range query generator
 https://docs.google.com/spreadsheets/d/1A8wyqVc5USJ_
 T8ncFfYIsmJzfy8LIdmH2QLgeyegRAM/edit?usp=sharing


 2) People finder query generator
 https://docs.google.com/spreadsheets/d/16-
 x7NGyHTAtEJBa5uF9ZYBhNj6BhhCJrtDaKCcJIFuE/edit?usp=sharing


 3) Family tree query generator
 https://docs.google.com/spreadsheets/d/1dNTBoi-QbI0t-t0Ba4Ex5-
 sx7Q03qRecE27bapQGpAw/edit?usp=sharing


 They are publicly editable spreadsheets so feel free to make a copy if
 someone is using the one you land on. And certainly don't hold back if
 anyone wishes to make any improvements.

 I hope someone finds it useful/fun to mess around with these, but I
 thought it was a nice story to share in itself - that a non-coder can
 sit down and create something genuinely useful by tapping into the
 power
 of Wikidata, and of course Magnus' amazing tools! :)

 Cheers,

 Navino

 --
 ___

 Histropedia

 The Timeline for all of History
 www.histropedia.com

 Follow us on:
 Twitter: https://twitter.com/Histropedia
 Facebook: https://www.facebook.com/Histropedia
 Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
 LinkedIn: http://www.linkedin.com/company/histropedia-ltd
Re: [Wikidata-l] Query generator spreadsheet

2015-03-10 Thread Navino Evans
That's good to know. Looking forward to playing around with it when there's
a test server up and running.

Best,

Navino

On 10 March 2015 at 13:28, Markus Kroetzsch markus.kroetz...@tu-dresden.de
wrote:

 On 10.03.2015 14:15, Navino Evans wrote:

 Cheers Fabian, Magnus!

 Just had a quick look at the import-json function - looks pretty awesome
 :-)  I probably won't be able to resist using it to make a mark-II mock
 up at some point.

 Thanks for the heads up about the upcoming Wikidata graph database, I
 was aware that was in the pipeline. We'll probably crack on with trying
 to make a couple of these query builders properly anyway, and make
 modifications as required in the future (after learning some SPARQL by
 the sounds of it!).


 This should be a minor issue. SPARQL is complex, but you can create
 queries as in your UIs just like you do for WDQ, only using different text
 templates for expressing the various conditions. But once you have these,
 it is quite easy to use as well.
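A minimal sketch of that template-swap idea, assuming today's WDQS vocabulary (wdt:/wd: prefixes); the template names and QIDs are illustrative:

```python
# The UI keeps working from text templates; only the template text
# changes when moving from WDQ to SPARQL. Templates/QIDs illustrative.
TEMPLATES = {
    "instance_of": "?item wdt:P31 wd:Q{qid} .",
    "occupation":  "?item wdt:P106 wd:Q{qid} .",
}

def build_sparql(conditions):
    # Each condition is a (template_name, numeric_qid) pair.
    body = "\n  ".join(TEMPLATES[name].format(qid=qid) for name, qid in conditions)
    return "SELECT ?item WHERE {\n  " + body + "\n}"

print(build_sparql([("instance_of", 5), ("occupation", 901)]))
```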

 We hope that we can set up a test server very soon for people to try out
 SPARQL on Wikidata even before the official WMF service goes online.

 Cheers,

 Markus

 --
 Markus Kroetzsch
 Faculty of Computer Science
 Technische Universität Dresden
 +49 351 463 38486
 http://korrekt.org/


 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




-- 
___

Histropedia

The Timeline for all of History
www.histropedia.com

Follow us on:
Twitter: https://twitter.com/Histropedia
Facebook: https://www.facebook.com/Histropedia
Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
LinkedIn: http://www.linkedin.com/company/histropedia-ltd
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Query generator spreadsheet

2015-03-09 Thread Navino Evans
Many thanks Markus :-)

The plan for us now is to get our developer to start making some of these
'query modules', and publish them as open source. They would be use-able as
a stand-alone tool, but more importantly be embeddable and easily
customisable in layout etc so that third parties could use them easily. The
hope going forward would be that we could eventually create an open source,
searchable library of such query modules, each one specialised enough that
they can be very easily understood by a human putting in the data.

Navino


On 9 March 2015 at 21:32, Markus Krötzsch mar...@semantic-mediawiki.org
wrote:

 Awesome work :-). I love your use of Google Docs as a UI prototyping tool.
 We could really use a few more special-purpose querying tools.

 Markus

 On 09.03.2015 22:03, Navino Evans wrote:

 Hi all,

 We've been using WDQ queries a lot recently to update timelines in the
 Histropedia directory and, while trying to speed up the process, I ended
 up creating a very crude query generator tool over the weekend. After
 getting a bit carried away with it, it seemed worth sharing as it could
 actually be a handy tool for Wikidata editors, or anyone else interested
 in experimenting with queries.

 I have no real coding experience, so it's just made in a humble
 spreadsheet! You can choose the input values for a query and the links
 to Histropedia, AutoList and WDQ will update automatically. The idea is
 obviously that this sort of tool should be written in Javascript by
 someone who actually knows what they're doing ;) but they do work and
 can easily be customised by adding more Wikidata items to the available
 options.

 I've made 3 different types of query generator so far, which I'll
 publish shortly as free templates on Google Sheets:

 1) Date range query generator
 https://docs.google.com/spreadsheets/d/1A8wyqVc5USJ_
 T8ncFfYIsmJzfy8LIdmH2QLgeyegRAM/edit?usp=sharing


 2) People finder query generator
 https://docs.google.com/spreadsheets/d/16-x7NGyHTAtEJBa5uF9ZYBhNj6BhhCJr
 tDaKCcJIFuE/edit?usp=sharing


 3) Family tree query generator
 https://docs.google.com/spreadsheets/d/1dNTBoi-QbI0t-t0Ba4Ex5-
 sx7Q03qRecE27bapQGpAw/edit?usp=sharing


 They are publicly editable spreadsheets so feel free to make a copy if
 someone is using the one you land on. And certainly don't hold back if
 anyone wishes to make any improvements.

 I hope someone finds it useful/fun to mess around with these, but I
 thought it was a nice story to share in itself - that a non-coder can
 sit down and create something genuinely useful by tapping into the power
 of Wikidata, and of course Magnus' amazing tools! :)

 Cheers,

 Navino

 --
 ___

 Histropedia

 The Timeline for all of History
 www.histropedia.com

 Follow us on:
 Twitter: https://twitter.com/Histropedia
 Facebook: https://www.facebook.com/Histropedia
 Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
 LinkedIn: http://www.linkedin.com/company/histropedia-ltd



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




-- 
___

Histropedia

The Timeline for all of History
www.histropedia.com

Follow us on:
Twitter: https://twitter.com/Histropedia
Facebook: https://www.facebook.com/Histropedia
Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
LinkedIn: http://www.linkedin.com/company/histropedia-ltd
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] Query generator spreadsheet

2015-03-09 Thread Navino Evans
Hi all,

We've been using WDQ queries a lot recently to update timelines in the
Histropedia directory and, while trying to speed up the process, I ended up
creating a very crude query generator tool over the weekend. After getting
a bit carried away with it, it seemed worth sharing as it could actually be
a handy tool for Wikidata editors, or anyone else interested in
experimenting with queries.

I have no real coding experience, so it's just made in a humble
spreadsheet! You can choose the input values for a query and the links to
Histropedia, AutoList and WDQ will update automatically. The idea is
obviously that this sort of tool should be written in Javascript by someone
who actually knows what they're doing ;) but they do work and can easily be
customised by adding more Wikidata items to the available options.

I've made 3 different types of query generator so far, which I'll publish
shortly as free templates on Google Sheets:

1) Date range query generator
https://docs.google.com/spreadsheets/d/1A8wyqVc5USJ_T8ncFfYIsmJzfy8LIdmH2QLgeyegRAM/edit?usp=sharing


2) People finder query generator
https://docs.google.com/spreadsheets/d/16-x7NGyHTAtEJBa5uF9ZYBhNj6BhhCJrtDaKCcJIFuE/edit?usp=sharing


3) Family tree query generator
https://docs.google.com/spreadsheets/d/1dNTBoi-QbI0t-t0Ba4Ex5-sx7Q03qRecE27bapQGpAw/edit?usp=sharing


They are publicly editable spreadsheets so feel free to make a copy if
someone is using the one you land on. And certainly don't hold back if
anyone wishes to make any improvements.

I hope someone finds it useful/fun to mess around with these, but I thought
it was a nice story to share in itself - that a non-coder can sit down and
create something genuinely useful by tapping into the power of Wikidata,
and of course Magnus' amazing tools! :)

Cheers,

Navino

-- 
___

Histropedia

The Timeline for all of History
www.histropedia.com

Follow us on:
Twitter: https://twitter.com/Histropedia
Facebook: https://www.facebook.com/Histropedia
Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
LinkedIn: http://www.linkedin.com/company/histropedia-ltd
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] WikiData for Research Project Idea: Structured History

2014-12-31 Thread Navino Evans
Thanks for sharing those links Susanna :)  really fascinating to find out
about the related projects and discussions.

I'll join in to any discussions about this on Wikidata - it's very relevant
to the long term goals of Histropedia http://www.histropedia.com/
(interactive
timeline powered by Wikidata, which I'm a co-founder of) so I may be able
to add a useful perspective.

I'm not sure where it is on the roadmap, but hopefully we'll soon get
access to the 'before' and 'after' times for dates on Wikidata. I imagine
this will open up a vast range of different uses for digital humanities
projects, that are often dealing with uncertain time ranges. For example,
we plan on using this data in Histropedia to visualise uncertainty in date
ranges of events displayed on the timeline interface.
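In the Wikibase data model, the "before" and "after" fields on a time value are counted in units of the value's stated precision. A hedged sketch of turning them into a year range, assuming year precision (9) and illustrative figures:

```python
# Sketch: derive an uncertainty range from a Wikibase time value whose
# "before"/"after" are in units of its "precision" (9 = year).
# The event value below is illustrative.
def year_range(time_value):
    if time_value["precision"] != 9:
        raise ValueError("sketch handles year precision only")
    year = int(time_value["time"][1:5])  # "+1800-00-00T00:00:00Z" -> 1800
    return (year - time_value["before"], year + time_value["after"])

event = {"time": "+1800-00-00T00:00:00Z", "precision": 9, "before": 2, "after": 3}
print(year_range(event))  # → (1798, 1803)
```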

Regards,

Navino

On 31 December 2014 at 15:20, Susanna Ånäs susanna.a...@wikimedia.fi
wrote:

 There is very interesting and relevant work done by a group of scholars
 about modeling time for recording historical events as structured data.
 Please have a look at http://dh.stanford.edu/topotime/ and
 http://perio.do/narrative/.

 I am following the discussion related to the development of the
 http://www.openhistoricalmap.org/ at
 https://lists.openstreetmap.org/listinfo/historic. In my mind, it would
 be a good idea to keep in sync with and provide insight into the other open
 projects for historical data, in this case geodata. The question of
 standards is being discussed right now :)

 Best,
 Susanna

 2014-12-30 19:45 GMT+02:00 Gerard Meijssen gerard.meijs...@gmail.com:

 Hoi,
 The Wikipedia article on the subject has probably most if not all
 relevant details..
 Thanks,
   GerardM

 On 30 December 2014 at 16:39, Gerard Meijssen gerard.meijs...@gmail.com
 wrote:

 Hoi,
 The most important people, as far as Wikidata is concerned, are the
 Wikidata developers. As long as they indicate that the software conforms to
 the standard we are good.

 There is no problem in them having the standard and publishing what is
 expected of the use of timestamps and time diffs/
 Thanks,
  GerardM

 On 29 December 2014 at 18:00, Paul Houle ontolo...@gmail.com wrote:

 Gerard,  tell me about it.

 It's hard to find anyone who has even seen ISO 8601 so there is not
 general compatibility between tools that accept ISO 8601 (date)?(times?);
  the xsd:datetime (defined mainly as a restriction of ISO 8601) is closer
 to an open standard,  but people aren't so sure about extra digits in the
 date fields,  but maybe we will need them to deal with the year 10000
 problem.

 IEEE 754 is a similar scandal since it hasn't been read by most
 developers,  particularly systems developers,  so it is unlikely that FP
 operations in your favorite language are completely conformant.

 Now IEEE does have the Get802 program which lets you get slightly aged
 documents for networking standards and ISO does release the occasional
 standard for free such as ISO 20022 but there is a big difference between
 those two and the other organizations like the OMG,  W3C,  IETF,  and FIPS
 that publish standards for free and manage to somehow pay the bills.

 On Mon, Dec 29, 2014 at 6:31 AM, Gerard Meijssen 
 gerard.meijs...@gmail.com wrote:

 Hoi.
 The fact that ISO has its standards behind a paywall is its shame.
 However, it does not necessarily imply anything about the use of the
 standard.
 Thanks,
  Gerard

 NB a paywall seriously hampers acceptance of standards

 On 29 December 2014 at 12:20, Jeff Thompson j...@thefirst.org wrote:

  The ISO standard for CIDOC CRM is behind a pay wall with a patent
 notice. Can it be used in an open knowledge system?


 On 2014-12-29 9:49, Dov Winer wrote:

  Hi Sam,

  CIDOC/CRM is the ontology of choice for Structured History
 as it is anchored on modelling events.

  An excellent project based on it is the ResearchSpace from
 the British Museum.
 See:
 http://www.researchspace.org/
 http://www.researchspace.org/home/rsandcrm
 http://cidoc-crm.org/

  Enjoy,
 Dov


 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




 --
 Paul Houle
 Expert on Freebase, DBpedia, Hadoop and RDF
 (607) 539 6254paul.houle on Skype   ontolo...@gmail.com
 http://legalentityidentifier.info/lei/lookup

 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l

Re: [Wikidata-l] statements on properties are live (+deletion propagation)

2014-12-03 Thread Navino Evans
That's fantastic news! :) Just curious... when statements linking to other
properties has been implemented, will that include the ability to start
defining a class tree for the properties? It would be amazing if you could
one day run queries like all items with any subclass of location set to
Berlin, getting all people born in there, items located there etc with a
single simple instruction.
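Once property statements and a SPARQL endpoint both exist, such a query could plausibly be written with a property path over "subproperty of" (P1647). A sketch in today's WDQS syntax, which did not yet exist when this was written (P276 is "location", Q64 is Berlin):

```python
# Sketch only: WDQS-style SPARQL for "items whose value for location,
# or any subproperty of it, is Berlin". P1647 = subproperty of,
# P276 = location, Q64 = Berlin.
QUERY = """\
SELECT DISTINCT ?item WHERE {
  ?prop wdt:P1647* wd:P276 .          # location or any subproperty of it
  ?prop wikibase:directClaim ?claim .
  ?item ?claim wd:Q64 .               # ... whose value is Berlin
}
"""
print(QUERY)
```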

Cheers,

Nav


On 3 December 2014 at 21:34, Magnus Manske magnusman...@googlemail.com
wrote:

 Hooray! :-)

 On Wed, Dec 3, 2014 at 9:18 PM, Lydia Pintscher 
 lydia.pintsc...@wikimedia.de wrote:

 Hey folks :)

 We just enabled statements on properties. You can for example use this to:
 * describe mappings to other projects' vocabularies
 * indicate constraints for the usage of this property (currently
 stored in templates on the talk page of the property)
 * store information about where this property is used in other
 projects to make sure they can be notified when major changes are made
 to how the property is used
 * provide links to extensive documentation about and showcases for the
 usage of this property
 * store patterns for how an identifier should be expanded to form a
 proper URL
 * ...
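The identifier-to-URL patterns mentioned in one of the bullets above follow Wikidata's formatter-URL convention, where $1 marks the slot for the identifier value. A minimal sketch (the VIAF pattern and value are illustrative):

```python
# Wikidata's formatter-URL convention uses $1 as the placeholder for
# the identifier value; pattern and identifier below are illustrative.
def expand_identifier(formatter_url, identifier):
    return formatter_url.replace("$1", identifier)

print(expand_identifier("https://viaf.org/viaf/$1/", "113230702"))
# → https://viaf.org/viaf/113230702/
```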

 Some notes:
 * You can't yet make statements linking to other properties to for
 example indicate that property X is the inverse of property Y. We are
 adding a new datatype for that. It'll come soon. The ticket for that
 is https://phabricator.wikimedia.org/T75302
 * Constraint violation reports currently don't take these statements
 into account. They will continue to use the templates on the
 property's talk page. A team of students is currently working on
 overhauling the whole constraint system. It'll take them a while still
 though.

 Unrelated but also important for you to know: If a page is deleted on
 Wikipedia/Commons/... that has a sitelink in an item on Wikidata that
 link will be removed automatically now.


 Cheers
 Lydia

 --
 Lydia Pintscher - http://about.me/lydia.pintscher
 Product Manager for Wikidata

 Wikimedia Deutschland e.V.
 Tempelhofer Ufer 23-24
 10963 Berlin
 www.wikimedia.de

 Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

 Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
 unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
 Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l



 ___
 Wikidata-l mailing list
 Wikidata-l@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata-l




-- 
___

Histropedia
The Timeline for all of History
www.histropedia.com

Follow us on:
Twitter: https://twitter.com/Histropedia
Facebook: https://www.facebook.com/Histropedia
Google+: https://plus.google.com/u/0/b/104484373317792180682/104484373317792180682/posts
LinkedIn: http://www.linkedin.com/company/histropedia-ltd
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l