We have a hackathon starting tomorrow morning (California time). It would
be fantastic if we could hack on adding our gene wikidata content to a
Wikipedia instance using this new ability. We too have been anxiously
awaiting this development.
Is there a sandbox environment somewhere that we
On Thu, May 7, 2015 at 12:32 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Thu, May 7, 2015 at 9:28 PM, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
We have a hackathon starting tomorrow morning (California time). It would
be fantastic if we could hack on adding our gene
could be generated as a whole.
-Data could be queried/exported from WD based on an ontology by simply
selecting the whole or parts of an ontology.
This approach has been suggested and discussed by Benjamin Good, Elvira
Mitraka, Andra Wagmeester, Andrew Su and me. As an example, we put together
Have you checked tool-labs already?
http://tools.wmflabs.org/hay/directory/#/keyword/wikidata
Maybe you find something useful...
Best,
Claudia
On Dec 15, 2014, at 12:42 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Wed, Dec 10, 2014 at 8:28 PM, Benjamin Good ben.mcgee.g
Are there any tools currently that could do something like this:
Input: A category/class - something that hangs off subclassOf P279
relationships. For example: gene Q7187
Output: an interactive visual representation of the properties that are
being used by the entities connected to this
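A hedged sketch of the tallying step such a tool would need, in plain Python (the item IDs and property lists below are illustrative, not pulled from Wikidata):

```python
from collections import Counter

# Illustrative item records: each maps an item ID to the property IDs used
# in its statements. Real data would come from a Wikidata dump or the API.
items = {
    "Q_gene_a": ["P279", "P351", "P703"],
    "Q_gene_b": ["P279", "P351", "P594"],
    "Q_gene_c": ["P279", "P351", "P703", "P594"],
}

def property_usage(items):
    """Count how often each property is used across a set of items."""
    counts = Counter()
    for props in items.values():
        counts.update(props)
    return counts

usage = property_usage(items)
# Most-used properties first; this ranking is what a visual tool would plot.
print(usage.most_common())
```

A visualization front end would then feed `usage` into whatever charting layer is handy; the counting itself is this simple.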
Sorry, not sure if this is the right place to post this bug report?
https://www.wikidata.org/wiki/Wikidata:List_of_properties/all#Medicine
reports quite a few messages like:
*The time allocated for running scripts has expired.* (repeated several times)
a string, a coordinate, etc.
I *think* the reason you're getting some results for
CLAIM[486:string] is because the system is somehow matching these up
with the items that have null/unknown values for 486 and returning
those.
Andrew.
On 27 October 2014 03:52, Benjamin Good ben.mcgee.g
.
On 24 October 2014 18:13, Benjamin Good ben.mcgee.g...@gmail.com wrote:
Using the autolist 2 tool (http://tools.wmflabs.org/autolist/index.php), I
enter the WDQ query:
CLAIM[486:D008180]
which I think means "Give me the items that have MeSH id = D008180". It
does indeed return an item with that claim Q1495661
but also returns items that have a *no value* as the
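The *no value* behavior can be reproduced in miniature: a matcher that compares stored values without first checking the snak type lets "no value" claims fall through. A sketch using simplified stand-ins for Wikidata's claim JSON (item IDs other than Q1495661 are hypothetical):

```python
# Simplified snaks: "novalue" marks an explicit "no value" claim.
items = {
    "Q1495661": {"P486": {"snaktype": "value", "value": "D008180"}},
    "Q_noval":  {"P486": {"snaktype": "novalue"}},
    "Q_other":  {"P486": {"snaktype": "value", "value": "D000001"}},
}

def naive_match(items, prop, value):
    """Buggy matcher: a snak with no stored value matches anything,
    because the default in .get() swallows the missing datavalue."""
    hits = []
    for qid, claims in items.items():
        snak = claims.get(prop)
        if snak is not None and snak.get("value", value) == value:
            hits.append(qid)
    return hits

def strict_match(items, prop, value):
    """Correct matcher: only snaks that actually carry a value count."""
    hits = []
    for qid, claims in items.items():
        snak = claims.get(prop)
        if (snak is not None and snak.get("snaktype") == "value"
                and snak.get("value") == value):
            hits.append(qid)
    return hits

print(naive_match(items, "P486", "D008180"))   # ['Q1495661', 'Q_noval']
print(strict_match(items, "P486", "D008180"))  # ['Q1495661']
```

This is only a guess at WDQ's internals, but it shows why filtering on snak type fixes the symptom described above.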
, 2014 at 10:54 PM, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
Very cool project! It really shows some great potential for wikidata. In
quickly playing with it I found it difficult to find properties that
matched my intention most of the time. I think this is not the fault of
the tool but an indication of an area where wikidata could be improved
(e.g.
Magnus is right on here. We are hoping to begin things by establishing
wikidata as a central hub for gene identifier lookup and matching. (This
is a big problem in bioinformatics). With this (hopefully fairly
stable) starting point of wikidata entities, we will be expanding the
statements
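The hub idea can be illustrated in a few lines: once an item carries several external gene identifiers, any one of them can be translated to another via the item. A sketch with a placeholder QID (the TP53 identifiers are real; the QID key is not):

```python
# Each hub entry maps a (placeholder) Wikidata QID to the external gene
# identifiers its statements would carry.
hub = {
    "Q_tp53_placeholder": {
        "entrez": "7157",
        "ensembl": "ENSG00000141510",
        "symbol": "TP53",
    },
}

def translate(hub, from_ns, from_id, to_ns):
    """Map an identifier in one namespace to another via the hub item."""
    for qid, ids in hub.items():
        if ids.get(from_ns) == from_id:
            return ids.get(to_ns)
    return None

print(translate(hub, "symbol", "TP53", "entrez"))   # '7157'
```

In practice the hub would be populated from Wikidata itself; the point is that one shared item makes all pairwise identifier mappings fall out for free.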
potential
for structuring large sets of biological data. Thanks for your excellent
work!
Cheers,
Eric
https://www.wikidata.org/wiki/User:Emw
On Mon, Oct 6, 2014 at 4:21 PM, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
I thought folks might like to know that every human gene (according to the
United States National Center for Biotechnology Information) now has a
representative entity on wikidata. I hope that these are the seeds for
some amazing applications in biology and medicine.
Well done Andra and
wrote:
Wow! That's pretty cool work!
Do you have any plans to keep the data fresh?
On Mon Oct 06 2014 at 1:22:12 PM Benjamin Good ben.mcgee.g...@gmail.com
wrote:
I thought folks might like to know that every human gene (according to
the United States National Center for Biotechnology
Would be happy to. Let me know suggested size and how to get it over to
you.
thanks
-Ben
On Mon, Oct 6, 2014 at 1:40 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey Ben and Andra,
On Mon, Oct 6, 2014 at 10:30 PM, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
Yes indeed we do
Sounds good. Will do.
On Mon, Oct 6, 2014 at 1:53 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Mon, Oct 6, 2014 at 10:48 PM, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
Would be happy to. Let me know suggested size and how to get it over to
you.
Sweet. As long as you
Based on the CHEBI ontology perspective, alcohol is a class with subclasses
like 'aromatic alcohol' which has subclasses like 'benzyl alcohols' which
has subclasses like 'methylbenzyl alcohol' and so on.
These relationships seem worth capturing and subclass seems like a
reasonable way to do it.
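The chain above can be sketched as subclass-of (P279) edges with a transitive walk upward:

```python
# The CHEBI-style chain from this thread, modeled as subclass-of edges
# (child -> parent), as P279 statements would express it.
subclass_of = {
    "methylbenzyl alcohol": "benzyl alcohols",
    "benzyl alcohols": "aromatic alcohol",
    "aromatic alcohol": "alcohol",
}

def superclasses(term, rel):
    """Walk the subclass chain upward, collecting all ancestors."""
    ancestors = []
    while term in rel:
        term = rel[term]
        ancestors.append(term)
    return ancestors

print(superclasses("methylbenzyl alcohol", subclass_of))
# ['benzyl alcohols', 'aromatic alcohol', 'alcohol']
```

Queries like "everything under alcohol" then reduce to inverting this walk, which is exactly what makes P279 worth populating carefully.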
for the disease identifiers
though of course we would love to have some feedback about that!
-Ben
On Fri, Sep 12, 2014 at 1:28 AM, Joe Filceolaire filceola...@gmail.com
wrote:
Is this being imported into wikidata?
Joe
On 11 Sep 2014 23:16, Benjamin Good ben.mcgee.g...@gmail.com wrote:
FYI we are already
biology.
On 11 September 2014 17:06, Benjamin Good ben.mcgee.g...@gmail.com
wrote:
That's great, Andy! Hope to see you connecting with
https://www.wikidata.org/wiki/Wikidata:WikiProject_Molecular_biology -
specifically with regard to drug information.
cheers
-Ben
On Wed, Sep 10
How are related properties calculated?
Is the definition of a Class something that has a subclass relationship?
Or?
Very cool...
-Ben
On Mon, Sep 8, 2014 at 9:24 AM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
On 08.09.2014 14:53, Markus Krötzsch wrote:
...
://www.mediawiki.org/wiki/PWB
Both branches support Wikidata
Best
On 8/29/14, Benjamin Good ben.mcgee.g...@gmail.com wrote:
Which python framework should a new developer use to make a wikidata
editing bot?
thanks
-Ben
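Whichever framework is chosen, the edit ultimately posts claim JSON through the Wikidata API. A minimal sketch of the payload shape `wbeditentity` accepts, built with only the standard library (the property and item IDs here are just examples):

```python
import json

def claim_payload(prop, qid_value):
    """Build the JSON for one item-valued claim in the (simplified) shape
    the wbeditentity API module expects."""
    return {
        "claims": [{
            "mainsnak": {
                "snaktype": "value",
                "property": prop,
                "datavalue": {
                    "type": "wikibase-entityid",
                    "value": {
                        "entity-type": "item",
                        "numeric-id": int(qid_value.lstrip("Q")),
                    },
                },
            },
            "type": "statement",
            "rank": "normal",
        }]
    }

payload = claim_payload("P279", "Q7187")
print(json.dumps(payload, indent=2))
```

A framework like pywikibot wraps this up behind higher-level objects, but seeing the raw payload makes it easier to debug what a bot is actually sending.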
It does, but only on the very bottom with a see also.
Somehow I ended up on
https://github.com/jcreus/pywikidata
first, which is two years out of date and very similarly named.
-ben
On Fri, Aug 29, 2014 at 10:17 AM, Derric Atzrott
datzr...@alizeepathology.com wrote:
There is
Which python framework should a new developer use to make a wikidata
editing bot?
thanks
-Ben
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
Tom,
I totally agree with your sentiments here. Two questions.
Do you believe there is any valuable use for upper ontologies in the
wikidata system at all at this stage?
Could you describe how you see a bubble-up classification scheme working in
this context in a little detail? I can imagine
OK, thanks for your reply. We will watch for new developments and
incorporate them into our work as they are ready.
Keep up the good work on this important project!
-Ben
On Fri, Sep 13, 2013 at 1:20 PM, Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
On 13.09.2013 18:24, Benjamin wrote:
Thanks Lydia, here is a direct link to our wikidata project idea.
http://sulab.org/gsoc13/#idea1
There is clearly already a lot of interest in wikidata among the gsoc
student community. If students (or potential mentors) have additional
ideas for biology-related wikidata projects, please do get
Thanks for the feedback. I think we will push forward and work directly on
wikidata. This is a conversion from a database(s)-to-wikipedia pipeline to a
database(s)-to-wikidata-to-wikipedia one.
All of our code will be and is open source. We'd be happy to share and to
build on what other bot developers are doing. Is
I am considering the task of converting the templates from the gene
articles in Wikipedia (http://en.wikipedia.org/wiki/Portal:Gene_Wiki) to
use/create wikidata assertions. This involves an extensive update of the
template structure as well as the code for the bot that keeps them in sync
with
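A first step for such a conversion is pulling the parameters out of the existing infobox wikitext. A hedged sketch with a made-up template excerpt (the regex is a simplification; real template parsing has many edge cases that a proper parser should handle):

```python
import re

# Illustrative Gene Wiki-style infobox wikitext, not a real article excerpt.
wikitext = """{{GNF_Protein_box
| Name = Reelin
| Symbol = RELN
| EntrezGene = 5649
}}"""

def template_params(text):
    """Extract '| key = value' pairs from a template invocation."""
    return {
        m.group(1).strip(): m.group(2).strip()
        for m in re.finditer(r"^\|\s*([^=|]+?)\s*=\s*(.*)$", text, re.M)
    }

print(template_params(wikitext))
# {'Name': 'Reelin', 'Symbol': 'RELN', 'EntrezGene': '5649'}
```

Each extracted pair would then be mapped to a Wikidata property and pushed as a statement, after which the template can read the value back from wikidata instead of storing it locally.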