or RDF/Wikidata [thanks!]
Hi Joachim,
On 14-02-2018 7:32, Neubert, Joachim wrote:
> Hi Aidan, hi José,
>
> I'm a bit late - sorry!
Likewise! :)
> What came to my mind as a perhaps easy extension: Can or could the browser
> be seeded with an external property (for example P2611
@lists.wikimedia.org] On behalf of
> Andy Mabbett
> Sent: Wednesday, 21 February 2018 13:16
> To: Discussion list for the Wikidata project
> Subject: Re: [Wikidata] Metadata about Persistent Identifiers
>
> On 21 February 2018 at 12:03, Neubert, Joachim wrote:
>
>
So, we should be able to formally specify the "domain" of identifiers. Perhaps
that could be derived from the type constraints in linked properties, but I
think it would make sense as an explicit property on the identifier.
Some identifiers, e.g., GND, VIAF, require special attention because the
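The type constraints mentioned above can already be read off programmatically.
A minimal WDQS sketch, assuming the usual constraint modelling (P2302 =
property constraint, Q21503250 = type constraint, P2308 = class), here for
P227 (GND ID):

SELECT ?class ?classLabel WHERE {
  wd:P227 p:P2302 ?constraint .        # property constraint statements on P227
  ?constraint ps:P2302 wd:Q21503250 ;  # ... of type "type constraint"
              pq:P2308 ?class .        # the classes expected for carrier items
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}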
Hi Aidan, hi José,
I'm a bit late - sorry!
What came to my mind as a perhaps easy extension: Can or could the browser be
seeded with an external property (for example P2611, TED speaker ID)?
That would allow browsing some external dataset (e.g., all known TED speakers)
by the facets provided
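A seed set for such faceted browsing could come from a simple WDQS query; a
sketch, using P2611 (TED speaker ID) as above:

SELECT ?item ?itemLabel ?tedId WHERE {
  ?item wdt:P2611 ?tedId .             # all items carrying a TED speaker ID
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}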
Hi Sebastian,
This is huge! It will cover almost all currently existing German companies.
Many of these will have similar names, so preparing for disambiguation is a
concern.
A good way to approach this would be proposing a property for an external
identifier, loading the data into Mix'n'Match
Hi Andrew, all,
In my eyes, a large incentive for the maintainers of external databases - as I
am one for the ZBW German National Library of Economics - is the data they can
gain: not only in terms of property values and attached Wikipedia pages, but
also in terms of identifiers and links to other
aid in your question
> that "Of course we make sure that neither of the ids exist in WD so far", but
> how did you do that?
>
> -Osma
>
> Neubert, Joachim wrote on 21.08.2017 at 12:36:
> > Hi Osma,
> >
> > re. adding missing items, I've
Hi Osma,
re. adding missing items, I've had good experiences with creating input files
for Quickstatements2 (see
https://github.com/zbw/repec-ras/blob/master/bin/create_missing_wikidata.pl).
I've discussed how to best do this in the Wikidata Project Chat before, and
received valuable advice.
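For the check Osma asked about - making sure that none of the candidate ids
exist in Wikidata yet - a WDQS query along these lines might be used; a sketch
only, where P2428 (RePEc Short-ID) and the example ids are assumptions:

SELECT ?id ?item WHERE {
  VALUES ?id { "pab1" "pab2" }         # hypothetical candidate ids
  ?item wdt:P2428 ?id .                # ids already present in Wikidata
}

Every ?id returned already has an item; only the remaining candidates go into
the Quickstatements input file.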
list for the Wikidata project.; Neubert, Joachim
> Subject: Re: [Wikidata] SQID: Lookup by value of external identifier?
>
> Update:
>
> We now also have this natively in SQID using the syntax
> find=property:value in the view, e.g.
>
> https://tools.wmflabs.org/sqid/#/vi
Hi,
does anybody know if SQID can be invoked with a URL which includes an external
id property and its value - similar to
https://tools.wmflabs.org/wikidata-todo/resolver.php?prop=P227&value=120434059
? That could give end users a really nice display.
Cheers, Joachim
for the Wikidata project.
Subject: Re: [Wikidata] Multilingual and synonym support for M'n'm / was:
Mix'n'Match with existing (indirect) mappings
On Tue, Jun 13, 2017 at 6:25 PM Neubert, Joachim
<j.neub...@zbw.eu> wrote:
Hi Magnus, Osma,
I suppose the scenario Os
Hi Magnus, Osma,
I suppose the scenario Osma pointed out is quite common for knowledge
organization systems and in particular thesauri: matching could take advantage
of multilingual labels and also of synonyms, which are defined in the KOS.
For populating the STW Thesaurus for Economics ID (P39
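Such matching could, for instance, look up both labels and aliases on WDQS; a
sketch with hypothetical example terms:

SELECT DISTINCT ?item ?term WHERE {
  VALUES ?term { "Arbeitsmarkt"@de "labour market"@en }  # hypothetical terms
  ?item rdfs:label|skos:altLabel ?term .  # match preferred labels and synonyms
}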
Hi Osma,
sorry for jumping in late. I was at ELAG last week, talking about a very
similar topic (Wikidata as authority linking hub,
https://hackmd.io/p/S1YmXWC0e). Our use case was porting an existing mapping
between RePEc author IDs and GND IDs into Wikidata (and further on extending it
th
Hi Stas,
You are right, the header should be correct. There seems to be a long-standing
issue in Apache HttpClient
(https://issues.apache.org/jira/browse/HTTPCLIENT-923), dating back to an
ambiguity in an old Netscape cookie spec.
Sorry, I'll have to address this to the Apache guys. Cheers, Joachim
In a federated query on my own (Fuseki) endpoint, which reaches out to the
Wikidata endpoint with values already bound, it seems that I get an entry like
this in the Fuseki log for each bound value:
[2017-04-24 19:43:33] ResponseProcessCookies WARN Invalid cookie header:
"Set-Cookie: WMF-Last
For wikidata, there exists a resolver at
https://tools.wmflabs.org/wikidata-todo/resolver.php, which allows me to build
URLs such as
https://tools.wmflabs.org/wikidata-todo/resolver.php?quick=VIAF:12307054 , or
https://tools.wmflabs.org/wikidata-todo/resolver.php?prop=P227&value=120434059
in or
Or simply the item link "(Q42)" at the top of the page? On the Wikidata wiki
item page, the Wikipedia list is at the top right, so it's a matter of personal
preference. Editing is easily accessible from there.
Cheers, Joachim
> -Original Message-
> From: Wikidata [mailto:wikidata-boun...
for the Wikidata project.
> Subject: Re: [Wikidata] SQID: the new "Wikidata classes and properties
> browser"
>
> On 21.04.2016 10:22, Neubert, Joachim wrote:
> > Hi Markus,
> >
> > Great work!
> >
> > One short question: Is there a way to switch
l 2016 10:49
> To: Discussion list for the Wikidata project.
> Subject: Re: [Wikidata] SQID: the new "Wikidata classes and properties
> browser"
>
> On 21.04.2016 10:22, Neubert, Joachim wrote:
> > Hi Markus,
> >
> > Great work!
> >
> > One sho
Hi Markus,
Great work!
One short question: Is there a way to switch to labels and descriptions in
another language, by URL or otherwise?
Cheers, Joachim
> -Original Message-
> From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of
> Markus Kroetzsch
> Sent:
Hi Stas,
Thanks for your explanation! I'll perhaps have to do some tests on my own systems ...
Cheers, Joachim
-Original Message-
From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of Stas
Malyshev
Sent: Thursday, 18 February 2016 19:12
To: Discussion list f
> schema:inLanguage ?language .
> filter (contains(str(?sitelink), 'wikipedia'))
> filter (lang(?wdLabel) = ?language && ?language in ('en', 'de')) }
>
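For context, the quoted fragment could stem from a query shaped roughly like
this (a hedged reconstruction; the patterns before the fragment are
assumptions, run on WDQS where the prefixes are predefined):

SELECT ?item ?wdLabel ?sitelink WHERE {
  ?item wdt:P227 ?gnd ;                # assumed: items mapped to GND
        rdfs:label ?wdLabel .
  ?sitelink schema:about ?item ;
            schema:inLanguage ?language .
  FILTER (contains(str(?sitelink), 'wikipedia'))
  FILTER (lang(?wdLabel) = ?language && ?language in ('en', 'de'))
}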
-Original Message-
From: Ruben Verborgh [mailto:ruben.verbo...@ugent.be]
Sent: Thu
advanced SPARQL queries for LOD uses in an environment where you cannot
guarantee a high level of reliability.
Cheers, Joachim
-Original Message-
From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of
Neubert, Joachim
Sent: Tuesday, 16 February 2016
Well, another use case for nearly-immediate updates:
I'll give a presentation next week, in which I intend to demonstrate that I can
add a Wikidata value online, which is then immediately available to my
application - as well as to the whole rest of the world. (In Library Land,
that's a real b
On 16.02.2016 13:56, Neubert, Joachim wrote:
> Hi Markus,
>
> Great that you checked that out. I can confirm that the simplified query
> worked for me, too. It took 15.6s and revealed roughly the same number of
> results (323789).
>
> When I loaded the results into http://zb
Hi Markus,
Great that you checked that out. I can confirm that the simplified query worked
for me, too. It took 15.6s and revealed roughly the same number of results
(323789).
When I loaded the results into http://zbw.eu/beta/sparql/econ_pers/query, an
endpoint for "economics-related" persons,
It's great how this discussion evolves - thanks to everybody!
Technically, I completely agree that in practice it may prove impossible to
predict the load a query will produce. Relational databases have invested years
and years in query optimization (e.g., Oracle's cost-based optimizer, which
re
Hi Lydia,
I agree on using the right tool for the job. Yet, it isn't always obvious what
is right and what the limitations of a tool are.
For me, it's perfectly ok when a query runs for 20 minutes, if it spares me
some hours of setting up a specific environment for one specific dataset (and
Hi Markus,
thank you very much, your code will be extremely helpful for solving my current
need. And though I'm not a Java programmer, I may even be able to adjust it to
similar queries.
On the other hand, it's some steps away from the promises of Linked Data and
SPARQL endpoints. I extremely valu
Hi Stas,
Thanks for your answer. You asked how long the query runs: 8.21 sec (having
processed 6443 triples) in an example invocation. If roughly linear, that could
mean 800-1500 sec for the whole set. However, I would expect a considerably
shorter runtime: I routinely use queries of similar comple
I am trying to extract all mappings from Wikidata to the GND authority file,
along with the corresponding Wikipedia pages, expecting roughly 500,000 to 1m
triples as a result.
However, with various calls, I get far fewer triples (about 2,000 to 10,000).
The output seems to be truncated in the middle of a s
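A query matching that description might look like this sketch (not necessarily
the original; run against the Wikidata endpoint):

SELECT ?item ?gnd ?sitelink WHERE {
  ?item wdt:P227 ?gnd .                        # mappings to the GND authority file
  OPTIONAL { ?sitelink schema:about ?item . }  # the corresponding Wikipedia pages
}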