Hi Joachim,
On 14-02-2018 7:32, Neubert, Joachim wrote:
> Hi Aidan, hi José,
>
> I'm a bit late - sorry!
Likewise! :)
> What came to my mind as a perhaps easy extension: Can or could the browser
> be seeded with an external property (for example P2611, TED speaker ID)?
@lists.wikimedia.org] On behalf of
> Andy Mabbett
> Sent: Wednesday, 21 February 2018 13:16
> To: Discussion list for the Wikidata project
> Subject: Re: [Wikidata] Metadata about Persistent Identifiers
>
> On 21 February 2018 at 12:03, Neubert, Joachim <j.neub...@zbw.eu
So, we should be able to formally specify the "domain" of identifiers. Perhaps
that could be derived from the type constraints in linked properties, but I
think it would make sense as an explicit property on the identifier.
Some identifiers, e.g., GND, VIAF, require special attention because
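The type constraints mentioned here live in Wikidata's property-constraint statements. A minimal sketch of reading an identifier property's "domain" from them, assuming the standard constraint vocabulary (P2302 "property constraint", Q21503250 "type constraint", qualifier P2308 "class"):

```python
# Build a SPARQL query listing the classes an identifier property expects
# its subjects to belong to (its "domain"), read from the property's
# type constraint. Assumes the standard Wikidata constraint vocabulary:
# P2302 = property constraint, Q21503250 = type constraint, P2308 = class.

def domain_query(prop_id: str) -> str:
    return f"""SELECT ?class ?classLabel WHERE {{
  wd:{prop_id} p:P2302 ?stmt .
  ?stmt ps:P2302 wd:Q21503250 ;
        pq:P2308 ?class .
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en" . }}
}}"""

# Example: the expected classes for P227 (GND ID)
print(domain_query("P227"))
```

The result could then be compared against an explicit domain property, if one were introduced.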
Hi Aidan, hi José,
I'm a bit late - sorry!
What came to my mind as a perhaps easy extension: Can or could the browser be
seeded with an external property (for example P2611, TED speaker ID)?
That would allow browsing some external dataset (e.g., all known TED speakers)
by the facets
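Such a seed set can be pulled from the query service. A small sketch (illustrative only, using P2611 as in the example above) that builds the query for all items carrying the identifier:

```python
# Build a SPARQL query for all items carrying a given external-identifier
# property, e.g. P2611 (TED speaker ID) -- the seed set an
# identifier-driven browser could start from.

def seed_query(prop_id: str) -> str:
    return (
        "SELECT ?item ?itemLabel ?id WHERE {\n"
        f"  ?item wdt:{prop_id} ?id .\n"
        '  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }\n'
        "}"
    )

print(seed_query("P2611"))
```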
Hi Sebastian,
This is huge! It will cover almost all currently existing German companies.
Many of these will have similar names, so preparing for disambiguation is a
concern.
A good way for such an approach would be to propose a property for an external
identifier and load the data into
Hi Andrew, all,
In my eyes, a large incentive for the maintainers of external databases - as I
am one for the ZBW German National Library for Economics - is the data they can
gain: not only in terms of property values and attached Wikipedia pages, but
also in terms of identifiers and links to
that "Of course we make sure that neither of the ids exist in WD so far", but
> how did you do that?
>
> -Osma
>
> Neubert, Joachim wrote on 21.08.2017 at 12:36:
> > Hi Osma,
> >
> > re. adding missing items, I've had good experiences
Hi Osma,
re. adding missing items, I've had good experiences with creating input files
for Quickstatements2 (see
https://github.com/zbw/repec-ras/blob/master/bin/create_missing_wikidata.pl).
I've discussed how to best do this in the Wikidata Project Chat before, and
received valuable advice.
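QuickStatements2 input of the kind mentioned here is a list of tab-separated commands. A minimal sketch of generating such commands for a missing item (hypothetical example data; P2428 "RePEc Short-ID" is used purely for illustration):

```python
# Generate QuickStatements2 commands that create a new item with an
# English label, an English description, and one external-identifier
# statement. P2428 (RePEc Short-ID) and the values are illustrative only.

def qs_create(label: str, description: str, prop_id: str, ext_id: str) -> str:
    rows = [
        "CREATE",                          # create a new item
        f'LAST\tLen\t"{label}"',           # English label
        f'LAST\tDen\t"{description}"',     # English description
        f'LAST\t{prop_id}\t"{ext_id}"',    # external identifier statement
    ]
    return "\n".join(rows)

print(qs_create("Jane Doe", "economist", "P2428", "pdo1"))
```

A script like the create_missing_wikidata.pl linked above would emit one such block per missing item.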
Hi,
does anybody know if SQID can be invoked with a URL that includes an
external id property and its value - similar to
https://tools.wmflabs.org/wikidata-todo/resolver.php?prop=P227=120434059
? That could give end users a really nice display.
Cheers, Joachim
for the Wikidata project.
Subject: Re: [Wikidata] Multilingual and synonym support for M'n'm / was:
Mix'n'Match with existing (indirect) mappings
On Tue, Jun 13, 2017 at 6:25 PM Neubert, Joachim
<j.neub...@zbw.eu<mailto:j.neub...@zbw.eu>> wrote:
Hi Magnus, Osma,
I suppose the scenario Osma
Hi Magnus, Osma,
I suppose the scenario Osma pointed out is quite common for knowledge
organization systems and in particular thesauri: Matching could take advantage
of multilingual labels and also of synonyms, which are defined in the KOS.
For populating the STW Thesaurus for Economics ID
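A matching step that exploits multilingual labels and synonyms can be sketched roughly as follows (toy data; in practice the labels would come from the thesaurus's SKOS export, and the concept ID shown is made up):

```python
# Toy sketch: match incoming names against a concept's preferred and
# alternative labels across languages, as a SKOS-based thesaurus defines
# them, normalising case and surrounding whitespace.

def build_index(concepts):
    """concepts: {concept_id: {lang: [labels...]}} -> {normalised label: concept_id}"""
    index = {}
    for cid, by_lang in concepts.items():
        for labels in by_lang.values():
            for label in labels:
                index[label.strip().lower()] = cid
    return index

# Hypothetical concept with an English prefLabel and German synonyms
concepts = {
    "concept-1": {"en": ["Monetary policy"],
                  "de": ["Geldpolitik", "Geldmarktpolitik"]},
}
index = build_index(concepts)
print(index.get("geldpolitik"))  # matched via the German synonym
```

Real matching would of course need more normalisation and disambiguation, but the multilingual and synonym lookup itself stays this simple.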
Hi Osma,
sorry for jumping in late. I've been at ELAG last week, talking about a very
similar topic (Wikidata as authority linking hub,
https://hackmd.io/p/S1YmXWC0e). Our use case was porting an existing mapping
between RePEc author IDs and GND IDs into Wikidata (and further on extending it
Hi Stas,
You are right, the header should be correct. There seems to be a long-lasting
issue in Apache HttpClient
(https://issues.apache.org/jira/browse/HTTPCLIENT-923), dating back to an
ambiguity in an old Netscape cookie spec.
Sorry - I'll have to take this up with the Apache folks. Cheers,
In a federated query on my own (Fuseki) endpoint, which reaches out to the
Wikidata endpoint, with values already bound, it seems that I get for each
bound value an entry in the Fuseki log like this:
[2017-04-24 19:43:33] ResponseProcessCookies WARN Invalid cookie header:
"Set-Cookie:
For wikidata, there exists a resolver at
https://tools.wmflabs.org/wikidata-todo/resolver.php, which allows me to build
URLs such as
https://tools.wmflabs.org/wikidata-todo/resolver.php?quick=VIAF:12307054 , or
https://tools.wmflabs.org/wikidata-todo/resolver.php?prop=P227=120434059
in order
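URLs of the "quick" form shown above can be assembled programmatically; a small sketch (the parameter name `quick` is taken verbatim from the example, everything else is plain URL encoding):

```python
from urllib.parse import urlencode

# Build resolver URLs of the form ?quick=VIAF:12307054 as shown above.
# The "quick" parameter name comes from the example URL itself.

RESOLVER = "https://tools.wmflabs.org/wikidata-todo/resolver.php"

def resolver_url(scheme: str, value: str) -> str:
    return f"{RESOLVER}?{urlencode({'quick': f'{scheme}:{value}'})}"

print(resolver_url("VIAF", "12307054"))
# -> https://tools.wmflabs.org/wikidata-todo/resolver.php?quick=VIAF%3A12307054
```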
the Wikidata project.
> Subject: Re: [Wikidata] SQID: the new "Wikidata classes and properties
> browser"
>
> On 21.04.2016 10:22, Neubert, Joachim wrote:
> > Hi Markus,
> >
> > Great work!
> >
> > One short question: Is there a way to switch t
l 2016 10:49
> To: Discussion list for the Wikidata project.
> Subject: Re: [Wikidata] SQID: the new "Wikidata classes and properties
> browser"
>
> On 21.04.2016 10:22, Neubert, Joachim wrote:
> > Hi Markus,
> >
> > Great work!
> >
> > One
Hi Markus,
Great work!
One short question: Is there a way to switch to labels and descriptions in
another language, by URL or otherwise?
Cheers, Joachim
> -Original message-
> From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of
> Markus Kroetzsch
>
Hi Stas,
Thanks for your explanation! I'll perhaps have to do some tests on my own systems ...
Cheers, Joachim
-Original message-
From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of Stas
Malyshev
Sent: Thursday, 18 February 2016 19:12
To: Discussion list
guage .
>filter (contains(str(?sitelink), 'wikipedia'))
>filter (lang(?wdLabel) = ?language && ?language in ('en', 'de')) }
>
-Original message-
From: Ruben Verborgh [mailto:ruben.verbo...@ugent.be]
Sent: Thursday, 18 February 2016 14:02
To: wikidata@list
queries for LOD uses in an environment where you cannot
guarantee a high level of reliability.
Cheers, Joachim
-Original message-
From: Wikidata [mailto:wikidata-boun...@lists.wikimedia.org] On behalf of
Neubert, Joachim
Sent: Tuesday, 16 February 2016 15:48
On 16.02.2016 13:56, Neubert, Joachim wrote:
> Hi Markus,
>
> Great that you checked that out. I can confirm that the simplified query
> worked for me, too. It took 15.6s and revealed roughly the same number of
> results (323789).
>
> When I loaded the results into http://zbw.eu/
It's great how this discussion evolves - thanks to everybody!
Technically, I completely agree that in practice it may prove impossible to
predict the load a query will produce. Relational databases have invested years
and years in query optimization (e.g., Oracle's cost-based optimizer, which
Hi Markus,
thank you very much, your code will be extremely helpful for solving my current
need. And though not a Java programmer, I may even be able to adjust it to
similar queries.
On the other hand, it's some steps away from the promises of Linked Data and
SPARQL endpoints. I extremely
I try to extract all mappings from Wikidata to the GND authority file, along
with the corresponding Wikipedia pages, expecting roughly 500,000 to 1m triples
as a result.
However, with various calls, I get far fewer triples (about 2,000 to 10,000).
The output seems to be truncated in the middle of a
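The extraction described - all GND mappings plus the corresponding Wikipedia pages - can be expressed along these lines (a sketch; the query actually used in the mail is not part of the excerpt, and the sitelink pattern assumes the usual schema.org sitelink modelling):

```python
# Sketch of the extraction described: all Wikidata items with a GND ID
# (P227) plus their English Wikipedia sitelinks, via the schema.org
# sitelink modelling. Illustrative only; not the query from the mail.

GND_MAPPING_QUERY = """
SELECT ?item ?gnd ?sitelink WHERE {
  ?item wdt:P227 ?gnd .
  OPTIONAL {
    ?sitelink schema:about ?item ;
              schema:isPartOf <https://en.wikipedia.org/> .
  }
}
""".strip()

print(GND_MAPPING_QUERY)
```

A result set of this size is exactly where endpoint-side truncation, as reported above, tends to bite; paging with LIMIT/OFFSET or a dump-based extraction would sidestep it.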