Of course. You can cache the individual entities somewhere inside the
server system, where they can be stitched together very quickly, or you
can cache them on the client.
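
A minimal sketch of the client-side option, in TypeScript; the cache
key scheme is an assumption, and the URL is Wikidata's
Special:EntityData form:

  // Cache entity documents in localStorage so repeat lookups skip the
  // network entirely.
  async function getEntity(id: string): Promise<unknown> {
    const key = "entity:" + id;  // assumed cache key scheme
    const hit = localStorage.getItem(key);
    if (hit !== null) {
      return JSON.parse(hit);    // served from the client-side cache
    }
    const resp = await fetch(
      "https://www.wikidata.org/wiki/Special:EntityData/" + id + ".json");
    const data = await resp.json();
    localStorage.setItem(key, JSON.stringify(data));
    return data;
  }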

On Wed, Apr 2, 2014 at 12:04 PM, Magnus Manske
<magnusman...@googlemail.com> wrote:
> Could one of the front-ends (squid?) provide a simple batch service by
> concatenating the /entity/ JSON for the requested items? That could
> effectively be served from the cache and still deliver batches.
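>
> (A hypothetical sketch of what such a batch front-end could return, in
> TypeScript; the wrapper shape is invented for illustration. Note that
> raw concatenation would not be valid JSON, so the cached per-entity
> documents would need to be wrapped in a single object keyed by ID:)
>
>   // Hypothetical edge-side batcher: fetch each cached /entity/
>   // document and wrap the results in one JSON object keyed by ID.
>   async function batchEntities(ids: string[]): Promise<string> {
>     const entries = await Promise.all(ids.map(async (id) => {
>       const resp = await fetch(
>         "https://www.wikidata.org/wiki/Special:EntityData/" + id + ".json");
>       return [id, await resp.json()] as const;
>     }));
>     return JSON.stringify(Object.fromEntries(entries));
>   }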
>
>
> On Wed, Apr 2, 2014 at 4:44 PM, Paul Houle <ontolo...@gmail.com> wrote:
>>
>> I've been thinking about this kind of problem in my own systems. Name
>> and link generation from entities is a cross-cutting concern that's
>> best separated from the other queries in your application. With
>> SPARQL and multiple languages, each with multiple rdf:label values,
>> it is awkward to write queries that bring labels back with
>> identifiers, particularly if you want to apply rules that amount to
>> "if a ?lang label doesn't exist for a topic, show a label from a
>> language that uses the same alphabet as ?lang in preference to any
>> others." Another issue is that the design and business people might
>> have some desire for certain kinds of labels, and it's good to be
>> able to change that without changing your queries.
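>>
>> (A sketch of the kind of query this leads to, with the SPARQL
>> embedded in TypeScript; the English fallback is a crude stand-in for
>> the "same alphabet" rule, which would need more machinery:)
>>
>>   // Prefer the ?lang label; otherwise fall back to another language.
>>   const lang = "de";  // assumed target language
>>   const query = `
>>     PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
>>     SELECT ?item (COALESCE(?preferred, ?fallback) AS ?label) WHERE {
>>       VALUES ?item { <http://www.wikidata.org/entity/Q42> }
>>       OPTIONAL { ?item rdfs:label ?preferred .
>>                  FILTER(LANG(?preferred) = "${lang}") }
>>       OPTIONAL { ?item rdfs:label ?fallback .
>>                  FILTER(LANG(?fallback) = "en") }
>>     }`;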
>>
>> Anyway, a lot of people live on the other end of internet connections
>> with 50ms, 2000ms, or more of latency to the network core, and
>> sometimes the network has a really bad day or even a bad few seconds.
>> For every hundred or so TCP packets you send across the modern
>> internet, you lose one. The fewer packets you send per interaction,
>> the less likely the user is to experience this.
>>
>> If 20 names are looked up sequentially and somebody is on 3G cellular
>> with 300ms latency, the user waits six seconds (20 x 300ms) for this
>> data alone, on top of the actual time spent moving the data and
>> waiting for the server to get out of its own way. Since this is
>> jQuery, it's very likely the page has other JavaScript geegaws that
>> work OK for the developer who lives in Kansas City, but ordinary
>> folks in Peoria might not have the patience to wait until your page
>> is fully loaded.
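>>
>> (A sketch of the difference, in TypeScript; lookupName and
>> lookupNames stand in for whatever the real endpoints are:)
>>
>>   declare function lookupName(id: string): Promise<string>;    // hypothetical
>>   declare function lookupNames(ids: string[]): Promise<string[]>;
>>
>>   // Sequential: each await costs a full round trip, so 20 names at
>>   // 300ms latency is roughly 6 seconds before everything renders.
>>   async function sequential(ids: string[]): Promise<string[]> {
>>     const names: string[] = [];
>>     for (const id of ids) {
>>       names.push(await lookupName(id));
>>     }
>>     return names;
>>   }
>>
>>   // Batched: one round trip, so latency is paid once, not per name.
>>   async function batched(ids: string[]): Promise<string[]> {
>>     return lookupNames(ids);
>>   }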
>>
>>
>> Batch queries give users performance they can feel, even if they
>> demand more of your server. In my system I am looking at having a
>> "name lookup" server that is stupidly simple and looks up precomputed
>> names in a key-value store, everything really stripped down and
>> efficient with no factors of two left on the floor. I'm looking at
>> putting a pretty ordinary servlet that writes HTML in front of it,
>> but a key thing is that the front tier of the back end runs queries
>> in parallel to fight latency, which is the scourge of our times.
>> (It's the difference between GitHub and Atlassian.)
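>>
>> (A sketch of that front tier, in TypeScript; kvGet stands in for a
>> hypothetical key-value store client:)
>>
>>   declare function kvGet(key: string): Promise<string | null>;  // hypothetical
>>
>>   // Fire all lookups in parallel; the total wait is roughly one
>>   // store round trip rather than one per name.
>>   async function lookupAll(ids: string[]): Promise<Map<string, string>> {
>>     const rows = await Promise.all(
>>       ids.map(async (id) => [id, (await kvGet("name:" + id)) ?? id] as const));
>>     return new Map(rows);
>>   }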
>>
>> On Wed, Apr 2, 2014 at 4:36 AM, Daniel Kinzler
>> <daniel.kinz...@wikimedia.de> wrote:
>> > Hey Denny! Awesome tool!
>> >
>> > It's so awesome that we are already wondering how to handle the
>> > load this may generate.
>> >
>> > As far as I can see, qLabel uses the wbgetentities API module. This
>> > has the advantage of allowing the labels for all relevant entities
>> > to be fetched with a single query, but the disadvantage of not
>> > being cacheable.
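>> >
>> > For reference, such a batched label request looks like this; one
>> > request covers many entities, but api.php responses are not served
>> > from the web caches:
>> >
>> >   https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42|Q64&props=labels&languages=en&format=json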
>> >
>> > If qlabel used the .../entity/Q12345.json URLs to get entity data, that
>> > would be
>> > covered by the web caches (squid/varnish). But it would mean one request
>> > per
>> > entity, and would also return the full entity data, not just the  labels
>> > in one
>> > language. So, a lot more traffic.
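>> >
>> > (A sketch of that per-entity approach, in TypeScript; the
>> > Special:EntityData URL is the form those /entity/ URIs resolve to.
>> > The response is the full document, from which the client keeps only
>> > one label:)
>> >
>> >   async function labelOf(id: string, lang: string): Promise<string | undefined> {
>> >     const resp = await fetch(
>> >       "https://www.wikidata.org/wiki/Special:EntityData/" + id + ".json");
>> >     const doc = await resp.json();
>> >     return doc.entities[id]?.labels?.[lang]?.value;  // the rest is discarded
>> >   }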
>> >
>> > If this becomes big, we should probably offer a dedicated web
>> > interface for fetching the labels of many entities in a given
>> > language, using nice, cacheable URLs. This would mean a new cache
>> > entry per language per combination of entities; potentially, a
>> > large number. However, the combination of entities requested is
>> > determined by the page being localized, that is, all visitors of a
>> > given page in a given language would hit the same cache entry. That
>> > seems workable.
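>> >
>> > (A hypothetical URL scheme for such an interface; the path is
>> > invented for illustration. Sorting the IDs client-side would make
>> > all visitors of a page hit the same cache entry:)
>> >
>> >   // Normalize the entity list so equivalent requests share one
>> >   // cache entry.
>> >   function labelUrl(lang: string, ids: string[]): string {
>> >     return "/labels/" + lang + "/" + [...ids].sort().join(",");  // hypothetical path
>> >   }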
>> >
>> > Anyway, we are not quite there yet, just something to ponder :)
>> >
>> > -- daniel
>> >
>> >
>> > On 01.04.2014 at 20:14, Denny Vrandečić wrote:
>> >> I just published qLabel, an Open Source jQuery plugin that allows
>> >> you to annotate HTML elements with Wikidata Q-IDs (or Freebase
>> >> IDs, or, technically, with any other Semantic Web / Linked Data
>> >> URI), and then grabs the labels and displays them in the selected
>> >> language of the user.
>> >>
>> >> Put differently, it allows for the easy creation of multilingual
>> >> structured websites. And it is one more way in which Wikidata data
>> >> can be used, by anyone.
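>> >>
>> >> (A usage sketch: the its-ta-ident-ref attribute follows the ITS
>> >> 2.0 convention qLabel builds on, but the initialization call below
>> >> is hypothetical; see the plugin's documentation for the real API:)
>> >>
>> >>   <span its-ta-ident-ref="http://www.wikidata.org/entity/Q42">
>> >>     Douglas Adams</span>
>> >>   <script>
>> >>     // Hypothetical call: swap annotated text for labels in the
>> >>     // user's selected language.
>> >>     $.qLabel.switchLanguage('de');
>> >>   </script>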
>> >>
>> >> Contributors and users are more than welcome!
>> >>
>> >>
>> >> <http://google-opensource.blogspot.com/2014/04/qlabel-multilingual-content-without.html>
>> >>
>> >
>> >
>> > --
>> > Daniel Kinzler
>> > Senior Software Developer
>> >
>> > Wikimedia Deutschland
>> > Gesellschaft zur Förderung Freien Wissens e.V.
>> >
>>
>>
>>
>> --
>> Paul Houle
>> Expert on Freebase, DBpedia, Hadoop and RDF
>> (607) 539 6254    paul.houle on Skype   ontolo...@gmail.com
>>
>



-- 
Paul Houle
Expert on Freebase, DBpedia, Hadoop and RDF
(607) 539 6254    paul.houle on Skype   ontolo...@gmail.com

_______________________________________________
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
