On 07/03/2013 23:34, Tobie Langel wrote:
> In which case, isn't part of the solution to paginate your data, and
> parse those pages separately?
Assuming you can modify the backend. Also, data doesn't necessarily have
to get all that bulky before you notice it on a somewhat sluggish device.
> Even if an async API for JSON existed, wouldn't the perf bottleneck
> then simply fall on whatever processing needs to be done afterwards?
But for that part you're in control of whether your processing is
blocking or not.
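For instance, something along these lines keeps the post-parse work off
the critical path (a rough sketch; the chunk size and the data.items
shape are arbitrary):

    // Sketch: handle a big parsed array in slices, yielding back to the
    // event loop between batches so the main thread stays responsive.
    function processInChunks(items, handleItem, chunkSize) {
        var i = 0;
        (function next() {
            var end = Math.min(i + chunkSize, items.length);
            for (; i < end; i++) handleItem(items[i]);
            if (i < items.length) setTimeout(next, 0);
        })();
    }

    var data = JSON.parse(src);   // the parse itself is still blocking
    processInChunks(data.items, doSomething, 500);

A Worker would do just as well; the point is simply that the processing
side is already under your control.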
> Wouldn't some form of event-based API be more indicated? E.g.:
> var parser = JSON.parser();
> parser.parse(src);
> parser.onparse = function(e) { doSomething(e.data); };
I'm not sure how that snippet would be different from a single callback API.
There could possibly be value in an event-based API if you could set it
up with a filter, e.g. JSON.filtered("$.*").then(function (item) {});
which would call you for every item in the root object. Getting an event
for every information item that the parser processes would likely flood
you with events.
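Spelled out a little more (all of this is invented, including the
JSONPath-ish selector; I'm passing src in to make the sketch
self-contained):

    // Hypothetical: parse src asynchronously, but only surface the
    // items matching the filter, one call per top-level item.
    JSON.filtered(src, "$.*").then(function (item) {
        doSomething(item);   // not one event per token, just per item
    });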
Yet another option is a pull API. There's a lot of experience from the
XML planet with APIs designed for specific performance characteristics.
They would obviously be a lot simpler for JSON; I wonder how well that
experience translates.
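For the record, a pull-style reader could look something like this
(entirely made up, loosely shaped after XML pull parsers such as StAX):

    // Hypothetical: the consumer asks for the next event when it's
    // ready for it, rather than being pushed everything as it's parsed.
    var reader = JSON.reader(src);
    var ev;
    while ((ev = reader.next())) {
        if (ev.type === "member" && ev.name === "items") {
            doSomething(reader.readValue());   // pull the whole value at once
        }
    }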
--
Robin Berjon - http://berjon.com/ - @robinberjon