On 07/03/2013 23:18, David Rajchenbach-Teller wrote:
(Note: New on this list, please be gentle if I'm debating an
inappropriate issue in an inappropriate place.)

Actually, communicating large JSON objects between threads may cause
performance issues. I do not have a simple way to measure reception
speed (the case relevant to an asynchronous JSON.parse), but it is easy
to measure the main-thread blocking caused by sending (the case
relevant to an asynchronous JSON.stringify).
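
For illustration, a minimal sketch of that kind of measurement might
look like this (the worker script name and the object size are made up
for the example, they are not taken from the actual test):

    var worker = new Worker("echo-worker.js"); // any worker script will do here
    var big = [];
    for (var i = 0; i < 1000000; i++) {
      big.push({ index: i, payload: "some reasonably long string " + i });
    }
    var start = Date.now();
    worker.postMessage(big);           // structured clone runs synchronously
    var blocked = Date.now() - start;  // time the main thread spent sending
    console.log("main thread blocked for " + blocked + " ms");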

I have put together a small test here - warning, this may kill your browser:
    http://yoric.github.com/Bugzilla-832664/

While there are considerable fluctuations, even within a single
browser, on my system I see janks lasting from 300 ms to 3 s.

Consequently, I am convinced that we need asynchronous variants of
JSON.{parse, stringify}.
I don't think this is necessary, as all the processing can be done in a worker (it can even start in the worker). But if an async solution were to happen, I think it should go all the way, that is, changing JSON.parse so that it accepts not only a string but a stream of data. Currently, one has to wait until the entire string has been received before being able to parse it. That is a waste of time for big data, which is your use case (especially when the data is coming over the network), and probably a misuse of memory as well. With a stream, temporary strings can be thrown away as soon as they have been consumed.
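
To make the first point concrete, here is a rough sketch of the worker
approach, assuming the data is fetched and parsed entirely inside the
worker (the file names, URL and message shape are only illustrative):

    // parse-worker.js
    onmessage = function (event) {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", event.data.url);
      xhr.onload = function () {
        // The heavy parse happens off the main thread.
        var parsed = JSON.parse(xhr.responseText);
        // Post back only the small result the page needs, so the full
        // object never pays the structured-clone cost.
        postMessage({ itemCount: parsed.length }); // assumes a large array
      };
      xhr.send();
    };

    // main.js
    var worker = new Worker("parse-worker.js");
    worker.onmessage = function (event) {
      console.log("items:", event.data.itemCount);
    };
    worker.postMessage({ url: "big-data.json" });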

David
