I fully agree that any asynchronous JSON [de]serialization should be
stream-based, not string-based.
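To make the idea concrete, here is a minimal sketch of what chunk-by-chunk parsing buys you. The name `makeStreamingParser` is made up, and for simplicity it only handles newline-delimited JSON records rather than arbitrary JSON, but it shows how consumed text can be discarded as data arrives instead of accumulating the whole string:

```javascript
// Sketch only: an incremental parser for newline-delimited JSON
// records. Each complete record is parsed and handed to the
// callback as soon as it arrives; the consumed text is discarded,
// so memory use stays proportional to one record, not the whole
// input.
function makeStreamingParser(onValue) {
  let buffer = "";
  return {
    write(chunk) {
      buffer += chunk;
      let idx;
      while ((idx = buffer.indexOf("\n")) !== -1) {
        const line = buffer.slice(0, idx);
        buffer = buffer.slice(idx + 1); // throw away consumed text
        if (line.trim()) onValue(JSON.parse(line));
      }
    },
    end() {
      if (buffer.trim()) onValue(JSON.parse(buffer));
      buffer = "";
    }
  };
}
```

A real stream-accepting `JSON.parse` would have to tokenize arbitrary JSON incrementally, but the memory argument is the same: intermediate strings can be released as soon as they have been consumed.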

Now, if the main heavy-duty work is manipulating the large object
itself, this can certainly be kept on a worker thread. I suspect,
however, that this is not always feasible.

Consider, for instance, a browser implemented as a web application,
FirefoxOS-style. The data that needs to be collected to save its current
state is held in the DOM. For performance and consistency, it is not
practical to keep the DOM synchronized at all times with a worker
thread. Consequently, data needs to be collected on the main thread and
then sent to a worker thread.

Similarly, for a 3D game, until workers can perform some off-screen
WebGL, I suspect that a considerable amount of complex game data needs
to reside on the main thread, because sending the appropriate subsets
from a worker to the main thread on demand might not be responsive
enough to sustain 60 fps. I have no experience with such complex games,
though, so my intuition could be wrong.

Best regards,
 David


On 3/8/13 11:53 AM, David Bruant wrote:
> I don't think this is necessary, as all the processing can be done in
> a worker (starting in the worker, even).
> But if an async solution were to happen, I think it should go all the
> way, that is, changing the JSON.parse method so that it accepts not
> only a string, but a stream of data.
> Currently, one has to wait for the entire string to arrive before
> being able to parse it. That's a waste of time for big data, which is
> your use case (especially if waiting for data to come from the
> network), and probably a waste of memory. With a stream, temporary
> strings can be thrown away.
> 
> David


-- 
David Rajchenbach-Teller, PhD
 Performance Team, Mozilla