On 07/01/2014 12:04 PM, Fitzgerald, Nick wrote:
On 7/1/14, 10:52 AM, Jason Orendorff wrote:

Events are *not* delivered synchronously. As JS code executes, a log is
written. Occasionally, the log is then parsed and records are delivered to
devtools code via the enterFunction and leaveFunction methods above. (This
batching should improve performance, by minimizing C++-to-JS-to-C++ and
cross-compartment calls.)
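
A rough sketch of what this batched delivery could look like from the
devtools side (the onTraceLog attachment point and the record fields are
assumptions, not the proposed API):

var handler = {
  enterFunction: function (record) { /* record the call */ },
  leaveFunction: function (record) { /* record the return */ }
};

dbg.onTraceLog = function (records) {  // hypothetical attachment point
  records.forEach(function (record) {
    handler[record.kind](record);      // dispatch on the record's kind
  });
};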

Because all the devtools are designed to work remotely from day one*, we
will be sending these logs over the Remote Debugging Protocol from the
debuggee device (the Firefox OS / Fennec phone, etc.) to the debugger
device (desktop Firefox), where the data will be processed in a worker and
eventually displayed to the user.

It would be a shame if we did this:

1. Collect log in SpiderMonkey
2. Parse log into JS objects
3. Deliver to hooks devtools set
4. Re-serialize JS objects into a log for transport
5. Send log over RDP
6. Parse log into JS objects again

Whereas, if the log were exposed to devtools as some kind of blob / typed
array that we can send across the RDP as binary data, we could do this:

1. Collect the log in SpiderMonkey on the debuggee device
2. Deliver the log blob to a hook the devtools set
3. Send log blob over RDP
4. On the debugger device, devtools code asks Debugger to parse the blob

This way we aren't repeatedly parsing and serializing (to potentially
different formats!) for no good reason.

One issue with the blob logic is that the point of running an analysis is
to be able to inspect objects. One idea was to proxy objects, so that we
can still provide a boxing mechanism, which makes sense for a synchronous
analysis such as the one in Jalangi.
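
For illustration, a minimal sketch of such a boxing mechanism built on a
Proxy (the analysis object and its wiring are assumptions; getField /
putField are named after Jalangi's hooks):

function box(obj, analysis) {
  return new Proxy(obj, {
    get: function (target, prop) {
      analysis.getField(target, prop);         // analysis sees every read
      return target[prop];
    },
    set: function (target, prop, value) {
      analysis.putField(target, prop, value);  // ...and every write
      target[prop] = value;
      return true;
    }
  });
}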

On the other hand, now that I think more about it, I wonder to what extent
an asynchronous view of objects would be more helpful than a unique
identifier per object.

In that case, if you want to find the value corresponding to a given
identifier, you also have to watch for object mutations, since the stream
of object allocations/mutations/deallocations gives you the list of
modifications made to every object.
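
For instance, a consumer could rebuild such an asynchronous view by
replaying the stream into a shadow table keyed by identifier (the event
field names below are assumptions):

var shadows = new Map();
var objectWatcher = {
  newObject: function (event) {
    shadows.set(event.objectId, {});           // allocation
  },
  setObject: function (event) {
    var shadow = shadows.get(event.objectId);
    if (shadow)
      shadow[event.property] = event.value;    // replay the mutation
  },
  freeObject: function (event) {
    shadows.delete(event.objectId);            // deallocation
  }
};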

If we go through the RDP, then I guess we want the asynchronous tracing to
just provide an ArrayBuffer of its log, based on a list of callback names
(not functions), so that we can easily write the server side of the
pipeline.

Then, I guess we want a second function into which we feed the ArrayBuffer,
and which calls all the callbacks (provided as the list given to the first
function). This would be the client side of the Debugger.

// producer (debuggee side)
var watched = ["enterFunction", "leaveFunction", "setObject", "newObject", "freeObject"];
dbg.addLogListener(watched, function (stream) {
  // ... send the stream over the network, or hand it over locally ...
});

// consumer (debugger side)
var watcher = {
  enterFunction: function (event) { /* ... */ },
  leaveFunction: function (event) { /* ... */ },
  setObject: function (event) { /* ... */ },
  newObject: function (event) { /* ... */ },
  freeObject: function (event) { /* ... */ }
};

function onStreamReceived(stream) {
  // Parse the ArrayBuffer and call the matching watcher callbacks.
  dbg.dispatchLogEvents(stream, watcher);
}

--
Nicolas B. Pierron
