Thanks for the history lesson :) I'm not underestimating
backwards-compatibility, but since so much of old web content is brittle
in so many ways, it's hard to know in advance exactly which problems are
hiding under that back-compat umbrella.
One more question: perhaps I'm misunderstanding what you mean by "throw
the document away", but suppose it means the document gets discarded and
garbage collected, so the DOM for that page no longer exists. If a page
mutated a hibernated document with a *single* DOM call, then no
exception would be thrown, but the page would vanish and perhaps
eventually reload. But if the page mutated a hibernated document with
*two* DOM calls, wouldn't the second one fail anyway, because the
document had been thrown away? So we trade one exception for another...
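To make the two-call scenario concrete, here's a toy sketch in plain JavaScript -- not any engine's real API, just a Proxy modeling a hibernated document whose first mutation silently discards it, so the second mutation fails anyway:

```javascript
// Hypothetical model (not real browser behavior): the first DOM call on a
// hibernated document succeeds but discards the document; any later call
// then throws because the document no longer exists.
function hibernate(doc) {
  let discarded = false;
  return new Proxy(doc, {
    get(target, prop) {
      if (discarded) {
        throw new Error("document was thrown away");
      }
      const value = target[prop];
      if (typeof value !== "function") return value;
      return (...args) => {
        // First call: no exception, but the document is discarded.
        discarded = true;
        return value.apply(target, args);
      };
    },
  });
}

// Minimal stand-in for a real DOM document.
const doc = hibernate({
  appendChild(node) { return node; },
});

doc.appendChild("first");      // succeeds silently, discards the doc
try {
  doc.appendChild("second");   // fails: the document is gone
} catch (e) {
  console.log(e.message);      // prints "document was thrown away"
}
```

The point is just that discarding on first touch doesn't really avoid exceptions; it moves them to the next DOM call.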
On 8/24/2010 11:47 PM, Boris Zbarsky wrote:
> In Gecko's case, because such documents should not trigger network
> requests, run any script (including mutation event handlers), etc.
> The fact that the cache even exists should be as not-exposed to
> everyone as possible.
> ...
> This would almost certainly break pages, which get no exceptions if
> they try to do this right now. Worse, it would break them in a
> timing-dependent way.
> ...
> Don't underestimate "historical reasons". They're also known as
> "compatibility with deployed content". ;)
> Note that in Gecko documents with active network requests never go
> into hibernation right now, precisely because we don't want to have to
> buffer potentially-arbitrary amounts of data (see JPEG push) for
> arbitrary lengths of time. We still wouldn't want to do so in this
> case...
That's a tunable policy issue, right? I.e., buffer X Kb of data, then
terminate active network requests.
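A rough sketch of what I mean by tunable, with a made-up cap and request shape (nothing here reflects Gecko's actual internals): buffer incoming data for a hibernated page only up to a limit, and terminate the connection once the limit would be exceeded.

```javascript
// Hypothetical policy sketch: hibernate a page's pending request only
// while its buffered data stays under a cap; past the cap, terminate
// rather than buffer arbitrary amounts (the JPEG-push concern).
const BUFFER_CAP_BYTES = 64 * 1024; // the tunable "X Kb"

function makeHibernatedRequest() {
  const chunks = [];
  let buffered = 0;
  let terminated = false;
  return {
    push(chunk) {
      if (terminated) return false;
      if (buffered + chunk.length > BUFFER_CAP_BYTES) {
        terminated = true; // stop buffering; drop the connection
        return false;
      }
      chunks.push(chunk);
      buffered += chunk.length;
      return true;
    },
    get terminated() { return terminated; },
    get buffered() { return buffered; },
  };
}

const req = makeHibernatedRequest();
console.log(req.push("x".repeat(32 * 1024))); // true: under the cap
console.log(req.push("x".repeat(48 * 1024))); // false: would exceed cap
```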
> The problem with that approach is that when the user then navigates
> back to the page it will be broken (e.g. if they left it early in the
> load its DOM may not even all be there).
Hmm, that is tricky, and a timing bug indeed. That points to another
potential question -- if the user navigates away from a page with an
active XHR/JPEG-push/whatever connection, will Gecko then force that
page to stay alive and continue running script? -- but that leads us
farther away from the original question of this thread, regarding
detaching and reattaching iframes with documents. My only two cents
here: these scenarios (abrupt iframe detachment, abrupt history
navigation, where "abrupt" means "while event handlers might be queued
or network activity might still be running") are almost, but currently
not quite, behaviorally equivalent to ones with bursty or slow network
connections; maybe that gives us some wiggle room to simplify the spec
and/or implementations. But like others on this thread, I've gone as
far as I understand; I defer to the experts on what the correct
behavior here ought to be.
~ben