On Fri, Aug 7, 2009 at 11:39 AM, Arthur Kalmenson <arthur.k...@gmail.com> wrote:

> I don't think firebug counts the initial request to fetch the host
> page, so two requests. One for the nocache.js and another for the
> cachable HTML. With the inlining of the nocache.js file, you could get
> it down to 0 requests if the retrieved page is cached forever.
>

Well, since the host page can't be strongly named (if it were, the page
linking to it would have to be weakly cached so the link could change), it
can't be cached forever, but it can be cached based on how often you expect
to need to update it.  Note that you need to keep the old compiled script
around for as long as the host page might be cached (along with any server
resources it might need, which makes server-side RPC changes difficult),
since caches holding the old host page will still be referring to the old
compiled output.
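
For example (just a sketch -- the filter name and the one-day figure are made
up), if the host page is served through a servlet container you might bound
its lifetime like this, and then keep each old compile deployed for at least
a day after rolling out a new one:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class HostPageCacheFilter implements Filter {
  public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
      throws IOException, ServletException {
    // Cache the host page for a day; any compiled output (and server-side RPC
    // it depends on) referenced by the page must stay available at least that
    // long after a new compile, since cached copies still point at it.
    ((HttpServletResponse) resp).setHeader("Cache-Control", "public, max-age=86400");
    chain.doFilter(req, resp);
  }

  public void init(FilterConfig config) throws ServletException { }
  public void destroy() { }
}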


> Are you saying to inline the generated JS in the host page too? How
> could you do that? Don't you need the selector script to pick the
> correct compiled version? Maybe I'm just not understanding what you
> mean.
>

See the example below for how it currently works and how it would change
using server-side selection.

The problem is that a browser will fetch foo.html and get the compiled
output appropriate for its browser/locale/etc.  An intervening cache, say at
an ISP, will then return that foo.html response to a different user with
different parameters.  As I mentioned in the other message, you have to
either make the host page uncacheable (which is only feasible with small
compiled scripts) or rely on caches honoring Vary headers, which is
problematic on the Internet in general, though it might be feasible if you
control the network your users use to access the app.

Currently you have a host page, say Showcase.html, that is cached based on
the frequency you expect to update it and contains a reference to the
selection script.
...
<script language='javascript' src='showcase/showcase.nocache.js'></script>

That then fetches the selection script, which is cached based on the
frequency you expect to update the app but with must-revalidate set.  The
selection script executes in the browser and chooses a strong-named JS file
for the compiled output, say XYZ.cache.html.  That file is then cached for 1
year ("forever" according to the HTTP spec).
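
If those files are served through a servlet container, the same sort of
filter (again just a sketch; the class name and the five-minute figure are
made up) is one way to get those two cache policies:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GwtCacheHeaderFilter implements Filter {
  public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
      throws IOException, ServletException {
    String uri = ((HttpServletRequest) req).getRequestURI();
    HttpServletResponse httpResp = (HttpServletResponse) resp;
    if (uri.endsWith(".nocache.js")) {
      // Selection script: short lifetime (five minutes here) plus
      // must-revalidate, so a new compile is picked up quickly.
      httpResp.setHeader("Cache-Control", "max-age=300, must-revalidate");
    } else if (uri.endsWith(".cache.html") || uri.endsWith(".cache.js")) {
      // Strong-named compiled output never changes, so cache it for a year.
      httpResp.setHeader("Cache-Control", "public, max-age=31536000");
    }
    chain.doFilter(req, resp);
  }

  public void init(FilterConfig config) throws ServletException { }
  public void destroy() { }
}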

In the new scheme, you have a host page that is generated by the server from
a template.  That page is marked cacheable for the shorter of the interval at
which you expect to update the host page template and the interval at which
you expect to recompile your app, with must-revalidate set.  It directly
includes a script tag referencing XYZ.cache.html, where the server made the
deferred binding decision based on the information available in the request.
In the typical case, it will use the User-Agent header, and perhaps the
Accept-Language header and the locale query parameter in the URL.
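
Roughly, that host-page servlet could look like the following.  This is only
a sketch: the class name, the permutation choice, and the strong names are
all made up, and a real implementation would read the permutation mapping the
compiler emits rather than hard-coding anything.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ShowcaseHostPageServlet extends HttpServlet {
  protected void doGet(HttpServletRequest req, HttpServletResponse resp)
      throws ServletException, IOException {
    // Inputs for the server-side deferred binding decision.
    String userAgent = req.getHeader("User-Agent");
    String locale = req.getParameter("locale");
    if (locale == null) {
      locale = req.getHeader("Accept-Language");
    }

    // Hypothetical permutation choice; the strong names are placeholders.
    String strongName = (userAgent != null && userAgent.contains("MSIE"))
        ? "IE_PERMUTATION_STRONG_NAME" : "OTHER_PERMUTATION_STRONG_NAME";

    // Fill in the host page template with a direct reference to the chosen
    // compiled output, so no client-side selection script is needed.
    resp.setContentType("text/html");
    resp.getWriter().write(
        "<html><head>"
        + "<script language='javascript' src='showcase/" + strongName
        + ".cache.html'></script>"
        + "</head><body>...</body></html>");
  }
}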

Since the results of the request for Showcase.html now depend on those
parameters of the HTTP request, you have to make sure that a cache doesn't
return a copy previously served to someone using Firefox to the next person
who asks for it using Safari.  One way to do that is to make sure it isn't
cached at all (or at least is cached with must-revalidate), and the other is
to rely on the caches to correctly interpret Vary headers, so they know that
the response you returned may be different if those headers are different.
As a side note, a cache doesn't know that you don't care about the difference
between IE 7.0 and IE 7.0.1 (making these up so you get the point), or even
the screen resolution that some user agents send in the request, so even
caches that honor the Vary header will be less effective.  There are other
ways around this with newer standards, but even fewer of the caches out there
make use of those features, so it will be a while before they are actually
useful.
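
Concretely, those two options on the generated Showcase.html response look
something like this (sketch only; the class name and numbers are made up):

import javax.servlet.http.HttpServletResponse;

public class HostPageHeaders {
  // Option 1: don't let caches reuse the response at all; every request comes
  // back to the server, which is cheap since the generated host page is small.
  static void markUncacheable(HttpServletResponse resp) {
    resp.setHeader("Cache-Control", "no-cache, must-revalidate");
    resp.setDateHeader("Expires", 0);
  }

  // Option 2: allow caching, but tell caches the response varies with the
  // request headers used for the deferred binding decision.  The cache can't
  // know that IE 7.0 vs 7.0.1, or other User-Agent noise, doesn't matter to
  // you, so hit rates will be lower than they could be.
  static void markVaryCacheable(HttpServletResponse resp) {
    resp.setHeader("Cache-Control", "public, max-age=300, must-revalidate");
    resp.setHeader("Vary", "User-Agent, Accept-Language");
  }
}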

-- 
John A. Tamplin
Software Engineer (GWT), Google
