On Fri, 21 Jan 2005, Chad Gard wrote:

> That's what I'll do if I can't make it work the way I want.  However, 
> I'd like to be able to avoid the overhead of additional queries, and 
> additional browser connections to the web server.  I already get all 
> the data I need for other parts of the page, but there could be as 
> many as 25 small graphs in the page, with 9 being typical.

I think you're out of luck.

What you want, in essence, is a way to generate an HTML document that 
contains, inline, all the data to compose the page. So instead of

  <img src="foo.png" alt="Foo" height="42" width="100">

you seem to want something more like this:

  <img alt="Foo" height="42" width="100">
  <!-- lots of binary data here -->
  </img>

This is not, as far as I know, remotely possible.

You may be able to get something similar with XHTML, or maybe a 
combination of XHTML and SVG -- a vector graphics format that encodes 
the instructions for producing an image as XML. But if you stick to 
regular old HTML that can be expected to work in any browser in 
contemporary use (which will include some users of IE 4+, Netscape 4+, 
etc.), this is going to be a dead end.
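
For what it's worth, inline SVG in an XHTML document looks roughly like 
this -- the markup below is only an illustration, it has to be served 
as application/xhtml+xml, and at the moment only a handful of browsers 
and plugins will render it:

  <svg xmlns="http://www.w3.org/2000/svg" width="100" height="42">
    <rect x="0"  y="10" width="30" height="32" fill="#369"/>
    <rect x="40" y="20" width="30" height="22" fill="#369"/>
  </svg>

So it's a curiosity more than a solution for the audience you describe.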

Why are you so worried about the additional queries? They don't really 
produce a lot of overhead. If your concern is that the initial request 
gets all the data it needs to generate everything, and you don't want 
to replicate all that effort, you can look into some caching tricks (a 
rough sketch follows the list):

 * have the initial request generate and cache the images

 * have a request for an image check the cache before attempting to 
generate anything: if it's there, return it, if not, generate and cache 
it for future use

 * have the cache pruned automatically by some combination of age (drop 
old or unused entries) and size (set a ceiling on how much it can grow)
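
For what it's worth, the second and third bullets boil down to something 
like this rough, untested CGI sketch -- generate_graph(), the cache 
directory, and the 'graph' parameter are all placeholders for whatever 
your setup actually uses:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use CGI;
  use File::Spec;

  my $cache_dir = '/var/cache/graphs';   # assumed: exists and is writable
  my $max_age   = 15 * 60;               # anything older is considered stale

  my $q  = CGI->new;
  my $id = $q->param('graph') || 'default';
  $id =~ s/[^\w-]//g;                    # never trust input used in a filename

  my $file = File::Spec->catfile($cache_dir, "$id.png");

  unless (-e $file && time - (stat $file)[9] < $max_age) {
      # Not cached yet (or stale): generate the image and cache it.
      my $png = generate_graph($id);     # your DB query + GD/GD::Graph code
      open my $out, '>', $file or die "can't write $file: $!";
      binmode $out;
      print $out $png;
      close $out;
  }

  # Either way, serve whatever is now in the cache.
  print $q->header(-type => 'image/png');
  open my $in, '<', $file or die "can't read $file: $!";
  binmode $in;
  print while <$in>;
  close $in;

The pruning in the last bullet can be as simple as a cron job that 
deletes cache files older than some cutoff.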

I really think that this is your best available approach. Serving 
everything in one document isn't really a viable option. If you want to 
get better performance on the server, some kind of caching system is 
probably the way to go. If the approach above is too complex, you can 
consider setting up a caching proxy server between the clients and the 
server that generates the pages -- both Apache and Squid can do this, 
among others. 
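
A minimal front-end config for that sort of setup might look something 
like this (a sketch only, assuming Apache 2.x with mod_proxy, mod_cache, 
and mod_disk_cache loaded; the hostnames and paths are invented):

  <VirtualHost *:80>
      ServerName www.example.com
      ProxyRequests Off
      ProxyPass        / http://backend.example.com:8080/
      ProxyPassReverse / http://backend.example.com:8080/
      CacheEnable disk /
      CacheRoot   /var/cache/apache
  </VirtualHost>

Squid can do the same job running as an httpd accelerator (reverse 
proxy); which one to pick mostly depends on what you already run.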

-- 
Chris Devers
