Brent Barr <b.b...@f5.com> wrote:

> I've gone to a system that uses a python file to generate the graphs on the 
> fly.  Request the HTML page and it takes 5+ seconds to go get the data and 
> generate the charts for that page.  Means I only generate charts a few times 
> a week, and that the data is always the freshest.

That's how I do it (though I'm on GNU/Linux).
All the web pages (or more accurately, the images) are CGI scripts that 
generate an rrdtool graph command on the fly. Most of them take parameters, for 
example setting the period to day, week, etc., or selecting an IP. I have the 
--lazy option set, so rrdtool will only actually generate a new graph if the 
data is newer than the cached image file.
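A minimal sketch of such a CGI wrapper, in the spirit of the setup described above (the paths, file naming scheme, data-source names "in"/"out", and parameter defaults are all illustrative assumptions, not the poster's actual script):

```python
#!/usr/bin/env python3
"""CGI sketch: serve a traffic graph, letting rrdtool's --lazy option
skip regeneration when the cached PNG is still fresher than the data."""

import os
import subprocess
import sys

# Hypothetical locations -- adjust to your own layout.
RRD_DIR = "/var/rrd"
CACHE_DIR = "/var/cache/graphs"

PERIODS = {"day": "-1d", "week": "-1w", "month": "-1m", "year": "-1y"}

def graph_command(ip, period, direction="in", show_max=False):
    """Build the rrdtool graph argument list for one image."""
    rrd = os.path.join(RRD_DIR, f"{ip}.rrd")
    png = os.path.join(
        CACHE_DIR,
        f"{ip}-{direction}-{period}{'-max' if show_max else ''}.png",
    )
    cmd = [
        "rrdtool", "graph", png,
        "--lazy",                 # regenerate only if the RRD has newer data
        "--start", PERIODS[period],
        f"DEF:avg={rrd}:{direction}:AVERAGE",
        f"LINE1:avg#0000ff:{direction} (avg)",
    ]
    if show_max:
        # Optional MAX line -- diverges from the average once data
        # gets aggregated into coarser archives.
        cmd += [
            f"DEF:peak={rrd}:{direction}:MAX",
            f"LINE1:peak#ff0000:{direction} (max)",
        ]
    return cmd, png

def main():
    # In a real script the parameters would come from the query string
    # (urllib.parse on QUERY_STRING); hard-coded here for brevity.
    cmd, png = graph_command("192.0.2.10", "week", "in", show_max=True)
    subprocess.run(cmd, check=True)   # rrdtool rewrites png only if stale
    sys.stdout.write("Content-Type: image/png\r\n\r\n")
    sys.stdout.flush()
    with open(png, "rb") as f:
        sys.stdout.buffer.write(f.read())

if __name__ == "__main__" and os.environ.get("GATEWAY_INTERFACE"):
    main()  # only runs when invoked as a CGI by the web server
```

The cheap part is that --lazy does the cache check inside rrdtool itself: the script can be invoked for every image request, and only the stale ones cost anything.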
With the number of potential images, it would be a huge waste of resources (not 
to mention needing a much, much more powerful system) to generate the thousands 
of potential graphs - most of which will rarely (if ever) be looked at.

An example:
We have a /24 subnet at work, and I log the traffic by IP address at the border 
gateway. Excluding the network and broadcast addresses, that's 254 IPs, with in 
and out for each. I can draw a graph for any IP for a day, week, month, or year 
- and with or without a line to show max (significantly different from average 
once you start aggregating data). 254x4x2 = 2032 graphs, and that's only part 
of the logging I do. Most of them will not be looked at other than "once in a 
while" so it would be a complete waste of resources to regenerate them 
periodically.
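A quick sanity check of that arithmetic (reading the final x2 as the with/without-max toggle, with in and out drawn on the same image; the addresses are illustrative):

```python
from itertools import product

# A /24 minus the network and broadcast addresses -> 254 usable hosts.
ips = [f"192.0.2.{host}" for host in range(1, 255)]
periods = ["day", "week", "month", "year"]
max_line = [False, True]          # with or without the MAX line

variants = list(product(ips, periods, max_line))
print(len(variants))              # 2032 potential graphs
```

Generating all 2032 on a schedule would mean thousands of rrdtool runs per update cycle; with the on-demand approach, only the handful actually viewed ever get drawn.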

_______________________________________________
rrd-users mailing list
rrd-users@lists.oetiker.ch
https://lists.oetiker.ch/cgi-bin/listinfo/rrd-users
