On Tue, Mar 12, 2002 at 02:49:48PM +0100, Myk Melez wrote:
> Interestingly, FLUSH appears to increase the overall performance of
> template processing.
Hi Myk,

I just had a chance to take a look at your patch.

There's a minor problem in that you can't safely change the parameter
order of the process() method.  There will be code out there (e.g.
plugins) calling $context->process($template, $vars) which won't expect
to have to call $context->process($template, $outstream, $vars), so
you'd have to move the outstream down to be the last parameter.

The larger problem, as you've already identified, is that there's no
way to correctly flush output in the right order without explicitly
coding it in your template, and I'm very wary about adding something as
a core standard that can create unpredictable behaviour.

I guess the speedup is due to the fact that you're flushing output
rather than building up a large string containing the output.
Presumably Perl saves on the memory allocation and doesn't have to copy
potentially large strings on and off the stack all the time.

This is another one of the things that I wanted to crack properly for
version 3.  For starters, we should be using string refs instead of
strings for storing the output (faster to pass around).  Better still
would be to have an output buffer object which is effectively a stack
of output strings representing the nested output accrued so far.
Calling flush() on the output object would Do The Right Thing without
us having to worry about the underlying magic.  Not exactly sure of the
details yet, but there's a very rough sketch of the kind of thing I
mean at the end of this mail.

There is, of course, the delicate balancing act between having a
flexible output method and having a fast one.  For example, calling
$context->output($stuff) every time instead of $output .= $stuff would
make it possible to flush output as it is collected, but would make
things slower.

> but the data does suggest that flushing to the output
> stream regularly could significantly increase the general performance of
> the Template Toolkit.

I'll think some more on this...
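Something along these lines, perhaps.  This is completely untested, and
the names (Template::OutputStack, push_frame() and friends) are made up
purely for illustration -- none of it exists in TT2:

    # Hypothetical sketch only: a stack of output buffers, one frame
    # per nested template, with a flush() that only writes when safe.
    package Template::OutputStack;
    use strict;
    use warnings;

    sub new {
        my ($class, $fh) = @_;
        # $fh is the final destination for flushed output;
        # frames is the stack of accrued output strings
        return bless {
            fh     => $fh || \*STDOUT,
            frames => [ '' ],
        }, $class;
    }

    # start a fresh buffer for a nested template (INCLUDE, etc.)
    sub push_frame {
        my $self = shift;
        push @{ $self->{frames} }, '';
    }

    # finish a nested template: its output gets appended to the parent
    # buffer, so everything stays in document order (assumes balanced
    # push/pop calls)
    sub pop_frame {
        my $self = shift;
        my $text = pop @{ $self->{frames} };
        $self->{frames}[-1] .= $text;
        return $text;
    }

    # collect a chunk of generated output into the innermost buffer
    sub output {
        my ($self, $text) = @_;
        $self->{frames}[-1] .= $text;
    }

    # flush() only writes when we're back at the outermost frame, so
    # calling it from inside a nested template can't emit output in
    # the wrong order -- it simply does nothing until it's safe
    sub flush {
        my $self = shift;
        return if @{ $self->{frames} } > 1;
        print { $self->{fh} } $self->{frames}[0];
        $self->{frames}[0] = '';
    }

    1;

The context (or whatever ends up driving the templates) would call
push_frame() before processing a nested template, pop_frame()
afterwards, and could call flush() wherever a FLUSH appears.  Anything
that post-processes its content (WRAPPER being the obvious case) would
still need special care, which is the unpredictable behaviour problem
again.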

A