> On 1 Mar 2017, at 09:04, Vincent Massol <[email protected]> wrote:
> 
> Hi,
> 
>> On 28 Feb 2017, at 23:25, ktc <[email protected]> wrote:
>> 
>> That extension is on the right track for what we need; unfortunately it
>> only works for Groovy.
>> 
>> I wasn't thinking that it would have to be at the JVM level, but maybe at
>> the rendering context level, or the context level in general? The context
>> could then keep track of how deeply nested the inclusions are, as well as
>> how long the request has been running. These limits could be checked at
>> critical times while rendering is occurring, and an Exception thrown to
>> abort the process should a limit be exceeded. It wouldn't necessarily have
>> to guarantee 100% accuracy in the first iteration of the feature, but a
>> best effort could already provide some protection.
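
For illustration, here is a rough sketch of what such a limit-tracking
rendering context could look like. The class and method names are invented
for this example and are not part of the actual XWiki API:

// Rough sketch only: names are illustrative, not the real XWiki API.
public class RenderingLimitContext
{
    private final int maxInclusionDepth;
    private final long maxDurationMillis;
    private final long startMillis = System.currentTimeMillis();
    private int inclusionDepth;

    public RenderingLimitContext(int maxInclusionDepth, long maxDurationMillis)
    {
        this.maxInclusionDepth = maxInclusionDepth;
        this.maxDurationMillis = maxDurationMillis;
    }

    // Called when an include starts; aborts if inclusions are nested too deeply.
    public void enterInclusion()
    {
        this.inclusionDepth++;
        if (this.inclusionDepth > this.maxInclusionDepth) {
            throw new IllegalStateException("Inclusion nesting limit exceeded: " + this.inclusionDepth);
        }
    }

    // Called when an include finishes.
    public void exitInclusion()
    {
        this.inclusionDepth--;
    }

    // Called at critical points while rendering; aborts once the request has run too long.
    public void checkTime()
    {
        if (System.currentTimeMillis() - this.startMillis > this.maxDurationMillis) {
            throw new IllegalStateException("Rendering time limit exceeded");
        }
    }
}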
> 
> If you check the groovy pas

^^^^
I meant “page” not “pas” ;)

Thanks
-Vincent

> I linked to in my first response, you’ll see that the limiter fails as soon
> as a Java API is called, so there’s indeed no way to guarantee a response
> time.
> 
> Now back to what you mentioned above, the only place where such a loop would
> make sense is the MacroTransformation component, since macros are what can
> take time (especially the script macros). So yes, it would be possible to
> stop the rendering there (i.e. stop evaluating transformations when they
> take too much time). Actually we already have a protection there to avoid an
> infinite cycle (a macro generating itself).
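
To make that concrete, a time check could be hooked into the transformation
loop along these lines. This is only a sketch: macro executions are modelled
as plain Runnables rather than XWiki's real MacroTransformation API, and it
reuses the hypothetical RenderingLimitContext from the sketch above:

import java.util.List;

// Rough sketch only: not XWiki's actual MacroTransformation component.
public class TimedMacroLoop
{
    private final RenderingLimitContext limits;

    public TimedMacroLoop(RenderingLimitContext limits)
    {
        this.limits = limits;
    }

    public void transform(List<Runnable> macroExecutions)
    {
        for (Runnable macro : macroExecutions) {
            // Stop evaluating transformations once the request exceeds its
            // time budget, similar in spirit to the existing protection
            // against a macro endlessly generating itself.
            this.limits.checkTime();
            macro.run();
        }
    }
}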
> 
> That’s certainly doable and not too hard, but you need to understand that it
> wouldn’t guarantee anything. Could you please raise a JIRA issue for this
> idea so that it’s recorded, and so that anyone interested in implementing it
> can find it?
> 
> Thanks
> -Vincent
> 
>> --
>> View this message in context: 
>> http://xwiki.475771.n2.nabble.com/Page-Complexity-limiter-tp7602880p7602882.html
>> Sent from the XWiki-Dev mailing list archive at Nabble.com.
