Hi Denis,

On 12/8/2010 12:04 PM, Denis Lila wrote:
I'm not sure how the closed interval is awkward.  Isn't it just a proper
choice of ">= and <= vs. > and <" in the testing method?

In the filtering function, yes, but I was referring to cubicRootsInAB in
Helpers:122-133 where we iterate through intervals. For each interval,
we have the values of the function at the ends, and if the left one
is 0, we just add it as a zero and skip the call to CubicNewton. In order
to get roots in [A,B], we would either have to test both endpoints (which
would be more expensive and would force us to find a way of avoiding
duplicate roots), or we would have to handle the last interval as
a special case. The latter is not that bad, but it is more awkward than
what we have now.
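
Roughly, the interval walk being described has this shape (just a sketch, not
the actual Helpers.cubicRootsInAB source; evalCubic and the bisection refine
below are stand-ins for the real evaluation and CubicNewton refinement, and the
ts[] boundaries are assumed to already bracket at most one root each):

    static double evalCubic(double[] eqn, double t) {
        // eqn[0] + eqn[1]*t + eqn[2]*t^2 + eqn[3]*t^3
        return ((eqn[3] * t + eqn[2]) * t + eqn[1]) * t + eqn[0];
    }

    // Stand-in for the Newton refinement: plain bisection.
    static double refine(double[] eqn, double lo, double hi) {
        for (int i = 0; i < 50; i++) {
            double mid = (lo + hi) / 2;
            if (evalCubic(eqn, lo) * evalCubic(eqn, mid) <= 0) hi = mid;
            else lo = mid;
        }
        return (lo + hi) / 2;
    }

    // ts[0] == A, ts[numTs-1] == B; returns roots in the half-open [A,B).
    static int rootsInHalfOpen(double[] eqn, double[] ts, int numTs,
                               double[] roots) {
        int numRoots = 0;
        for (int i = 0; i < numTs - 1; i++) {
            double fLeft = evalCubic(eqn, ts[i]);
            if (fLeft == 0.0) {
                // The left endpoint is itself a root: record it, skip Newton.
                roots[numRoots++] = ts[i];
            } else if (fLeft * evalCubic(eqn, ts[i + 1]) < 0.0) {
                // Sign change: refine a root strictly inside the interval.
                roots[numRoots++] = refine(eqn, ts[i], ts[i + 1]);
            }
        }
        // B is never examined as a left endpoint, so a root exactly at B is
        // not reported; a closed [A,B] would need a test of both endpoints
        // or special handling of the last interval, as described above.
        return numRoots;
    }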

Perhaps it would be better if RootsInAB advertised that it returns high-precision approximations, but that they won't be exact and roots near the endpoints may not be caught, so the caller should be prepared to evaluate the endpoints manually to see whether they represent interesting values for the purposes of why the roots were requested.

And then do that in the functions that call it.
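
Something like the following on the calling side (findRootsInAB, handleRoot,
and isInteresting are hypothetical placeholders here, not the actual Pisces
methods):

    // Accept the high-precision approximations from the root finder, then
    // check the endpoints separately, since roots at or very near A and B
    // may not be reported.
    int num = findRootsInAB(eqn, roots, A, B);
    for (int i = 0; i < num; i++) {
        handleRoot(roots[i]);
    }
    // The caller knows why it wanted the roots, so it decides whether the
    // endpoints themselves are "interesting" and handles them explicitly.
    if (isInteresting(eqn, A)) handleRoot(A);
    if (isInteresting(eqn, B)) handleRoot(B);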

How about "if the 3 segments of the control polygon are all close to
each other in length and angle", then the optimization applies.  Is
that easy to test?

Hmm, that would actually be extremely easy to test and it would cost almost
nothing. We already compute the control polygon lengths in onLeaf, and
we're already assuming that every leaf is "flat enough", so we probably
don't even need to check the angles. Comparing lengths should be good enough.
I'll try this out.
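
For concreteness, the kind of length comparison being described might look
like this (the tolerance is picked arbitrarily here, not taken from the
Pisces code):

    // If the three control polygon segments have nearly equal lengths,
    // the simple t = leaflen/linelen approximation should be good enough.
    static boolean polySegmentsSimilar(double l01, double l12, double l23) {
        final double TOL = 0.1;  // hypothetical relative tolerance
        double avg = (l01 + l12 + l23) / 3.0;
        return Math.abs(l01 - avg) <= TOL * avg
            && Math.abs(l12 - avg) <= TOL * avg
            && Math.abs(l23 - avg) <= TOL * avg;
    }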

Actually, even if the lengths aren't close, they may give you enough information about the acceleration along the curve to do a decent approximation of the accelerated t value. The t could be biased by some formula weighted by the ratios of the control polygon lengths.

As a very crude example, say you assumed that if the scaled leaf length fell within the first polygon segment's length then t should be proportionally a value from 0 to 1/3, if it fell between the first segment's length and the sum of the first two then it would be proportionally a value from 1/3 to 2/3, and so on. The code might look like this:

    polylen = l01 + l12 + l23
    linelen = l03
    // If l01 == l12 == l23 then most of the following becomes
    // a NOP and t = leaflen/linelen
    // Rescale the distance along the chord onto the control polygon:
    polyleaflen = leaflen * polylen / linelen
    if polyleaflen < l01 then t = (polyleaflen/l01)/3
    else if polyleaflen < l01+l12 then t = ((polyleaflen-l01)/l12 + 1)/3
    else t = ((polyleaflen-l01-l12)/l23 + 2)/3

An even better approximation would involve some more math, but even this crude version might be better than simply using linear interpolation along the segment connecting the curve's endpoints.
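
Spelled out in Java (the names just mirror the pseudocode above; this is still
only the crude first cut, not that better approximation):

    // Bias the linear t by the ratios of the control polygon segment lengths.
    static double biasedT(double leaflen, double linelen,
                          double l01, double l12, double l23) {
        double polylen = l01 + l12 + l23;
        // Rescale the distance walked along the chord onto the control polygon.
        double polyleaflen = leaflen * polylen / linelen;
        if (polyleaflen < l01) {
            return (polyleaflen / l01) / 3;
        } else if (polyleaflen < l01 + l12) {
            return ((polyleaflen - l01) / l12 + 1) / 3;
        } else {
            return ((polyleaflen - l01 - l12) / l23 + 2) / 3;
        }
    }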

(Also, note that in the original code we probably would have just been dashing along the flattened curve anyway, and so we might have been using the raw linear t in that case; anything we do here is a refinement of what we used to do and "icing on the cake" to some extent)...

                        ...jim
