On Apr 24, 1:50 pm, mark mcclure <[email protected]> wrote:
> On Apr 23, 10:55 am, Wally <[email protected]> wrote:
>
> > If speed & accuracy are important, look at:
> >  www.polyarc.us/pack
> >  www.polyarc.us/packer.js
>
> It really depends on when you need the speed.  In my
> experiments, packer encodes lines faster than the polyline
> encoder *but* the performance of the map after encoding is
> much better using the polyline encoder.
>
> Here's an example
> Encode:http://facstaff.unca.edu/mcmcclur/GoogleMaps/EncodePolyline/EncodeAnd...
> Packer:http://facstaff.unca.edu/mcmcclur/GoogleMaps/EncodePolyline/EncodeAnd...
>
> Both of these maps display exactly the same data - an outline of
> the British coastline using about 50,000 points.  Both of them
> are encoded "on the fly" to give you a sense of encoding
> times.  Packer encodes in 12-15 seconds while the polyline
> encoder encodes in 20-25 seconds.  Indeed, packer is faster at
> encoding.  Now hit the zoom button once.  The polyline encoder
> version changes levels in well under a second.  The packer
> version changes levels in about 5 seconds.  These tests were
> performed on my 2.16 GHz MacBook Pro running Firefox.

The zoom level algorithm can be changed.  A less aggressive sequence
of zoom levels may improve performance but hurt appearance at very
deep zoom levels.  It requires some experimenting.
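
To make that concrete, here is one illustrative shape such a rule could
take (the names, and the rule itself, are mine for illustration -- not
packer.js internals).  A point whose Douglas-Peucker error is `dist` is
assigned a level, with higher levels surviving at more zoomed-out views;
raising `zoomFactor` or lowering `numLevels` gives a less aggressive
sequence:

```javascript
// Illustrative level assignment: points with larger simplification
// error get higher levels and so remain visible at coarser zooms.
function computeLevel(dist, verySmall, numLevels, zoomFactor) {
  if (dist <= verySmall) return 0;       // kept only at the deepest zoom
  var level = Math.floor(Math.log(dist / verySmall) / Math.log(zoomFactor));
  return Math.min(level, numLevels - 1); // cap at the coarsest level
}
```

Tuning `zoomFactor` up makes each level span more zooms, so fewer level
changes happen while zooming -- faster, but coarser when zoomed way in.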

> It seems to me that rendering speed after encoding is much more
> important than encoding speed, particularly when you have the
> option to pre-encode.  The real way to improve encoding speed
> is to implement the encoding algorithm in a compiled language,
> like C.  Then the encoding will run about 1000 times faster and
> this issue is moot.

I agree.  Unfortunately, the API requires "fromEncoded" polys in order
to produce polys with holes.  The API ought to generate the zoom level
arrays automatically for standard GPolys too, but again it does not.
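
For anyone wanting to port it, the per-coordinate step of Google's
documented encoded-polyline format is only a few lines.  A plain
Javascript sketch (the standard algorithm as documented, not packer.js
or Mark's encoder verbatim):

```javascript
// Encode one lat or lng value per Google's encoded-polyline format:
// scale to 1e-5 precision, fold the sign into bit 0, then emit 5-bit
// chunks low-to-high, OR-ing 0x20 onto every chunk but the last.
function encodeValue(coord) {
  var v = Math.round(coord * 1e5);
  v = v < 0 ? ~(v << 1) : (v << 1);      // zig-zag the sign bit
  var out = "";
  while (v >= 0x20) {
    out += String.fromCharCode((0x20 | (v & 0x1f)) + 63);
    v >>= 5;
  }
  return out + String.fromCharCode(v + 63);
}
// encodeValue(38.5)   -> "_p~iF"   (the example pair from Google's docs)
// encodeValue(-120.2) -> "~ps|U"
```

In a full encoder each point after the first is delta-encoded against
the previous point before this step, which is what keeps the strings
short.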

> I wrote the polyline encoder in Javascript
> mainly because I knew that everyone in this group would likely
> be able to read and understand Javascript.  Andrew contacted me
> and told me about some newer ports and I'll add those to the
> polyline encoding page.
>
> > It uses integer arithmetic for speed.  It eliminates Lat/Lon
> > projection bias for accuracy.  It has no function calls in the
> > main loop.  It converges quickly with multiple pivots in the
> > same pass.  It compensates for a Douglas-Peucker algorithm
> > deficiency (D-P was designed for polylines not for polygons).
>
> I don't get this.  (1) In compiled languages at least, I expect
> floating point arithmetic to be the fastest available.  Is
> there something about Javascript that makes integer arithmetic
> faster?

Actually, it is just the opposite: fixed-point math is faster than
floating-point math.  In the old 80x86 days, a separate 80x87 floating-
point chip did the floating-point math.  On today's chips, with
multiple cores, caching & pipelining, it may matter less.
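
One caveat for Javascript specifically: every number is a double, so the
practical win from the fixed-point style is avoiding square roots and
divisions in the inner loop rather than integer hardware as such.  A
sketch of that idea (illustrative, not packer.js code): the
Douglas-Peucker distance test can be done entirely with multiplies on
e5-scaled integer coordinates.

```javascript
// Is point p within tolerance tolE5 of segment a-b?  All coordinates
// and the tolerance are in integer "e5" units (degrees * 1e5, rounded).
// No sqrt, no division: compare squared quantities instead.
function withinTolerance(px, py, ax, ay, bx, by, tolE5) {
  var dx = bx - ax, dy = by - ay;
  // cross = twice the signed area of triangle (a, b, p)
  var cross = dx * (py - ay) - dy * (px - ax);
  // dist^2 = cross^2 / (dx^2 + dy^2); cross-multiply to avoid dividing
  return cross * cross <= tolE5 * tolE5 * (dx * dx + dy * dy);
}
```

In C the same trick pays twice: integer multiplies are cheap and the
sqrt/divide never happens at all.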

> (2)  What's this Lat/Lng bias?  The API expects Lat/Lngs.

A degree of latitude is not equal to a degree of longitude except at
the equator, and the API must project Lat/Lon coordinates into pixels
in order to render the poly.  A simplifier that measures error in raw
degrees therefore over-weights east-west error away from the equator.
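
The shrinkage is just the cosine of the latitude.  A small sketch
(spherical-Earth approximation; the function name and constant are mine,
for illustration):

```javascript
// Ground distance spanned by one degree of longitude at a given
// latitude.  A degree of latitude spans ~111.3 km everywhere; a degree
// of longitude spans that same distance scaled by cos(latitude).
function metersPerDegreeLng(latDeg) {
  var EARTH_RADIUS = 6378137;                       // meters, equatorial
  var metersPerDegLat = Math.PI * EARTH_RADIUS / 180;
  return metersPerDegLat * Math.cos(latDeg * Math.PI / 180);
}
```

At 60 degrees north (roughly the Shetlands) a degree of longitude is
only half a degree of latitude, which is why the bias matters for a
British coastline.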

> (3) Douglas-Peucker works fine with polygons, as the example
> provided here shows.

Rather than use an arbitrary starting point, I split the polygon into
four polylines at the extreme points (the four points tangent to the
bounding box).  In the case of ties, it may produce additional line
segments.  I played with the entire convex hull but the improvement
was not worth the extra overhead.
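
The splitting step described above can be sketched like this
(illustrative code, not the packer.js implementation; this version
breaks ties by first occurrence, so it may return fewer than four
pieces):

```javascript
// Cut a closed ring at the vertices touching its bounding box,
// yielding up to four open polylines that D-P can simplify safely --
// the extreme points, being cut endpoints, are never dropped.
function splitRingAtExtremes(ring) {     // ring: [[lat, lng], ...]
  var iMinLat = 0, iMaxLat = 0, iMinLng = 0, iMaxLng = 0;
  for (var i = 1; i < ring.length; i++) {
    if (ring[i][0] < ring[iMinLat][0]) iMinLat = i;
    if (ring[i][0] > ring[iMaxLat][0]) iMaxLat = i;
    if (ring[i][1] < ring[iMinLng][1]) iMinLng = i;
    if (ring[i][1] > ring[iMaxLng][1]) iMaxLng = i;
  }
  var cuts = [iMinLat, iMaxLat, iMinLng, iMaxLng]
      .filter(function (v, idx, arr) { return arr.indexOf(v) === idx; })
      .sort(function (a, b) { return a - b; });
  var pieces = [];
  for (var c = 0; c < cuts.length; c++) {
    var start = cuts[c], end = cuts[(c + 1) % cuts.length];
    var piece = [];
    for (var j = start; j !== end; j = (j + 1) % ring.length) {
      piece.push(ring[j]);
    }
    piece.push(ring[end]);               // each piece ends at the next cut
    pieces.push(piece);
  }
  return pieces;
}
```

Each piece is then an ordinary polyline with fixed endpoints, so D-P's
assumption (simplify between two anchored ends) holds.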

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google Maps API" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/Google-Maps-API?hl=en
-~----------~----~----~----~------~----~------~--~---
