Re: [Talk-us] Over-digitized imports?

2010-08-24 Thread Alan Mintz

At 2010-08-19 20:24, Alan Mintz wrote:
I'm mapping in this area: 
http://www.openstreetmap.org/?lat=34.08242&lon=-118.639&zoom=17


Along the north side of the tertiary road (whose name is not rendered, but 
is Saddle Peak Road) is a state park polygon (Topanga State Park) imported 
from CASIL. In this small segment, the road is approximated with fewer than 
70 nodes, while the park polygon segment alongside uses over 1200. I've 
noticed similar "beauty" in other data from this source and others (like 
the Bakersfield data mentioned recently on the list).


Should imports make an effort to un-smooth such data to some extent, for 
the benefit of editing and rendering performance, storage, etc?


As a test case, I used JOSM's Simplify Way on the ways that make up 
Topanga State Park. After playing around a bit, I set simplify-way.max-error 
(in the advanced preferences) to 0.2, which still modeled the curves to 
within single-digit-meter error, yet removed 73% of the nodes (from 9103 to 2466).
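
For anyone curious, Simplify Way is essentially an error-threshold pass in 
the spirit of Douglas-Peucker: a node can go whenever the way stays within 
the threshold of the original shape. A minimal sketch in Python (planar 
coordinates and made-up sample numbers, just for illustration - not JOSM's 
actual code):

def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def simplify(points, max_error):
    """Keep only the nodes needed to stay within max_error of the original way."""
    if len(points) < 3:
        return list(points)
    # Find the node farthest from the straight line between the endpoints.
    worst_i, worst_d = 0, 0.0
    for i in range(1, len(points) - 1):
        d = point_segment_distance(points[i], points[0], points[-1])
        if d > worst_d:
            worst_i, worst_d = i, d
    if worst_d <= max_error:
        return [points[0], points[-1]]  # everything in between can go
    # Otherwise keep that node and recurse on both halves.
    left = simplify(points[:worst_i + 1], max_error)
    right = simplify(points[worst_i:], max_error)
    return left[:-1] + right

way = [(0, 0), (1, 0.05), (2, -0.02), (3, 0.01), (4, 0)]
print(simplify(way, 0.1))  # -> [(0, 0), (4, 0)]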


The ways are:
38458997
38459009
38459010
38459013
45753168
45753173
45753175

I'm not embarking on a mission to simplify all ways - just taking the 
(short) time to do this when I'm mapping an area anyway and can see it 
would be of benefit.


--
Alan Mintz alan_mintz+...@earthlink.net


___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us


Re: [Talk-us] Over-digitized imports?

2010-08-24 Thread Gregory Arenius
Computer storage and processing time are relatively cheap and only getting
cheaper at an exponential rate.

OSM volunteer time is very limited.

Given those facts, I wouldn't worry about un-smoothing except in
particularly egregious cases. I just don't see the benefit of decreased
storage and processing costs as being worth anywhere near the cost in
man-hours it would take to do the un-smoothing work.

Cheers,
Greg
___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us


Re: [Talk-us] Over-digitized imports?

2010-08-24 Thread Zeke Farwell
Looking at some of those ways, I'd say your simplification is completely
warranted.  The curves still look very smooth after you've removed 73% of
the nodes.

I agree with others that storage is cheap and saving space in the DB may not
be that important.  More nodes make for smoother, more detailed ways, but
after a certain density is reached, additional nodes offer diminishing
returns.  Also, when nodes are very dense on a way, it becomes hard to
select that way without zooming very far in.  Clearly there's no need to
systematically remove excess nodes from already-imported data, but if
simplifying a way makes your editing easier and doesn't reduce the
detail/smoothness (clearly this is a judgement call), I say go for it.

Zeke


On Tue, Aug 24, 2010 at 4:14 AM, Alan Mintz
alan_mintz+...@earthlink.net
 wrote:

 At 2010-08-19 20:24, Alan Mintz wrote:

 I'm mapping in this area:
 http://www.openstreetmap.org/?lat=34.08242&lon=-118.639&zoom=17

 Along the north side of the tertiary road (whose name is not rendered, but
 is Saddle Peak Road) is a state park polygon (Topanga State Park) imported
 from CASIL. In this small segment, the road is approximated with fewer than
 70 nodes, while the park polygon segment alongside uses over 1200. I've
 noticed similar "beauty" in other data from this source and others (like the
 Bakersfield data mentioned recently on the list).

 Should imports make an effort to un-smooth such data to some extent, for
 the benefit of editing and rendering performance, storage, etc?


 As a test case, I used JOSM's Simplify Way on the ways that make up
 Topanga State Park. After playing around a bit, I set simplify-way.max-error
 (in the advanced preferences) to 0.2, which still modeled the curves to
 within single-digit-meter error, yet removed 73% of the nodes (from 9103 to 2466).

 The ways are:
 38458997
 38459009
 38459010
 38459013
 45753168
 45753173
 45753175

 I'm not embarking on a mission to simplify all ways - just taking the
 (short) time to do this when I'm mapping an area anyway and can see it
 would be of benefit.

 --
 Alan Mintz alan_mintz+...@earthlink.net




___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us


[Talk-us] Over-digitized imports?

2010-08-19 Thread Alan Mintz
I'm mapping in this area: 
http://www.openstreetmap.org/?lat=34.08242&lon=-118.639&zoom=17


Along the north side of the tertiary road (whose name is not rendered, but 
is Saddle Peak Road) is a state park polygon (Topanga State Park) imported 
from CASIL. In this small segment, the road is approximated with fewer than 
70 nodes, while the park polygon segment alongside uses over 1200. I've 
noticed similar "beauty" in other data from this source and others (like 
the Bakersfield data mentioned recently on the list).


Should imports make an effort to un-smooth such data to some extent, for 
the benefit of editing and rendering performance, storage, etc?


I suppose, if one really wanted not to lose the detail, some calculation 
could be done during import to determine the radius and arc degrees that 
were used to create the curves in the first place, and then tag those in 
some way (in case the day comes when we can feed that to the renderer 
instead of approximating curves with lines). Of course, it might make more 
sense to get the original curve data in this case instead of 
reverse-engineering it.
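
Just to sketch what that reverse-engineering might look like: fit a circle 
through a run of nodes (here, simply the first, middle, and last) and read 
off the radius and sweep. Planar coordinates are assumed, and the names and 
numbers are only illustrative:

import math

def circle_through(a, b, c):
    """Center and radius of the circle through three points (None if collinear)."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay) +
          (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx) +
          (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def arc_parameters(points):
    """Estimate (radius, sweep in degrees) for a run of nodes that traces an arc."""
    fit = circle_through(points[0], points[len(points) // 2], points[-1])
    if fit is None:
        return None  # nodes are (nearly) collinear - no arc to recover
    (ux, uy), r = fit
    a0 = math.atan2(points[0][1] - uy, points[0][0] - ux)
    a1 = math.atan2(points[-1][1] - uy, points[-1][0] - ux)
    sweep = math.degrees(abs(a1 - a0))
    return r, min(sweep, 360 - sweep)  # report the smaller of the two arcs

# Nodes every 5 degrees along a 100-unit-radius quarter circle:
arc = [(100 * math.cos(math.radians(t)), 100 * math.sin(math.radians(t)))
       for t in range(0, 91, 5)]
print(arc_parameters(arc))  # roughly (100.0, 90.0)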


--
Alan Mintz alan_mintz+...@earthlink.net


___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us


Re: [Talk-us] Over-digitized imports?

2010-08-19 Thread Nathan Edgars II
On Thu, Aug 19, 2010 at 11:24 PM, Alan Mintz
alan_mintz+...@earthlink.net wrote:
 Should imports make an effort to un-smooth such data to some extent, for
 the benefit of editing and rendering performance, storage, etc?

http://wiki.openstreetmap.org/wiki/Convert_shp_to_osm_using_grass_and_gpsbabel
recommends doing this.

___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us


Re: [Talk-us] Over-digitized imports?

2010-08-19 Thread Apollinaris Schoell

On 19 Aug 2010, at 20:35 , Nathan Edgars II wrote:

 On Thu, Aug 19, 2010 at 11:24 PM, Alan Mintz
 alan_mintz+...@earthlink.net wrote:
 Should imports make an effort to un-smooth such data to some extent, for
 the benefit of editing and rendering performance, storage, etc?
 

Hm, yes and no. Currently we only render to zoom level 17, but that might change 
in the future. And we don't map for a specific renderer. As long as the point 
density is reasonable, why not keep it?

 http://wiki.openstreetmap.org/wiki/Convert_shp_to_osm_using_grass_and_gpsbabel
 recommends doing this.
 

It describes a method, but I can't see any recommendation there.
BTW, JOSM can do the same; the default is a bit aggressive, but it can be changed 
in the advanced preferences.
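(For reference, the relevant key is simplify-way.max-error in the advanced 
preferences; if I remember right the default is 3 metres, whereas Alan used 
0.2 above.)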




___
Talk-us mailing list
Talk-us@openstreetmap.org
http://lists.openstreetmap.org/listinfo/talk-us