Hi,
I would just like to launch a little OSGDEM contest...
What is the biggest dataset you have processed with OSGDEM?
Up to now, I have processed the BlueMarble dataset:
* 86400x43200 color
+ 21600x10800 geometry
+ 86400x43200 normal map
+ 32768x16184 clouds
which took approximately 24 hours to process and produced 40 GB of final data.
Of course, I don't expect mine to be the biggest dataset around... but my
obvious (and admittedly evil) intent is to evaluate the potential risks of
processing a very big dataset, because I would like to process an
approximately 2.6 Gpix x 1.3 Gpix dataset.
That would give me a 4 TB dataset... and 13 levels.
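For reference, here is the rough pyramid arithmetic behind those figures, as a small Python sketch. The 256-pixel tile size, 3 bytes per pixel, and the helper names are my own assumptions for the estimate, not osgdem settings:

```python
import math

def pyramid_levels(width, height, tile=256):
    # Levels of detail in a tiled image pyramid: halve the image
    # until the largest dimension fits in a single tile.
    return max(1, math.ceil(math.log2(max(width, height) / tile)) + 1)

def pyramid_bytes(width, height, bytes_per_pixel=3):
    # Rough size of a full pyramid: the coarser levels add about
    # one third on top of the base (1 + 1/4 + 1/16 + ... = 4/3).
    return width * height * bytes_per_pixel * 4 / 3

# Sanity check on the BlueMarble colour layer (86400 x 43200, RGB):
print(pyramid_levels(86400, 43200))          # levels in the colour pyramid
print(pyramid_bytes(86400, 43200) / 2**30)   # GiB, colour imagery only
```

This only counts raw colour imagery, so the real osgdem output (geometry, normal maps, clouds, file overhead) comes out larger, as the 40 GB figure above shows.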
Do you think this is feasible? Will the processing time increase
exponentially?
Do you have experience with data of this size?
Thanks in advance for sharing your experiences.
--
Remy Deslignes
Ingénieur Développement / Software Engineer
Tel: +33 (0)1.53.90.11.19
===========================================
Silicon Worlds S.A.
224, rue Saint Denis
75002 Paris France
Tel: +33 (0)1.53.90.11.11
Fax: +33 (0)1.53.90.11.12
http://www.silicon-worlds.fr
===========================================
_______________________________________________
osg-users mailing list
osg-users@openscenegraph.net
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/