Greetings,

While testing lasblock yesterday (code checked out a few days ago and compiled under Linux 4.4), I ran into a problem with a big dataset:

h...@tudledoux:~/data/las/ahn2_original/gefilterd$ time lasblock -c 10000 -p 3 bigone.las
terminate called after throwing an instance of 'std::length_error'
  what():  vector::reserve
Aborted

real    0m5.066s
user    0m4.620s
sys     0m0.432s


From reading how lasblock works, I did expect it to struggle with very large datasets, but here it crashed after about 4 s. The dataset "bigone.las" has ~280 million points and is 5.3 GB. With smaller datasets (~20M points) it works flawlessly.

Is it simply that allocating two arrays of ~280 million elements is too much, and that is why it aborts?
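For what it's worth, I'm not familiar with the lasblock internals, but the unhandled std::length_error suggests the requested reserve size exceeded vector::max_size() rather than a plain out-of-memory condition (which would normally throw std::bad_alloc). A minimal sketch of that failure mode, unrelated to lasblock's actual code:

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<double> v;
    try {
        // Deliberately request more elements than the vector can ever hold:
        // reserve() throws std::length_error when n > max_size().
        v.reserve(v.max_size() + 1);
    } catch (const std::length_error& e) {
        // Left uncaught, this is exactly the
        // "terminate called after throwing ... vector::reserve" abort.
        std::cerr << "vector::reserve: " << e.what() << '\n';
    }
    return 0;
}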

Thanks for your help,
Hugo

-----------------------------------
Hugo Ledoux
Delft University of Technology | OTB, section GIS Technology
Jaffalaan 9 | 2628 BX Delft | the Netherlands
[email protected] | tel: +31 15 2786114 | fax: +31 15 2782745
http://www.gdmc.nl/ledoux | http://geomatics.tudelft.nl