On Dec 16, 2010, at 2:11 PM, Peter Tittmann wrote:

> Greetings all,
> 
> I wanted to see if someone could advise as to the most efficient use of 
> LibLAS for the following:
> 
> I have a series of 200-500 MB .las files with classified points.
> 
> I have a program (C++) that reads in .txt files and doesn't like them 
> bigger than about 10 MB. I have written some Python to batch the C++ app 
> for multiple tiles. Obviously, using LibLAS to load the .las files directly 
> into the app would be best, but due to time/resource constraints that's not 
> going to happen. 

I have just added the ability for lasblock to output .las files using the 
--write-points option.  Because of the way the chipper works, we can't filter 
*and* chip the data at the same time, so you'll have to do either the filtering 
or the chipping first, and then the other.  
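
For example, filtering first and then chipping might look like this (just a 
sketch; I'm going from memory on the las2las filter switches, so check 
las2las --help for the exact names):

las2las --input big_tile.las --output filtered.las --keep-classes 2
lasblock filtered.las --capacity 1000000 --write-points filtered_chips.las

The reverse order, chipping first and then filtering each chip, is what the 
example further down does.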

> 
> I need to subset the large tiles spatially and by return #/classification. My 
> idea at the moment is to write a script to batch las2las2 to produce text 
> files for each combination of spatial extent and point class, then use the 
> Python API to produce text files that are digestible by the app. 

las2las2 is gone.  It's all just las2las now.  The previous incarnation is 
called las2las-old, which retains the old calling conventions and arguments if 
you need it.
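
A sketch of the new calling convention with some filtering thrown in (again, 
the switch names below are from memory rather than gospel; las2las --help is 
authoritative):

las2las --input input.las --output subset.las \
        --extent "xmin ymin xmax ymax" \
        --keep-classes 2 --keep-returns 1

That would pull out only the class 2, first-return points inside the given 
bounding box.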

> 
> My question (finally) is whether this is the best way to approach this 
> problem from an efficiency standpoint. I have used the Python API to read 
> through the points in the original (large) .las files and spit out text files 
> matching my criteria, but it's very brute-force and slow. 

Using the above new functionality:

lasblock my_500_mb_file.las --capacity 1000000 --write-points million_point_file.las

#!/bin/bash
# run the filter over each chip that lasblock produced
for i in million_point_file*.las; do
        las2las "$i" "${i%.las}-filtered.las" --my-filter-options 
done
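
And if the app still needs its input as text files under 10 MB, something 
like this could finish the job (a sketch as well: I'm assuming las2txt's 
--parse switch and GNU split's -C option here, so adjust as needed):

for i in million_point_file*-filtered.las; do
        # dump x, y, z, classification as plain text
        las2txt --input "$i" --output "${i%.las}.txt" --parse xyzc
        # chop each text file into <= 10 MB pieces on line boundaries
        split -C 10M "${i%.las}.txt" "${i%.las}-part-"
done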

Hope this helps,

Howard