I'm really confused.

You're talking about putting the data into sqlite, which suggests there 
really isn't that much log data and it could be filtered with a hacky 
shell script. But then you're also talking about a lot of heavy 
optimisation, which suggests you may really need to put in custom 
effort. Precisely how much log data needs to be filtered? You're 
unlikely to filter the data much faster than the system utilities do; 
they are often very old, well-optimised C code. I'm reminded of the old 
story of the Knuth and McIlroy word-count programs.
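
For anyone who hasn't seen it: Knuth wrote a carefully crafted custom 
program to print the most common words in a text, and McIlroy's review 
showed the same job done by six stages of standard utilities. From 
memory, his pipeline was roughly:

    tr -cs A-Za-z '\n' |   # break the input into one word per line
    tr A-Z a-z |           # fold everything to lower case
    sort |                 # bring identical words together
    uniq -c |              # count each distinct word
    sort -rn |             # most frequent first
    sed ${1}q              # print the top $1 (the script's argument) and quit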

Anyway, while this is a very enlightening discussion, it is probably 
worthwhile to reuse as many existing system utilities, and as much 
existing code, as you can instead of writing your own.
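
For example, as a sketch only (the file names, the table name, and the 
assumption that error lines are tagged ' ERROR ' and contain no tab 
characters are all mine):

    # Let grep do the filtering; it is exactly the kind of old,
    # well-optimised C code I mean.
    grep ' ERROR ' app.log > errors.txt

    # Create a one-column table, then bulk-load the filtered lines.
    sqlite3 logs.db 'CREATE TABLE IF NOT EXISTS errors (line TEXT);'
    printf '.mode tabs\n.import errors.txt errors\n' | sqlite3 logs.db

From there it's ordinary SQL, and you haven't written any custom 
filtering code at all.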
