> There's the minimalist school of software design, perhaps best summarized by 
> cat-v.org's "Harmful Software" page. A lot of people dismiss it as a 
> hairshirt cult, but it's still important to understand their position.

Throwing away the structure of your data (or pretending that you can ignore it) 
has nothing to do with "minimalism".

> Anyway the question was "what would be the fastest way" (not the most 
> elegant, powerful, interoperable, future-proof, etc) to filter 100,000 CSV 
> records. This seems like a real-world business problem, and re-engineering 
> the whole business process to do it all "the right way" isn't always an 
> option.

_"I cannot import the data into a database because then it wouldn't be a 
real-world business problem anymore and it would be a re-engineering of a whole 
business process"._ Come on, you should know by now that you cannot fool me 
with big words.
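To make the point concrete: loading CSV rows into a database is a few lines of stdlib code, not a business-process re-engineering. Here is a minimal sketch using Python and an in-memory SQLite database; the column names and sample data are invented for illustration.

```python
import csv
import io
import sqlite3

# Invented sample data standing in for the 100,000-record CSV file.
CSV_DATA = """id,name,status
1,alice,active
2,bob,inactive
3,carol,active
"""

reader = csv.reader(io.StringIO(CSV_DATA))
next(reader)  # skip the header row

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE records (id TEXT, name TEXT, status TEXT)")
con.executemany("INSERT INTO records VALUES (?, ?, ?)", reader)

# Filtering is then a single declarative statement.
con.execute("DELETE FROM records WHERE status = 'inactive'")
remaining = [row[0] for row in con.execute("SELECT id FROM records ORDER BY id")]
print(remaining)  # ['1', '3']
```

With a real file you would pass `open("records.csv")` to `csv.reader` and connect to an on-disk database instead, but the shape of the code is the same.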

> I think the comparison of Nim and Golang is a great counter-argument against 
> taking minimalism to impractical extremes, but their arguments should still 
> be treated with respect.

Oh, there is no reason not to respect Pike, since he mostly agrees with me 
anyway: 
[http://doc.cat-v.org/bell_labs/good_bad_ugly/slides.pdf](http://doc.cat-v.org/bell_labs/good_bad_ugly/slides.pdf)

"Compare the famous spell pipeline with an interactive spell-checker."  
"Tool approach works only at a particular level; for instance, integration is 
limited in scope. Again, compare spell vs. spell-checkers; pipes of commands 
vs. COM; pushing data into troff vs. importing through OLE, etc."  
"Once, the commonest Unix program was grep. Today, it’s emacs or mozilla. 
People prefer integrated environments and browsers."

> So, regardless of where one stands on Ivory Tower anti-cat-v-ism questions, 
> this is a matter of finding the fastest grep.

No, the original problem was about _removing_ entries from a CSV file, 
something which grep cannot do particularly well... 
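The reason grep struggles is structural: a quoted CSV field may contain the separator, or even an embedded newline, so "one record" is not "one line". A sketch of a CSV-aware filter (the data and the filter condition are invented):

```python
import csv
import io

# A quoted field can contain commas and even a newline, so line-oriented
# tools like grep see the wrong record boundaries. csv.reader does not.
CSV_DATA = '''name,note
alice,"keeps, commas"
bob,"spans
two lines"
carol,plain
'''

rows = list(csv.reader(io.StringIO(CSV_DATA)))
header, records = rows[0], rows[1:]

# Remove entries by a field's value, not by a regex over raw lines.
kept = [r for r in records if r[0] != "bob"]

out = io.StringIO()
writer = csv.writer(out)          # re-quotes fields correctly on the way out
writer.writerow(header)
writer.writerows(kept)
print(out.getvalue())
```

Note that `csv.reader` parses the four input lines into three records, and the writer restores the quoting that a `grep -v` pipeline would have mangled.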
