On 2017-10-06 15:35, Fred Cisin via cctalk wrote:
On Fri, 6 Oct 2017, emanuel stiebler via cctalk wrote:
It was fun optimizing/tuning your software, when computers cost thousands of dollars, and the only thing which would make them go faster
was to work on the algorithms.
Now, it is much easier to make it faster. Just buy a faster, bigger one, and you don't have to worry about it ...

"Throw hardware at it."

I maintain that no matter how fast it is, there are still situations where it would be important to be even faster than the extra hardware can be with inefficient code.  And, no matter how fast it is, there are those who will slow it down by such things as downloading unwanted advertising along with the content ...

No matter how big it is, it is still going to be possible to come up with more data than will fit in RAM.  Such as Google's data, or the NSA LottaByte data center.  Or modeling weather?

I completely agree with you. (I probably should have put some smileys in there somewhere.) I do most of my work in embedded, and I'm still counting bytes and searching for better algorithms ...
