On 27/11/2014 23:17, Matt Mahoney via AGI wrote:

> I am intimately familiar with the process of developing data
> compression software. It is an iterative process. You think you have a
> good idea of what changes ought to improve compression. But then you
> do the experiment and you are right maybe less than half of the time.
> Even if you are a fast coder, you can see that development time is
> limited by the CPU power available to do the tests and gain a couple
> bits of knowledge.

That sounds unlikely to me. Most software companies spend an order of
magnitude more on human resources than they do on computational capacity.
In the case of compression, there are many possible changes that could be
tested in parallel - and the useful ones then combined to form the basis
of the next generation of the product. Even if the process has an
exceptionally slow build-test cycle, that doesn't stop the project from
developing in parallel - and absorbing human resources in the form of
programmers, testers, trainers, management, sales and marketing. Indeed,
the technical side of general-purpose data compression development is
fairly easy to parallelise, due to all the different sorts of data that
need to be compressed. Computers are pretty cheap these days; I think it
is rare for the machines to be a serious bottleneck.
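
To make the "test many changes in parallel, keep the winners" idea
concrete, here is a minimal sketch in Python. The candidate "changes"
are just different zlib level/strategy settings standing in for real
algorithmic variants, and the corpus, variant names and settings are all
invented for illustration - nothing here is from any actual compression
project's tooling.

```python
import zlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shared test corpus (deliberately repetitive).
CORPUS = b"the quick brown fox jumps over the lazy dog " * 500

def compressed_size(params):
    # Compress the shared corpus with one candidate's settings and
    # report (name, resulting size in bytes).
    name, level, strategy = params
    c = zlib.compressobj(level, zlib.DEFLATED, 15, 8, strategy)
    return name, len(c.compress(CORPUS) + c.flush())

# One baseline plus several candidate changes, all tested concurrently
# rather than one slow build-test cycle at a time.
variants = [
    ("baseline", 6, zlib.Z_DEFAULT_STRATEGY),
    ("max-effort", 9, zlib.Z_DEFAULT_STRATEGY),
    ("filtered", 9, zlib.Z_FILTERED),
    ("huffman-only", 9, zlib.Z_HUFFMAN_ONLY),
]

with ThreadPoolExecutor() as pool:
    results = dict(pool.map(compressed_size, variants))

baseline = results["baseline"]
winners = [n for n, s in results.items()
           if n != "baseline" and s < baseline]
print(f"baseline: {baseline} bytes; improvements over it: {winners}")
```

The useful variants (those that beat the baseline) would then be folded
into the next baseline, and the cycle repeated - the machines run the
experiments concurrently while the people generate more candidates.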
--
__________
 |im |yler  http://timtyler.org/  [email protected]  Remove lock to reply.


