On Fri 11 Aug, Sengan wrote:
> I don't buy this: for a long time the embedded hard realtime people
> refused to use CPUs with cache because they would be
> "non-deterministic".

(I assume "non-deterministic" in this context means we can't determine the
execution time of a bit of code, even knowing the initial state of variables
etc...)

I think you'll find that this is still the prevailing view for most in the
DSP community (and I can't think of many 'harder' real time applications).

> They finally gave up, 

What, all of them? :-)
The only people I know who have tried this for DSP have come to regret it.

> realizing that CPU's with caches are much faster.

Not necessarily. IMHO most conventional caches are a less than ideal silicon
fix for what is essentially a software problem. Typically DSPs use fast
internal RAM that provides the same service as a 'cache', but which bits of
code and data are cached is under the control of the software, not the
hardware.
(Some also have an additional 'cache', but let's not get into that...) 

> If garbage collection is relatively cheap and makes it 10x faster to
> make a reliable system, they'll end up using it too.

I would certainly agree that automatic garbage collection is far better
than the perils of using explicit allocations/deallocations from a
conventional heap. But I don't do that either, in code which I know has to
be reliable.

If I were using a language which did not allow me the option of static
allocation and destructive updates (using dynamic allocation and run-time GC
instead) I think in practice it would actually take far longer to make
systems _reliable_ (having first eliminated all bugs from my own code,
presumably). In fact I'm not convinced it's possible.
Useful? Yes. Cheap? Yes. Reliable? I'm sceptical :-(

How can I ensure that such a system won't ever run out of heap space?
Some kind of proof? Exhaustive testing?

> Non deterministic is not that important. Worst time is.

Yes, I agree in principle, but how would you determine the very worst case
time in a cached/GC machine? You need a _real_ worst case figure to know your
system performs to spec.

Regards
-- 
Adrian Hey

