On Sun, 12 Jun 2005 20:22:28 -0400, Roy Smith <[EMAIL PROTECTED]> wrote:

>How far down do you have to go?  What makes bytes of memory, data busses, 
>and CPUs the right level of abstraction?

They're things that IMO can genuinely be
accepted as "obvious". Even "counting" is not
the lowest level in mathematics... there is
the direction of mathematical philosophy.
From "counting" you can go "up" in the
construction direction (rationals, reals,
functions, continuity and the whole analysis
area), building on the counting concept, or
you can go "down", asking yourself what
counting really means, what you mean by a
"proof", what a "set" really is.
However, "counting" is naturally considered
obvious by our minds, and you can go through
your whole life without ever needing to look
at the lower levels, and without getting
bitten too badly by that simplification.

Also, below memory and data buses there is
of course more stuff (in our universe it
looks like there is *always* more stuff, no
matter where you look :-) ), but I would say
that's more about electronics than computer
science.

>Why shouldn't first-year CS students study "how a computer works" at the 
>level of individual logic gates?  After all, if you don't know how gates 
>work, things like address bus decoders, ALUs, register files, and the like 
>are all just magic (which you claim there is no room for).

It's magic if I'm curious but you can't answer
my questions. It's magic if I have to memorize
because I'm not *allowed* to understand.
It's not magic if I can (and naturally do)
just ignore it because I can accept it. It's
not magic if I have no questions because it's
"obvious" enough for me.

>> Also concrete->abstract shows a clear path; starting
>> in the middle and looking both up (to higher
>> abstractions) and down (to the implementation
>> details) is IMO much more confusing.
>
>At some point, you need to draw a line in the sand (so to speak) and say, 
>"I understand everything down to *here* and can do cool stuff with that 
>knowledge.  Below that, I'm willing to take on faith".  I suspect you would 
>agree that's true, even if we don't agree just where the line should be 
>drawn.  You seem to feel that the level of abstraction exposed by a 
>language like C is the right level.  I'm not convinced you need to go that 
>far down.  I'm certainly not convinced you need to start there.

I think that if you don't understand memory,
addresses, allocation and deallocation, or
(roughly) how a hard disk works and what the
difference is between hard disks and RAM,
then you're going to be a horrible programmer.

There's no way you will remember which
container operations are O(n), which are O(1)
and which are O(log(n)) unless you roughly
understand how they work. If those are magic
formulas you'll just forget them, and you'll
end up writing code that is thousands of
times slower than necessary.
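For instance, here is a small Python sketch
(my own example, with made-up sizes) of the
kind of thing I mean: membership testing is
O(n) on a list but O(1) on average for a set,
and once you picture the linear scan versus
the hash lookup, the formulas stop being magic:

```python
import timeit

# A list membership test scans elements one by
# one: O(n). A set uses a hash table: O(1) on
# average. Same question, very different cost.
n = 100_000
data_list = list(range(n))
data_set = set(data_list)

needle = n - 1  # worst case for the list scan

list_time = timeit.timeit(lambda: needle in data_list, number=100)
set_time = timeit.timeit(lambda: needle in data_set, number=100)

print(f"list: {list_time:.6f}s  set: {set_time:.6f}s")
```

On any ordinary machine the set lookup comes
out orders of magnitude faster; knowing *why*
(scan versus hash) is what makes the O()
stick in your memory.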

If you don't understand *why* "C" needs
malloc, then you'll forget to allocate your
objects.

Andrea
-- 
http://mail.python.org/mailman/listinfo/python-list