I don't think I quite understand what you are saying. Are you saying that mathematical models are not a good foundation for computer science because computers are really made out of electronic gates?
All I need to do is show that my model reduces to some basic physical
implementation (with perhaps some allowances for infinity), and then I can
promptly forget about that messy business and proceed to use my clean
mathematical model. The reason any model of computation exists is that it is
easier to think about a problem in some terms than in others. By showing how
to transform one model into another, you make it possible to choose exactly
how you wish to solve a problem. The reason we do not work directly in what
are called "von Neumann machines" is that they are not convenient for all
kinds of problems. However, since we can build a compiler to translate
anything into anything else, I don't see why anybody would care.

On Thu, Apr 18, 2013 at 5:53 PM, Mark Janssen <dreamingforw...@gmail.com> wrote:

> [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
>
> On Mon, Apr 15, 2013 at 2:53 AM, Moez AbdelGawad <moeza...@outlook.com> wrote:
>
> >> I'm not quite sure I understand your question, but I'll give it a shot.
> >> :-)
>
> I'm in this same camp too :)
>
> I am very thankful for the references given by everyone. Unfortunately my
> library does not have the titles, and it will be some time before I can
> acquire them. I hope it is not too intrusive to offer a few points that
> I've garnered from this conversation until I can study the history further.
>
> The main thing I notice is that there is a heavy "bias" in academia
> towards mathematical models. I understand that Turing machines, for
> example, were originally abstract computational concepts before there was
> an implementation in hardware, so I have some sympathy with that view.
> Yet should not the "Science" of "Computer Science" concern itself with how
> to map these abstract computational concepts onto actual computational
> hardware? Otherwise, why not keep the field within mathematics and
> philosophy (where Logic traditionally has been)?
> I find it remarkable, for example, that the simple continued application
> of And/Or/Not gates can perform all the computation that C.S. concerns
> itself with, and these form the basis for computer science in my mind,
> along with Boolean logic. (The implementation of digital logic in
> physical hardware is where C.S. stops and Engineering begins, I would
> argue.)
>
> But still, it seems that there are two ends, two poles, to the whole
> computer science enterprise that haven't been sufficiently *separated*
> so that they can be appreciated: logic gates vs. logical "calculus" and
> symbols. There is very little crossover as far as I can see. Perhaps the
> problem is the common use of the Greek root "logikos": in the former it
> pertains to binary arithmetic, while in the latter it retains its
> original Greek sense pertaining to *speech* and symbols ("logos").
> Further, one can notice that in the former the progression has been
> towards more sophisticated data structures (hence the evolution towards
> object orientation), whereas in the latter (I'm guessing, since it's not
> my area of expertise) the progression has been towards functional
> sophistication (where recursion seems to be paramount).
>
> In any case, I look forward to diving into the books and references
> you've all offered so generously, so that I can appreciate the field and
> its history better.
>
> Mark Janssen
> Pacific Lutheran University
> Tacoma, Washington
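As an aside, the universality claim in the quoted message (that repeated application of And/Or/Not gates can perform all the computation C.S. concerns itself with) is easy to demonstrate concretely. Here is a minimal Python sketch of my own (all function names are illustrative, not from any referenced text) that composes a 1-bit full adder and then a 4-bit ripple-carry adder out of nothing but those three gates:

```python
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # Exclusive-or built purely from the three gates:
    # (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    # Sum bit is a XOR b XOR carry_in; carry-out fires when
    # at least two of the three inputs are 1.
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add_4bit(x, y):
    # Ripple-carry addition of two 4-bit numbers, least
    # significant bit first; result is modulo 16, exactly as
    # in real 4-bit hardware.
    carry, total = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total

# Exhaustive check against ordinary integer addition (mod 16).
assert all(add_4bit(x, y) == (x + y) % 16
           for x in range(16) for y in range(16))
```

The same trick scales up: wider adders, multiplexers, and (with feedback for state) whole processors are just larger compositions of these three primitives, which is one way of reading the "logic gates" pole of the discussion.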
-- http://mail.python.org/mailman/listinfo/python-list