Software abstraction can be a red herring.  It is sometimes used as an
excuse for poor problem specification: I want something sufficiently
general to solve my particular problems, whatever those are.

A lot of the debate here seems to be about symbols, which are not an
abstraction at all: they directly correspond to the thing they
represent, as with the US flag and the republic for which it stands.
Most languages have operators which are not "intuitively obvious". For
example, in Perl, $; does not scream out "subscript separator" unless
you have learned it, and similarly, though maybe slightly less so, for
^= (bitwise XOR and assign in C++).  These operators tend to involve
non-alphanumeric characters to make parsing easier.  Arguing about
specific choices is like speculating about who in their right mind
would come up with K as the symbol for potassium, rather than, say, P.
Symbols tend to change over time.  Mathematical symbols (which for
some reason laypeople believe to be etched in stone) change quite
rapidly: Newton's Principia is unreadable to the average present-day
mathematician.

Abstractions seldom work out as neatly in computer science as in
mathematics.  Take addition, for example. Mathematicians happily add
anything where it is semiplausible to do so.  Adding numbers on
computers has pitfalls like overflow and type coercion, and the fact
that computer addition is not associative, a bane when using
"optimizing" compilers in numerical work.  J provides a nice
generalization to arrays, but even there you have to know about rank
before you can use it effectively.
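The associativity point above is easy to demonstrate (shown here in
Python rather than J, purely for illustration):

```python
# Floating-point addition is not associative: regrouping the same
# three operands gives two different results.
left = (0.1 + 0.2) + 0.3   # 0.1 + 0.2 already rounds to 0.30000000000000004
right = 0.1 + (0.2 + 0.3)  # 0.2 + 0.3 rounds to exactly 0.5
print(left == right)       # -> False
```

A compiler that reorders these sums "for speed" silently changes the
answer, which is why numerical code is sensitive to such
optimizations.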

I am working on a problem right now that involves finding the (linear
algebra, not J) rank of linear operators.  Abstractly, this is very
easy to do.  The operators can be represented by symmetric, positive
semidefinite matrices with integer entries.  All you have to do is
find the eigenvalues and see how many are positive.  There are many
theoretical ways of doing this, but they have different computational
requirements and different numerical stability properties.  For
example, finding the roots of the characteristic polynomial, which is
taught in undergraduate linear algebra classes, is a horrendous way to
do this.  Gaussian elimination over the integers can in principle
solve the problem exactly, but you rapidly run into huge numbers, and
computations become slow.  Any floating-point method, like singular
value decomposition (probably the method of choice), introduces the
problem of telling whether an eigenvalue is really meant to be zero.
Abstractions can be great tools of thought, but for writing computer
programs sometimes the devil is in the details.
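To make the exact-arithmetic route concrete, here is a sketch (my own
illustration, not the code from the problem above) of Gaussian
elimination over the rationals, which gives the rank exactly for
integer matrices, at the cost of intermediate entries that can grow
large:

```python
from fractions import Fraction

# Exact rank of an integer matrix by Gaussian elimination over the
# rationals.  No rounding, so no "is this eigenvalue really zero?"
# question, but the Fraction numerators and denominators can blow up
# on larger inputs, which is the slowdown mentioned above.
def exact_rank(rows):
    m = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(m), len(m[0])
    rank, col = 0, 0
    while rank < n_rows and col < n_cols:
        # Find a nonzero pivot in this column, at or below the current row.
        pivot = next((r for r in range(rank, n_rows) if m[r][col] != 0), None)
        if pivot is None:
            col += 1
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        # Eliminate the column below the pivot.
        for r in range(rank + 1, n_rows):
            f = m[r][col] / m[rank][col]
            for c in range(col, n_cols):
                m[r][c] -= f * m[rank][c]
        rank += 1
        col += 1
    return rank

print(exact_rank([[1, 2, 3], [2, 4, 6], [0, 1, 1]]))  # -> 2
```

An SVD-based version would instead count singular values above some
tolerance, which is exactly where the "really meant to be zero"
judgment call comes in.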

Abstractions obviously are useful, and ideas such as arrays,
stacks, and hash tables can be applied quite generally.  There is
something to be said for declarative languages like SQL, where you
specify what to do but not how to do it.  J does this in some cases,
most notably with sorting.  This puts the burden on the implementor
rather than the programmer, and seems to work well in limited problem
domains.  However, I don't believe in magic.
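The SQL point can be seen in miniature with SQLite (a made-up toy
table, just to illustrate):

```python
import sqlite3

# ORDER BY states *what* order is wanted; the engine decides *how*
# to sort.  The burden is on the implementor, not the programmer.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (name TEXT, n INTEGER)")
con.executemany("INSERT INTO t VALUES (?, ?)", [("b", 2), ("a", 3), ("c", 1)])
rows = con.execute("SELECT name FROM t ORDER BY n").fetchall()
print(rows)  # -> [('c',), ('b',), ('a',)]
```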

Best wishes,

John



----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
