Since we're off topic anyway, here's something interesting - I was born in 1938.
MIT had a 704 and punch cards and maybe FORTRAN in the fifties.
A classmate wrote FORTH as a portable program for big telescopes.

Programming tools have run off in different directions since then,
just as today's tool section in a Big Box is very different from
before the baby boom hit the market.

Everybody who needs a tool has a good chance of finding what
they're looking for, but they'll look for what they've used before,
or maybe something a little better.

Y'see, the human brain only weighs three pounds. It hasn't been
possible to fit the sum of human knowledge into one brain for one
or two hundred years, depending on the capability of the brain.

So we have to specialize, and the specialties keep getting narrower.

I helped design an industrial process control system in the eighties.
One architect couldn't do it - it took one for hardware (Motorola 68K),
one for programming (Unix), and one for process control (me, with 20
years of industrial control experience). It was a grand educational
experience.

So when someone says that this or that tool is absolutely useless, I
take that as a sign of narrow specialization, in the dark about the rest
of the world but egocentrically sure the rest of the world must be like
them. It's a law of human nature, like "The prospect of wealth motivates
deceit." Today's politicians seem unable to suppress that motivation.

Veering back towards the list topic, I found this in "Four Laws that 
Drive the Universe" by Peter Atkins (it's about thermodynamics):

"Energy is conserved because time is uniform: time flows steadily, it
does not bunch up and run faster then spread out and run slowly. ...
If time were to bunch up and spread out, then energy would not be
conserved."

But maybe uniform time is only a local effect, on the scale of the
Universe. As far as we know, conservation of energy is not violated, 
so we have to find the causes of observed time variations in the
hardware we've built or the programs we've written.

Wait, what about Einstein's relativity? The flow of time is still
uniform, in a manner that can be predicted from the equations.
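For what it's worth, Atkins's statement is a plain-language version of
Noether's theorem: if the laws of physics don't change with time
(time-translation symmetry), then energy is conserved. A minimal sketch, for
a system described by a Lagrangian L(q, q-dot) with no explicit time
dependence:

```latex
% If the Lagrangian does not depend explicitly on time,
%   \frac{\partial L}{\partial t} = 0,
% then the Hamiltonian (the total energy)
H = \sum_i \dot{q}_i \frac{\partial L}{\partial \dot{q}_i} - L
% satisfies
\frac{dH}{dt} = -\frac{\partial L}{\partial t} = 0,
% i.e. energy is conserved.
```

So "time bunching up" would mean an explicit time dependence in the laws
themselves, and the conservation law would indeed fail - which is Atkins's
point.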

Bill Hawkins


-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of wje
Sent: Saturday, August 16, 2008 11:10 AM
To: Discussion of precise time and frequency measurement
Subject: Re: [time-nuts] I want a good micro-controller

   You certainly don't need formal training to be a good programmer;
   I've seen plenty of code from CS grads that's terrible, and very
   nice code from art majors.
   In my book, a good program is one that's organized logically, well
   documented, and performs the job it was designed to do. A good
   programmer is someone who produces such programs. That's it. The
   problem is that, with the advent of PCs and easily-accessible
   programming tools, everyone thinks they can write code, and many
   can't. Then what you end up with is a tangled mess that's
   unmaintainable and indecipherable.
   It's interesting that any number of EEs will take great care in
   circuit design, but then throw together some poorly-designed code to
   run their beautiful circuit. But this has been endemic in the
   hardware industry for as long as I've been around. Hardware
   companies frequently have the attitude that it's the hardware that's
   important and the software is just one of those minor bits that has
   to get tacked on. This was true even for some companies that should
   have known better; there were plenty of HW engineers I ran into back
   in the old Digital days that, even though they were building
   minicomputers, really considered software an unfortunate requirement
   that had to be shipped with their beautiful hardware.
   Ah well, this is really wandering off-topic and my blood pressure's
   going up. I think I'll go write some C code for an 8-bit micro to
   calm down. And yes, I use vi. :)
Bill Ezell
----------
They said 'Windows or better'
so I used Linux.

   Scott Newell wrote:

At 07:36 AM 8/16/2008, wje wrote:
I have both EE and CS degrees, and I work in both worlds. In my humble
(but completely accurate and stable) opinion, Basic is not a programming
language. It's a tool of Satan designed to convince people that they are
programmers when they really should stick to their janitorial duties.
This is a subset of the general problem that everyone thinks they are
programmers, and usually think their code is perfect. But, that's a rant
for a different audience.


So, how do you tell if you're not a programmer, but pretending to be
one?  My code is far from perfect, but it can usually be made to get the
job done.  I try not to cut too many corners, and the ones that I do cut
bother me.  But when you're the lone programmer on projects, it's hard
to know if you're crummy or decent, since there's no one to measure
against.  (Of course, there's the metric of 'product shipped, product
works, bossman happy, paycheck cashed', but that doesn't distinguish
between good and bad programmers, just programmers that can fool others
along with themselves.)



_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to
https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.

