Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-18 Thread Tom Buskey

On 2/17/07, Jeffry Smith [EMAIL PROTECTED] wrote:




And don't forget that real Engineers (Professional Engineers) sign
their work and take responsibility for failures (reputation, money,
etc).



Not all real engineers need a PE.

Civil Engineers do.
Some Mechanical Engineers do.

There's a process for getting the PE stamp:
1) Get an engineering degree
2) take the Engineer in Training exam and pass
3) keep a journal of engineering decisions w/ a PE's verification
4) after 4(?) years you can submit that journal to take the PE exam

I got my EIT but didn't go beyond that.

For most computer-related jobs, you only need experience and to say you're
competent.  Then the hiring company has to agree with you.

Engineering has established:
material strengths
calculations
construction methods
factors of safety

I don't see that in most of the computer world.


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-18 Thread Jeffry Smith

On 2/18/07, Tom Buskey [EMAIL PROTECTED] wrote:

On 2/17/07, Jeffry Smith [EMAIL PROTECTED] wrote:


 And don't forget that real Engineers (Professional Engineers) sign
 their work and take responsibility for failures (reputation, money,
 etc).

Not all real engineers need a PE.


In some states you do - some have laws on the books saying that to call
yourself an Engineer you must have a PE - otherwise you're just an
Engineer in Training (EIT) or engineering staff.



Civil Engineers do.
Some Mechanical Engineers do.


Aerospace Engineers do.  In some states, if you're called an
Engineer, it means you have the PE.

Not everyone in a firm needs the PE, but at least for Aerospace, you
need a PE in the firm to sign off - putting his license on the line.




There's a process for getting the PE stamp:
1) Get an engineering degree
2) take the Engineer in Training exam and pass
3) keep a journal of engineering decisions w/ a PE's verification
4) after 4(?) years you can submit that journal to take the PE exam


5) Continue your studies (continuing education) - same as medical or
other professions.  It's not a one-time deal.



I got my EIT but didn't go beyond that.

For most computer related jobs, you only need experience and to say you're
competent.  Then the hiring company has to agree with you.

Engineering has established:
material strengths
calculations
construction methods
factors of safety


More importantly (at least for aerospace), it's got a mathematically
based, scientific method for approaching those, built on a body of
knowledge accumulated from the past.  As in repeatable results.  I
used to work in a materials lab.  One of the things we did was
destruction testing of composites.  Again and again.  To establish the
failure points and the root causes - so we could say with certainty
that this material, made from these composites in this proportion, laid
out in this pattern, will fail at X point - and anyone who repeated
the experiment would get the same results.



I don't see that in most of the computer world.



And that's why many argue it's not engineering - it's art.

jeff


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-17 Thread Jon 'maddog' Hall
On Sat, 2007-02-17 at 02:43 -0500, Bill McGonigle wrote:
 On Feb 17, 2007, at 01:22, Nigel Stewart wrote:
 
  The Engineers I've worked with tend towards the
  "just make it work" philosophy.  Interpret the
  spec as narrowly and specifically as possible,
  and rely on nobody being rude enough to point out
  the unhandled cases, fragile assumptions, or
  broader incompatibilities.  It's the schedule,
  stupid!
 
 That's a good illustration of why real Engineers get so worked up  
 when line computer programmers call themselves Engineers.
 
 'cause that's no way to build a bridge...
 
 -Bill
 
Like the classic "swings" cartoon about software development, I now picture
in my mind the bridge built by a software engineer:

o lots of fancy doodads that do nothing, but have to be painted
  every year so they do not rust
o exit ramps going off the bridge, most of which do not go
  anywhere, and let the car fall into the water

When I was just a young lad they built the Baltimore Beltway (now
unceremoniously called Interstate 695).  On that road was the first
multi-level overpass bridge I ever saw.  Besides the main roadway, there
were two levels of overpass, one directly above the other.  But the
topmost level of overpass did not have any roads going to it, and did
not for MANY years.  So you drove past it, seeing this topmost level
that no one could use, or even access.

Eventually they built roads to it, but I have never seen where those
roads go, or where they take you, so I wonder if the roads were merely
added to belatedly justify the third level...and ease the taxpayers'
minds that their money was not wasted.

md



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-17 Thread Jon 'maddog' Hall

 
 And don't forget that real Engineers (Professional Engineers) sign
 their work and take responsibility for failures (reputation, money,
 etc).
 
 jeff

In the case of FOSS, so do programmers (well, their reputation at
least).

And it is interesting that in study after study, whether in engineering
or on a manufacturing line, where the people have a chance to sign their
names to the product, the quality goes up.

md



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-16 Thread Bill McGonigle

On Feb 17, 2007, at 01:22, Nigel Stewart wrote:


The Engineers I've worked with tend towards the
"just make it work" philosophy.  Interpret the
spec as narrowly and specifically as possible,
and rely on nobody being rude enough to point out
the unhandled cases, fragile assumptions, or
broader incompatibilities.  It's the schedule,
stupid!


That's a good illustration of why real Engineers get so worked up  
when line computer programmers call themselves Engineers.


'cause that's no way to build a bridge...

-Bill

-
Bill McGonigle, Owner   Work: 603.448.4440
BFC Computing, LLC  Home: 603.448.1668
[EMAIL PROTECTED]   Cell: 603.252.2606
http://www.bfccomputing.com/Page: 603.442.1833
Blog: http://blog.bfccomputing.com/
VCard: http://bfccomputing.com/vcard/bill.vcf



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Ben Scott

On 2/14/07, Paul Lussier [EMAIL PROTECTED] wrote:

I find it mind-boggling that the Alpha came out what, 16-18 years ago
with 64 bit technology and it *still* hasn't caught on in the
mainstream.  Why is that?


 Well, AMD64 (64-bit address space) only became available on
mainstream hardware a year or two ago.  So there isn't that much
hardware out there.  That alone discourages closed-source vendors from
building 64-bit binaries.

 Next, 64-bit support with Microsoft Windows absolutely sucks.  For
Win XP, you have to get a special build of the OS, which is available
only by buying a new PC, and is supported only through the PC OEM --
call Microsoft, and they tell you to drop dead.  The situation is
little better with the server flavors of Windows.

 With Windows, 64-bit drivers are very hard to find.  Most hardware
not made in the past few years does not have drivers and never will.
Even newer stuff is poorly supported.  This is partly because nobody
wants to write drivers for an OS with almost no support.  However,
there's also the fact that the 64-bit versions of Windows will not
permit an unsigned driver to be loaded.  Period.  I guess you have
to submit everything to Microsoft for an expensive certification
process.  Microsoft innovation at work again.

 AMD64 (and Intel's clone) doesn't support Virtual Mode (running old
so-called 16-bit code) when switched into Long Mode (64-bit mode).
So anyone running old DOS or Win16 crap (and there's still a lot of
that) cannot even use the AMD64 stuff.

 Then there's the fact that every moron programmer in the world (and
there are legions of moron programmers) assumes integers and pointers
are 32 bits, and their code breaks horribly if recompiled for a 64-bit
architecture.  So even if you have source, it's not just a matter of
recompiling, in most cases.
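
A minimal C sketch of that assumption (my illustration, not from the
thread; the names are made up): round-tripping a pointer through an int
happens to work on ILP32, where int, long, and pointers are all 32 bits,
and silently truncates addresses on an LP64 target such as x86-64 Linux.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int value = 42;
    int *p = &value;

    /* Old 32-bit habit: stash a pointer in an int.  On ILP32 this
     * round-trips; on LP64 the cast throws away the upper 32 bits
     * of the address. */
    int stashed = (int)(intptr_t)p;
    int *q = (int *)(intptr_t)stashed;

    printf("sizeof(int)=%zu  sizeof(void *)=%zu\n",
           sizeof(int), sizeof(void *));
    printf("%s\n", (p == q) ? "pointer survived the round trip"
                            : "pointer was truncated");
    return 0;
}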

 I suspect Microsoft still has a lot of 64-bit unclean code.  Linux
has been dealing with this since 1995 on the Alpha, and most good
Linux programs (and quite a few bad ones) are 64-bit clean these days.
Think how bad the average Windows program is, and then realize that
half of them are worse.  (With apologies to George Carlin.)  In the
'doze world, it's a horror show.

 Meanwhile, most programs don't need a 64-bit address space at all.
They rarely need more than 24 bits of address space.  Sure, the
additional registers can yield a speed improvement for some code, but
not enough to make it overwhelmingly compelling.

 So there's little benefit, and lots of problems.

 This affects Linux mainly because there's a lot of stuff that isn't
really Linux software, but rather, 'doze software shoehorned into
'nix.  Adobe (nee Macromedia) Flash is supported, but only as a
closed-source, somewhat buggy, 32-bit binary.  Then there are things
like mplayer, which rip 32-bit binaries from the 'doze land and hook
them into Linux.  (Things like this are one of the many reasons closed
data formats are even worse than closed code.)

-- Ben


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Ben Scott

On 2/15/07, Bill McGonigle [EMAIL PROTECTED] wrote:

   * are there any gotchas with running 32-bit apps under a linux
that's native to x86-64?


 Source or binary?

 With source, well-written code just needs to be recompiled.  Of
course, we all know that a lot, if not most, code is *NOT*
well-written.  See my other message in this thread WRT moron
programmers for more.

 Binary: My understanding is that a 32-bit binary can be run under a
64-bit kernel, but you need a 32-bit environment to do so.  So any
libraries the binary depends on also need to be built (for x86-32) and
installed in parallel with their x86-64 counterparts.  I could be
wrong on this; I haven't verified the information's authenticity, and
I certainly haven't tried it.
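
As a rough sketch of what that means in practice (the gcc invocations
in the comments are illustrative; exact flags and package names vary by
distribution):

/* multilib.c - the same trivial program built twice.
 *
 *   gcc -m64 multilib.c -o hello64   # links against the x86-64 libc
 *   gcc -m32 multilib.c -o hello32   # needs the 32-bit C library and
 *                                    # friends installed in parallel
 *                                    # (typically /lib vs /lib64)
 *
 * A 64-bit kernel runs either binary, but the -m32 one only starts if
 * its 32-bit ld.so and shared libraries are actually present.
 */
#include <stdio.h>

int main(void)
{
    printf("this build sees %zu-bit pointers\n", 8 * sizeof(void *));
    return 0;
}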

-- Ben


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Shawn K. O'Shea


  Binary: My understanding is that a 32-bit binary can be run under a
64-bit kernel, but you need a 32-bit environment to do so.  So any
libraries the binary depends on also need to be built (for x86-32) and
installed in parallel with their x86-64 counterparts.  I could be
wrong on this; I haven't verified the information's authenticity, and
I certainly haven't tried it.


This is correct. I've recently dealt with this at work. I've also seen
a decent amount of traffic regarding this issue on the CentOS mailing
list.

-Shawn


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Michael ODonnell


 Binary: My understanding is that a 32-bit binary can be run under
 a 64-bit kernel, but you need a 32-bit environment to do so.
 So any libraries the binary depends on also need to be built (for
 x86-32) and installed in parallel with their x86-64 counterparts.
 I could be wrong on this; I haven't verified the information's
 authenticity, and I certainly haven't tried it.

IIRC, a while back there was much gnashing of teeth about how
they hadn't quite dealt w/all the places where a 32bit app
(along w/its various 32bit libs) and the 64bit kernel might
disagree on the layout of various data structures, which
is important since a number of syscalls involve pointers to
structures in the caller's address space.  I assume (hope)
this is old news by now...
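
A small C illustration of how those layouts can disagree (my own
example, not anything taken from the kernel): a field declared long or
as a pointer changes size and alignment between the 32-bit caller and
the 64-bit kernel, so a compat layer has to translate the structure at
the syscall boundary.

#include <stdio.h>

/* A structure of the sort a syscall or ioctl might pass by pointer.
 * Compiled with -m32 it is 12 bytes (4 + 4 + 4); compiled for x86-64
 * it is 24 bytes (8-byte long, 8-byte pointer, 4-byte int plus
 * padding), so a 64-bit kernel cannot read a 32-bit caller's buffer
 * verbatim. */
struct request {
    long  length;
    void *payload;
    int   flags;
};

int main(void)
{
    printf("sizeof(struct request) = %zu\n", sizeof(struct request));
    printf("long=%zu  void*=%zu  int=%zu\n",
           sizeof(long), sizeof(void *), sizeof(int));
    return 0;
}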
 


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Jon 'maddog' Hall

 
   Binary: My understanding is that a 32-bit binary can be run under a
 64-bit kernel, but you need a 32-bit environment to do so.  So any
 libraries the binary depends on also need to be built (for x86-32) and
 installed in parallel with their x86-64 counterparts.  I could be
 wrong on this; I haven't verified the information's authenticity, and
 I certainly haven't tried it.
 
In theory this is true.

Assuming that the instruction set of the 64-bit processor supports the
instruction set of the 32-bit processor, or has a mode it can enter on a
context switch that is completely compatible, this can be done.

You simply make sure that all programs are loaded into a virtual space
that is in the lower 32 bits of virtual address space.  The 32-bit
libraries have to be written to maintain the 32-bit interfaces, and
supply the 64-bit addresses to the kernel, OR the kernel has
modes/interfaces to accept both 32 and 64 bit addresses.

The bigger problem is with currently stored data and current programming
languages.

A very popular programming language in Unix did not define what size an
int was very well.  It kind of wobbled and said that it was the
"natural size" or something like that.  It also did not specify what
endianism the program should use. This was done so the compiler could
make decisions to get good integer performance.  So if I plopped down an
array of ints on the disk, how big are they REALLY?  And what endian
format were they when I read them?  Worse yet, if I plop it down with a
32-bit machine and try to read it with a 64-bit machine, what am I
getting?  Changes to this language in relatively recent years have made
this easier to control, but there is a LOT of legacy code out there that
does not change so well*.  Now there is a lot of legacy data.
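
A short C sketch of the two habits being contrasted (illustrative only;
the file name is made up): dumping a raw int to disk, whose size and
byte order are whatever this compiler and CPU happened to pick, versus
writing a fixed-width value in an explicit byte order that any machine
can read back.

#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>   /* htonl()/ntohl(): host <-> big-endian order */

int main(void)
{
    FILE *f = fopen("counts.dat", "wb");
    if (!f)
        return 1;

    /* Fragile: how many bytes, and in which byte order?  Whatever this
     * particular compiler and CPU decided. */
    int raw = 123456;
    fwrite(&raw, sizeof raw, 1, f);

    /* Portable: exactly 4 bytes, always big-endian on disk. */
    uint32_t fixed = htonl(123456u);
    fwrite(&fixed, sizeof fixed, 1, f);
    fclose(f);

    /* Reading the portable record back works on any machine. */
    f = fopen("counts.dat", "rb");
    if (!f)
        return 1;
    fseek(f, (long)sizeof raw, SEEK_SET);   /* skip the fragile record */
    uint32_t on_disk = 0;
    if (fread(&on_disk, sizeof on_disk, 1, f) == 1)
        printf("read back %u\n", (unsigned)ntohl(on_disk));
    fclose(f);
    return 0;
}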

I actually had a talk with Dennis Ritchie at a conference one time,
telling him about the problem, and suggesting a good solution.  He told
me to tell Steve Johnson, who was maintaining the AT&T compilers then,
as Dennis had moved on.  I told Steve, but nothing ever came of it until
MUCH later when issues forced it.

Now if you use a programming language that defines this very well, or a
database engine to store and retrieve your data, with its paranoid data
dictionary making you define everything about the data (size, endianism,
etc., etc.), or you have network engineers defining your interfaces so
they know what type of system you are coming from, what type you are
going to, and what type of network-neutral data you are going to have,
then everything is peachy.

Unfortunately, as Ben has noted, there are crappy programmers out there
in the world, and tons of legacy, binary-only software.  We no longer
even know what language it was written in, much less have the sources
so we can change it.  Heck, for a good percentage of those programs,
people don't even know what the program does; they just run them.**

In 1989 we were trying to create a version of Ultrix that ran on
MIPS.  Most UNIX systems (AIX, HP/UX, SunOS/Solaris) were big-endian.

Ultrix was little-endian.  So was SCO, but who cared about them, because
they were running on that slow, crappy Intel processor.

But we were trying to become compatible with the rest of the industry,
so we created a kernel that could run either big-endian or little-endian
applications at the same time.  Just a matter of engineering and code.

But most of our customers had tons of little-endian data.  And we
realized that if we started down this path we would have customers
buying big-endian applications to treat little-endian data.

So we never released the functionality.

maddog

*Do not even get me started on Unions in C, Equivalence in
Fortran, and Linkage Sections in Cobol.  I get really ugly fast.

**If you do not believe this statement, I can tell you HOURS of
examples.  Maybe I will do that tonight at Martha's...but it always
makes me so depressed.

***There are no statements above that relate to this footnote, but this
is simply a warning not to even SAY THE WORD Itanium to me.



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Kevin D. Clark

Ben Scott writes:

   Then there's the fact that every moron programmer in the world (and
 there are legions of moron programmers) assume integers and pointers
 are 32-bits, and their code breaks horribly if recompiled for a 64-bit
 architecture.  So even if you have source, it's not just a matter of
 recompiling, in most cases.

The reason we find ourselves in this mess is that we treat
programming as a task (or, some would even say, an art) instead of
what it actually is:  engineering.

Code that causes the compiler to emit warnings, doesn't contain
reasonable asserts, contains lots of casts and cute union tricks,
was designed with nary a thought about endian/sizeof issues, and is
uncommented reminds me of carpentry jobs in which I've seen windows
installed without flashing, doors hung crooked, and joists sawed
through haphazardly.
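
For concreteness, a tiny C example of the sort of cute union trick being
described (my illustration, not Kevin's): punning an integer through a
byte array gives a different answer on little- and big-endian hosts, and
the code never says which one it expects.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Type-punning through a union: the result depends entirely on
     * the host's byte order, which this code silently ignores. */
    union {
        uint32_t word;
        uint8_t  bytes[4];
    } u = { .word = 0x11223344u };

    printf("first byte in memory: 0x%02x (%s-endian host)\n",
           u.bytes[0],
           u.bytes[0] == 0x44 ? "little" : "big");
    return 0;
}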

Would you pay a carpenter for such work?  Nope.  Do programmers
produce such work?  Frequently, yes.  What is this attributable to?
Market forces, time-to-market pressure, ignorance, apathy, etc.



I was pretty happy the day that somebody took a network protocol stack
that I had written and compiled it on a 64-bit system I did not have
access to, whereupon it built without warnings and WORKED
CORRECTLY.

Regards,

--kevin
-- 
GnuPG ID: B280F24E  Never could stand that dog.
alumni.unh.edu!kdc   -- Tom Waits



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-15 Thread Ben Scott

On 2/15/07, Kevin D. Clark [EMAIL PROTECTED] wrote:

The reason why we find ourselves in this mess is because we treat
programming as a task (or, some would even say, an art) instead of
what it actually is:  engineering.


 And the choir will now sing back the chorus...  ;-)

-- Ben


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-14 Thread Jon 'maddog' Hall
On Wed, 2007-02-14 at 22:22 -0500, Paul Lussier wrote:
 Ben Scott [EMAIL PROTECTED] writes:
 
I'm still running 32-bit everywhere, but from what I've read, it
  seems like the recommended solution is to use a 32-bit environment for
  web browsing.  It's generally possible to do this in an otherwise
  64-bit environment.  The details and difficulty vary by distribution
  and release.
 
I assume that, even with Flash memory leaks, a 64-bit address space
  isn't needed for web browsing.  :)
 
 I find it mind-boggling that the Alpha came out what, 16-18 years ago
 with 64 bit technology and it *still* hasn't caught on in the
 mainstream.  Why is that?  Is there really that little market demand
 for 64 bits?  Sure, you only need 32 bits (or less) to run Word, but
 imagine how many more bugs MS could sell if they had double the amount
 of address-space!
 
I think there is a strong analogy between why we do not have 64 bits
everyplace, and why we still suffer with IPv4 instead of IPv6.

md



Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-14 Thread Tom Buskey

On 2/14/07, Paul Lussier [EMAIL PROTECTED] wrote:


Ben Scott [EMAIL PROTECTED] writes:

   I'm still running 32-bit everywhere, but from what I've read, it
 seems like the recommended solution is to use a 32-bit environment for
 web browsing.  It's generally possible to do this in an otherwise
 64-bit environment.  The details and difficulty vary by distribution
 and release.

   I assume that, even with Flash memory leaks, a 64-bit address space
 isn't needed for web browsing.  :)

I find it mind-boggling that the Alpha came out what, 16-18 years ago
with 64 bit technology and it *still* hasn't caught on in the
mainstream.  Why is that?  Is there really that little market demand
for 64 bits?  Sure, you only need 32 bits (or less) to run Word, but
imagine how many more bugs MS could sell if they had double the amount
of address-space!




32 bits is good enough.  And 90% of all systems run 32 bits.

Some of my users spec'd and ordered a PC.  They thought they needed 64 bits,
so they got 64-bit XP.  None of the custom PCI-X cards had drivers, of
course.  And standard XP worked just fine.  Engineering by reading the
marketing materials :-(  Wish they had asked me first.

I've been running 64-bit Solaris on SPARC for a number of years and have
never had an issue with 64/32-bit programs.  Everything just seems to
work.  I'm not sure how well 64-bit x86 works yet.


Re: Why are we still not at 64 bits [was Can't figure out Firefox Plugin Requirement]

2007-02-14 Thread Bill McGonigle

On Feb 14, 2007, at 22:22, Paul Lussier wrote:


I find it mind-boggling that the Alpha came out what, 16-18 years ago
with 64 bit technology and it *still* hasn't caught on in the
mainstream.  Why is that?


I remember porting povray to the Alpha back when they still called it
OSF/1, and it was a minor bitch to get done and didn't really provide
any benefit (other than it would compile).  I did learn how to bring
the entire CS cluster to a crawl, though, with my hacked-up
distributed computing system :) (Wayne, thanks for showing me 'nice').


x86-64 actually has one additional benefit - in 64-bit mode you have
enough registers on the chip that the compiler can behave in a
civilized manner and you don't have to keep storing addresses in main
memory (or, more likely, its cache).  So typical program execution
gets faster on this one particular architecture - none of the others
have this problem in 32-bit mode.


Things I don't know:
  * are there outstanding driver issues on x86-64 with linux?
  * are there any gotchas with running 32-bit apps under a linux  
that's native to x86-64?


-Bill
-
Bill McGonigle, Owner   Work: 603.448.4440
BFC Computing, LLC  Home: 603.448.1668
[EMAIL PROTECTED]   Cell: 603.252.2606
http://www.bfccomputing.com/Page: 603.442.1833
Blog: http://blog.bfccomputing.com/
VCard: http://bfccomputing.com/vcard/bill.vcf
