Linux-Advocacy Digest #681, Volume #25           Sat, 18 Mar 00 03:13:06 EST

Contents:
  Re: Windows is a sickness.  Unix is the cure. (Terry Murphy)
  Re: A Linux server atop Mach? ("Chuck Swiger")
  Re: Windows 2000: nothing worse (Donn Miller)
  Re: Windows is a sickness.  Unix is the cure. ([EMAIL PROTECTED])
  Re: Iridium Tech Support (Was Re: . . . Itanium . .. (Bryant Brandon)
  Re: Bsd and Linux (David Steuber)
  Re: Bsd and Linux (David Steuber)
  Re: Bsd and Linux (Donovan Rebbechi)
  Re: An Illuminating Anecdote (Bob Hauck)

----------------------------------------------------------------------------

From: [EMAIL PROTECTED] (Terry Murphy)
Subject: Re: Windows is a sickness.  Unix is the cure.
Date: Sat, 18 Mar 2000 05:25:49 GMT

On 17 Mar 2000 20:50:41 GMT, mr_organic <[EMAIL PROTECTED]> wrote:
>I posted a rant a few days ago about the cluelessness displayed by many
>Windows developers, and the issue has been percolating in my mind ever
>since.  Is it *possible* for a Windows-only programmer to truly
>embrace the hackish spirit?  What does the term "hacker" mean, anyhow?

A great many professional programmers, myself included, would be
greatly offended to be called hackers. Hackers are people who write
elegant, artistic, and ... wrong ... code. They are more interested in
the quick fix and the aesthetically appealing code than in the correct
code.

>It should be clear that hackers, first and foremost, know their own
>history.  They have a sense of people who came before them and who
>helped create a culture with its own customs, language, and ceremonies.

Among professional engineers, there is no culture. The culture you see
on-line in places like the newsgroups, and on websites like Slashdot, is
phony: people trying to fit in, people trying to out-dweeb each other
by substituting as many Unix commands as possible for English verbs.
This does not exist in real life, and productive engineers want no part
of it.

>Most Windows coders I know aren't even *aware* of the Jargon File;
>they have no idea such a thing exists.  Few even know the names of
>Richard Stallman or Eric Raymond; fewer still the names of Bill Joy,
>Marshall Kirk McKusick, or other pioneers of the field.  

Calling people like Stallman or Raymond pioneers among hackers is
dubious, but calling them heroes of computer development in general
is just plain offensive. Hailing Raymond as having attained any kind of
technical achievement is just plain wrong.

>They know who Bill Gates is, but not Gary Kildall, who might have won
>that long-ago IBM contract for the PC operating system had things turned
>out a bit differently.

This is business/industry history, not technical history. Although
most engineers are interested in industry history, it is most certainly
not a prerequisite for being a highly competent engineer.

>A rare few Windows programmers (usually the hardcore driver-writers
>and system-programmers) read Petzold's mammoth "Programming Windows"
>book, but almost none have dipped into The Lion Book, the Demon Book,
>or the Dragon Book.  (Or even know what those books are, or where they
>can be found.)

The very fact that people refer to these books by something other than
their titles suggests that the whole culture is an elitist, "I am smarter
than you", show-offy culture, rather than one which is simply interested
in getting the most work done.

>But many if not most of these folks do not learn the most rudimentary
>aspects of software or system design; they have no skills at debugging
>complex systems; and they are trained to use "packaged" solutions rather
>than figure out things for themselves.

Interesting.

Microsoft Office is certainly "a complex system".

Since it was written by Windows programmers, it was written by people
who had no skill at debugging it.

So it worked the first time it compiled?

According to that logic, Windows programmers are even more competent 
than I ever imagined!

>Few of them know how to write common algorithms or solve common
>problems; ask an average VC++ coder to whip up a custom quicksort
>algorithm or doubly-linked list, and all you're likely to get in
>return is a blank stare.  

If you actually know people who do not even know how to code quicksort (!), 
let me humbly suggest that you are working at a place with really, really
incompetent people, and you really need to find a better job. Most of the 
people I work with are pretty bright, and we use about 50% Windows, 
50% Unix. Anybody I work with would laugh in your face if challenged to
write quicksort. Here's a clue: the competence of Windows programmers in
general does not revolve around the particular people you know and work with.
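
For the record, this is roughly all that is being asked for -- a minimal
quicksort sketch in C (my own illustration, nothing more):

/* quicksort.c - a minimal in-place quicksort, for illustration only. */
#include <stdio.h>

static void swap(int *a, int *b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

/* Sort v[lo..hi] inclusive, using the last element as the pivot. */
static void quicksort(int v[], int lo, int hi)
{
    int pivot, i, j;

    if (lo >= hi)
        return;

    pivot = v[hi];
    i = lo;
    for (j = lo; j < hi; j++)
        if (v[j] < pivot)
            swap(&v[i++], &v[j]);
    swap(&v[i], &v[hi]);        /* pivot lands in its final position */

    quicksort(v, lo, i - 1);
    quicksort(v, i + 1, hi);
}

int main(void)
{
    int v[] = { 9, 3, 7, 1, 4, 8 };
    int n = sizeof v / sizeof v[0];
    int i;

    quicksort(v, 0, n - 1);
    for (i = 0; i < n; i++)
        printf("%d ", v[i]);
    printf("\n");
    return 0;
}

Twenty minutes with pencil and paper, at most.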

>These programmers harm the entire trade because they give us a bad
>name -- they produce shitty, unstable code and have no real ability to
>do otherwise.

I think Unix does even more to give us a bad name. Unix continues to support
the fallacious belief that it is OK if applications blow up in your
face every five minutes, as long as the precious kernel never goes down.

>Now, this kind of thing happens on Unix, too (probably more often than
>it should!).  But as old Unix hackers have known for a long time,
>peer-review is one of the best ways to get good, solid code.  It
>promotes correct design and good coding practices.  And over time it
>leads to best-of-breed software -- Apache, Sendmail, Emacs, gcc, mutt,
>etc.  Bad code happens but it dies out quickly; the evolutionary
>environment of Open Source assures that only the best-adapted
>survives.

Obviously you have not read "The Rise of 'Worse is Better'" by Richard
P. Gabriel. Add it to your little library, and it will tell you the
true reason why Unix is evolutionarily superior.

>It's no accident that version-churn on Windows is continuous.  Windows
>software is feature-driven; stability and security is of secondary
>(and often tertiary) concern.  New "features" are integrated without
>much thought as to their overall impact on the system; often these
>features are included even when the vendor *knows* they will cause
>problems.  Active Directory is one such "feature" -- Microsoft
>assures us that it works fine...as long as you have a Windows-only
>network.  Introduce Novell or Unix servers into the mix, or mix in NDS
>and Unix-based DNS/BIND implementations, and you're asking for bad
>trouble.  Microsoft never admitted that the old domain-based
>administration model was broken, either; they insisted it was fine
>right up until they replaced it with Active Directory.  *Then* they
>admitted that it might have been a little broken.

All of this has to do with development abilities how?

>I blame a lot of this on the whole mindset of Windows programmers.
>They are never taught precepts that are second-nature to most Unix
>programmers -- that stability and "correctness" are not features, but
>core assumptions from which all else must flow.  Windows coders love
>GUI screens, and love messing around with COM/DCOM, but have no real
>idea how most of this stuff works at a lower level.  I've seen Visual
>C++ programmers who don't really know C++ at all -- they've never used
>anything but an IDE, so they have no idea how to use the preprocessor,
>tweak the compiler/linker, or just ditch the IDE altogether and invoke
>the compiler from the commandline.  ("Not the CLI!" they shriek, and
>hide their eyes.)

Although most good programmers will, incidentally, have knowledge of
the command line, that stuff is considered "implementation detail"
and is certainly not a prerequisite for good code. This is stuff for
technicians to do, and is not part of engineering.

>The attitude fostered by Microsoft -- that programming can be "easy"
>and "intuitive" -- has nurtured an entire generation of programmers
>who are sloppy, careless, and short-sighted.  Part of this is due to
>the tools they use, but part is also their lack of training in *real*
>programming.  

I couldn't even count how many Unix security exploits were caused
by buffer overruns - perhaps the best example of "careless code".
Why wouldn't you check to make sure you have enough memory to write
to? Apparently most Unix programmers do not consider it worthwhile.
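
To make the point concrete, the class of bug in question is the stock
pattern below (a made-up illustration, not any particular exploit):

/* overflow.c - the careless pattern versus a bounded copy. */
#include <stdio.h>
#include <string.h>

static void careless(const char *input)
{
    char buf[16];

    strcpy(buf, input);     /* no length check: writes past buf if
                               input is 16 bytes or longer */
    printf("%s\n", buf);
}

static void careful(const char *input)
{
    char buf[16];

    strncpy(buf, input, sizeof buf - 1);    /* bounded copy ...      */
    buf[sizeof buf - 1] = '\0';             /* ... always terminated */
    printf("%s\n", buf);
}

int main(void)
{
    careful("a string much longer than sixteen bytes");
    /* careless() with the same argument would smash the stack */
    return 0;
}

The bounded version costs one extra line. Most of those exploits come
down to somebody not bothering to write it.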

>They are not taught to design first and code later; they are not taught 
>to code around data structures and not the other way around; they are not 
>taught to debug.  

Of all of your points in the post, this is the funniest. Obviously things
like Office and IE were extensively designed long before a text editor
was ever fired up, though some of the software by the third-rate Windows
vendors may fall into this category.

Please add this book to your little library: _The Unix Philosophy_ by 
Mike Gancarz. I quote from page 27 (Chapter 3, "Rapid Prototyping for
Fun and Profit"):

  "Tenet 3: Build a prototype as soon as possible"

  "When we say 'as soon as possible', we mean AS SOON AS POSSIBLE. Post
   haste. Spend a small amount of time planning the application, and then
   GET TO IT. Write code as if your life depended on it. Strike while the
   terminal is warm. You haven't a nanosecond to waste!"

  "This idea runs counter to what many would consider 'proper engineering
   methodology'. Most have been told that you should fully design an
   application before embarking on the coding phase. 'You must write a 
   functional specification', they say. 'Hold regular reviews to ensure
   that you're on track. Write a design specification to clarify your 
   thoughts on the details. 90 percent of the design should be done 
   before you ever fire up the compiler".

ALL Unix software, except for the high-end CAD tools and such, follows
this methodology.

In fact, in "The Cathedral and the Bazaar", Raymond actually ridicules
people who want to do design. In the Linux community, the mantra is
"Show me the code". Anybody who writes a design document is ridiculed.

>Most of the really good hackers I know (I don't consider myself one
>yet, but I'm working on it) learned their craft on Unix.  

You must work with people really new to computers. I have never met
anybody below age 35 or 40 who learned programming on Unix. _Most_
people above that age learned on minis in college, and most people
younger than that learned on home computers in high school or earlier
(if you consider "hacking" to be "learning their craft"). The only people
who learned their craft on Unix were a few people, for a couple of years
in the early-to-mid '80s, who entered college before home computers became
prevalent but after Unix became prevalent at universities.

>To hack is to attempt to understand the inner workings of a thing and bend
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>it to your will; it is to move beyond eye-candy and focus on the engine.

Really? Why do so few Unix programmers know assembly then?

>Good hackers usually have experience on a multitude of architectures --
>from mainframes down to handhelds.  They may not be experts in all of
>them, but they are conversant.

Very few Unix programmers I know have worked on mainframes. Most of the
younger ones (entered college after 1990 or so) have never even used
minis extensively.

>In the Open Source world, "good enough" often isn't.  

You _really_ owe it to yourself you read the Gabriel article I cited
back.

>The enormous popularity of Linux in recent years has given rise to a
>floodtide of mediocre (and often outright *bad*) code, but it is to the
>community's credit that this software disappears in short order.

Really? Then why is xuath still in popular use?

>Debugging is at least as important a talent as knowing how to code,
>but among Windows programmers this is almost a lost art.

I know a _huge_ number of Unix programmers who debug with printf's.
I'm not kidding. I'm not convinced that many Unix programmers are
capable of sophisticated debugging; GDB is such an inadequate debugger
(absolutely horrendous support for threads, and a complete inability
to single-step through instructions usefully) that it is hard to be.
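
And the printf style I am talking about usually amounts to something
like this (a generic sketch of the idiom, not anyone's actual code):

/* trace.c - the printf-debugging idiom, compiled in or out with -DDEBUG. */
#include <stdio.h>

#ifdef DEBUG
#define TRACE(msg, val) \
    fprintf(stderr, "%s:%d: %s = %d\n", __FILE__, __LINE__, (msg), (val))
#else
#define TRACE(msg, val) ((void) 0)
#endif

int main(void)
{
    int i, total = 0;

    for (i = 1; i <= 5; i++) {
        total += i;
        TRACE("running total", total);  /* printed only in -DDEBUG builds */
    }
    printf("%d\n", total);
    return 0;
}

Build with -DDEBUG and the traces appear; build without and they cost
nothing. That is the whole toolchain for a lot of these people.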


------------------------------

From: "Chuck Swiger" <[EMAIL PROTECTED]>
Subject: Re: A Linux server atop Mach?
Crossposted-To: comp.sys.next.advocacy
Date: Sat, 18 Mar 2000 05:31:46 GMT

In comp.sys.next.advocacy Charles W. Swiger <[EMAIL PROTECTED]> wrote:
> I'm a sysadmin who used to be a developer.  I find still value in the
                                                ^^^^^^^^^^

That should be: "I still find value...."  *ahem*

-Chuck "q-spoonerism" Swiger

       Chuck 'Sisyphus' Swiger | [EMAIL PROTECTED] | Bad cop!  No Donut.
       ------------------------+-------------------+--------------------
       I know that you are an optimist if you think I am a pessimist.... 

------------------------------

Date: Sat, 18 Mar 2000 01:21:47 -0500
From: Donn Miller <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy
Subject: Re: Windows 2000: nothing worse

Erik Funkenbusch wrote:

> When you pay an admin $100,000 a year, $319 is not very much.

That IS a pretty good point.  It also depends on who your users are -
there are a lot of people who are already well versed in Windows 9x and
NT.  A lot of normal users would struggle with something like Linux,
because 1) the Windows UI is already familiar to them and 2) they are
already familiar with Office.

So, it's better in that case to have any version of Windows
installed.  Otherwise, you'll end up spending extra money to help
those users learn Unix, KDE, or something else.  You're better off
just spending the money on Windows 2000, because the cost per capita is
lower than that of training certain people to use Unix.  (There are a lot
of non-techie people who don't really care much for computers. 
Those are the users I'm talking about: your secretaries, people in
marketing who use a computer, etc.)

But for the rest of us, Linux or FreeBSD is the way to go.  Linux or
FreeBSD is better for those people who love to tweak their machines
for optimum performance.  For example, on FreeBSD, you can do a make
world with a particular optimization level that maximizes
performance.  Since you don't have the source code to Windows, you
can't build all your main apps with various optimization levels.

I usually build my kernel and "world" with -march=pentium -Os -pipe. 
For everything else, I use -march=pentium -O3 -pipe.  So, Windows 2000
has ease of use, familiarity, lots of games and apps, and most of all
it's stable.  But Linux and FreeBSD have a much less restrictive
license, and they're really easy to tweak for the kind of performance you
need.  It's possible to tweak Windows 2000 for performance, but it's
not as easy or flexible as on a Unix system.
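
For anyone curious, on FreeBSD those flags typically live in
/etc/make.conf, roughly like this (a sketch from memory; the exact knobs
vary by release, so treat the variable names as assumptions):

# /etc/make.conf (sketch)
CFLAGS=    -march=pentium -Os -pipe    # userland, picked up by "make world"
COPTFLAGS= -march=pentium -Os -pipe    # kernel builds

Set them once and every subsequent world and kernel build inherits them.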

- Donn

------------------------------

From: [EMAIL PROTECTED]
Subject: Re: Windows is a sickness.  Unix is the cure.
Date: 17 Mar 2000 21:58:46 -0800

In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] says...
 
>
>I know a _huge_ number of Unix programmers who debug with printf's.
>I'm not kidding. I'm not convinced that many Unix programmers are
>capable of sophisticated debugging; GDB is such an inadequate debugger
>(absolutely horrendous support for threads, and a complete inability
>to single-step through instructions usefully) that it is hard to be.
>

Really good programmers hardly ever need a debugger.

A good tracing tool and logging subsystem is much more valuable. And
the best debugging tools are code review and good design.

One sign of less experienced programmers is the large amount of debugging
they do. Notice how good programmers seem to do little debugging.

One good use for a debugger is to learn how a piece of software works by
stepping through it. This is one way a programmer coming to a new project
can learn how the program works from the inside. However, good design
documents and a good logging system remain the best tools for debugging
and for learning how a complex system works.
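
At its simplest, the kind of logging subsystem I mean looks something
like this (a made-up sketch, with invented names):

/* log.c - a tiny leveled logger; names and format are invented, purely
 * to illustrate what a "logging subsystem" means at its simplest. */
#include <stdio.h>
#include <time.h>

enum log_level { LOG_DEBUG, LOG_INFO, LOG_ERROR };

static enum log_level current_level = LOG_INFO;

static void log_msg(enum log_level level, const char *msg)
{
    static const char *names[] = { "DEBUG", "INFO", "ERROR" };
    time_t now;

    if (level < current_level)
        return;                         /* filtered out at run time */

    time(&now);
    fprintf(stderr, "%.24s [%s] %s\n", ctime(&now), names[level], msg);
}

int main(void)
{
    log_msg(LOG_DEBUG, "this one is filtered out");
    log_msg(LOG_INFO,  "server started");
    log_msg(LOG_ERROR, "could not open config file");
    return 0;
}

Unlike a debugging session, the log is still there the next morning, and
it works on the customer's machine too.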

Also, the choice of computer language affects how much time 
one spends debugging. 

Languages that are statically and strongly typed eliminate many of the
bugs that would only show up at run time in less strongly typed languages 
such as C or Perl, and to a lesser extent C++. (I.e., letting the compiler 
catch as many of your bugs as possible is the smartest thing you can do.)
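
A trivial C illustration of the principle: give the compiler the type
information and the mistake is caught when you compile, not when you run
(again, a made-up example):

/* typecheck.c - let the compiler catch the bug at compile time. */
#include <stdio.h>
#include <string.h>

static size_t name_length(const char *name)
{
    return strlen(name);
}

int main(void)
{
    printf("%lu\n", (unsigned long) name_length("bill"));

    /* Uncommenting the next line draws a compile-time diagnostic
     * (integer passed where a pointer is expected) rather than a
     * mysterious crash later:
     *
     *     printf("%lu\n", (unsigned long) name_length(42));
     */
    return 0;
}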

bill


------------------------------

From: Bryant Brandon <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy,comp.sys.mac.advocacy,comp.unix.advocacy
Subject: Re: Iridium Tech Support (Was Re: . . . Itanium . ..
Date: Sat, 18 Mar 2000 00:59:23 -0600

In article <[EMAIL PROTECTED]>, evilsofa 
<[EMAIL PROTECTED]> wrote:

@In article <[EMAIL PROTECTED]>, Chad 
@Irby <[EMAIL PROTECTED]> wrote:
@
@> [EMAIL PROTECTED] (Bill Vermillion) wrote:
@> 
@> > It looks like it might be turned off this week anyway.
@> 
@> The amateur astronomers are just *waiting* for someone to de-orbit the 
@> Iridium satellites...
@
@They won't be de-orbiting them over populated areas.  That reminds me of 
@SkyLab, by the way, which after a mind-boggling amount of hysterical 
@hysteria, ended up squashing a jackrabbit somewhere in backwoods 
@Australia.

   Yeah, but it was an Aussie jackrabbit.  So Skylab just pissed it off.  
It went on to slaughter 8000 Aborigines and some tourists before going 
to England and killing some Brits in olde armour.  The issue was finally 
brought to rest with the Holy Hand Grenade.  Imagine a swarm of those 
beasts.
   On a related note: Anybody here seen the 80's remake of The Blob?

-- 
B.B.        --I am not a goat!           http://web2.airmail.net/dbrandon

------------------------------

Crossposted-To: comp.os.linux.x,comp.os.linux.development.apps
Subject: Re: Bsd and Linux
From: David Steuber <[EMAIL PROTECTED]>
Date: Sat, 18 Mar 2000 07:00:00 GMT

[EMAIL PROTECTED] (Martin Kahlert) writes:

' I don't think so: the main difference between Borland and M$ is, that
' M$ sells an OS competing with Linux (actually it's a bit the other
' way round ;-)). Their main chance to hinder Linux is to suppress the
' porting of as many apps to Linux as possible.
' I assume M$ earns a lot more money selling Windows
' than selling compilers!

In a way, this is actually happening.  Look at the codec model that is 
used by the media player, Net Show, Net Meeting, etc.  Also there are
the security APIs, DVD, drivers in general, etc.

Microsoft is indeed trying to de-commoditize protocols, as suggested in
the Halloween documents.  You can be sure that they will use the DMCA
and any other tool at their disposal to force everyone to use Windows.

At the moment, they have a lock on the corporate desktop because of
the Office suite.  They have a lock on the games market because of
DirectX APIs and drivers for video and audio hardware that are written 
to be used by DirectX.  Then there is all the COM-based stuff (which
includes DirectX).  Everything in the Windows OS is tightly
integrated.  There is nothing to really differentiate an application
from the OS.  Just because it runs in user space (like IE) doesn't
mean it won't be used as a standard service by a whole bunch of
applications.

People do find this level of integration useful.  Hell, I find it
useful.  But I can see problems with it, as the side effects of
changing one application ripple throughout the entire system.

I'm not sure how things will be in five years.  I am hoping that Linux 
and free software in general will be the standard.  However, Microsoft 
has gotten a lot of people on their bandwagon.  This includes the
people who actually make IT decisions.  Also, people who write
software for money will target the biggest platform.  People will buy
the platform with the most software.  It is a self-feeding cycle.

There is a lot of momentum to overcome.  There is also the problem
that Linux distributions are not as idiot-friendly as Windows is.
This leads to disappointment among the people who try to switch based
on the Linux hype and find they are not able to get it working the way 
they like.

-- 
David Steuber   |   Hi!  My name is David Steuber, and I am
NRA Member      |   a hoploholic.

http://www.packetphone.org/

"There is hopeful symbolism in the fact that flags do not wave in a
vacuum."
                -- Arthur C. Clarke

------------------------------

Crossposted-To: comp.os.linux.x,comp.os.linux.development.apps
Subject: Re: Bsd and Linux
From: David Steuber <[EMAIL PROTECTED]>
Date: Sat, 18 Mar 2000 07:00:00 GMT

[EMAIL PROTECTED] (Victor Wagner) writes:

' BSD vs Linux is not a licensing issue, it is an architecture and development
' model issue. BSD is basically a cathedral, while Linux is a bazaar.

Nothing becomes part of the Linux kernel without the blessing of Linus 
Torvalds.  Nothing.

Then you have the PCMCIA code by David Hinds that isn't part of the
Linux kernel because David Hinds doesn't want it to be.

At least, this is how I understand the development situation.

-- 
David Steuber   |   Hi!  My name is David Steuber, and I am
NRA Member      |   a hoploholic.

http://www.packetphone.org/

Some men are alive simply because it is against the law to kill them.
                -- Ed Howe

------------------------------

From: [EMAIL PROTECTED] (Donovan Rebbechi)
Crossposted-To: comp.os.linux.x,comp.os.linux.development.apps
Subject: Re: Bsd and Linux
Date: 18 Mar 2000 07:04:50 GMT

On 17 Mar 2000 07:37:09 +0300, Victor Wagner wrote:

>Changing only the kernel doesn't free you from the GNU license. Especially,

This was my point exactly.

>BSD vs Linux is not a licensing issue, it is an architecture and development
>model issue. BSD is basically a cathedral, while Linux is a bazaar.

I'd agree. This is indeed the main difference.

>Too many new features which can break things, 

Possibly. There would appear to be a trade-off between speed and features
on the one hand and robust code on the other. The Linux code is pretty
stable, but OpenBSD at least kills it in the security stakes, because there
the focus is on refining the code they've already got rather than churning
out more.

> too many low-quality programs,

Well you know, the low-quality programs work (or don't work) just as well
on the BSDs. In fact I bet they're in the ports collections, at least on
Free and Net (OpenBSD ports seems a bit behind in 2.5; I hope 2.6 has 
improved things ...)

It's also worth mentioning that some great projects (e.g. Qt and KDE) 
originated on Linux.

>too many security holes. 

I'd say *this* is the main thing BSD (at least OpenBSD) has over Linux. 

> I know people who say (and have arguments to
>back this opinion up) that properly administered NT 4.0 is more stable
>than modern Linuxes, like RedHat 6.1.

Sure, but what if you properly administer Linux as well?

I mean, if you're going to run all the GUI stuff and Netscape on your server
with the root account, I could see a lock-up happening (due to the machine
getting memory-starved), but it's pretty solid if you just leave it in
console mode.

>I don't like BSD with its BSDish rc scripts and its scheduler, which
>seems to be unfriendly to X users. 

Ditto. I prefer sysvinit. Then again, if you're not running too 
many standalone services, the BSD init is OK.

>hardware is not the cheapest thing in the world). But the total cost of
>ownership of an OpenBSD server (if it is able to do everything you want
>from it) is less than that of a Linux server. 

This was why I moved my server to OpenBSD. Just stick it in a room, run the 
services, and forget about having to look after it except to read the logs 
and laugh at the futility of the crackers' attempts to get in.

Still, you won't catch me with OpenBSD on my desktop any time soon.

-- 
Donovan

------------------------------

From: [EMAIL PROTECTED] (Bob Hauck)
Subject: Re: An Illuminating Anecdote
Date: 18 Mar 2000 07:09:52 GMT
Reply-To: bobh{at}haucks{dot}org

On 17 Mar 2000 17:40:17 -0800, Terry Murphy <[EMAIL PROTECTED]>
wrote:

>In article <[EMAIL PROTECTED]>,
>Bob Hauck <bobh{at}slc{dot}codem{dot}com> wrote:

>>On Fri, 17 Mar 2000 06:04:56 GMT, Terry Murphy <[EMAIL PROTECTED]> wrote:

>>>I have met EXTREMELY ignorant Unix programmers also. I have even met
>>>one or two Unix programmers who do not even know assembly language on
>>>the machine which they work on (I swear to god I am not kidding).

>>Maybe that would be because hardly anybody codes in asm on Unix machines.

>You do not need to write assembly code in order to need to understand it.
>I look at assembly output from GCC very often, for example, to make sure
>it is doing what I want it to do. It is also useful for debugging.

I rarely look at generated asm code when writing hosted apps.  I haven't
really felt the need to understand every single instruction that was
generated when I'm coding some UI stuff for a 500-MIP box.  I think this
is a typical experience for many programmers.  IOW, I don't feel that
programmers who don't know the assembler for their platform are
necessarily EXTREMELY ignorant, as you put it, nor do I find ignorance of
assembler very surprising, depending on the type of programming they do.

Embedded systems are another story.  Groveling through the assembler is
part of the routine.  Which is one reason I like doing embedded work <g>.


>The rationale for knowledge of assembly language is to understand the
>entire tool chain.

Frankly, I think programmers who have to deal with things like COM have
bigger things to worry about than spelunking through generated assembly
code.  That would be a last resort, when a compiler bug is the only
explanation left for the behavior (there are so many bugs in MS APIs that
you always check those before suspecting the compiler).


>>I think you misspoke.  You meant to say "Windows desktop productivity
>>software".

>Windows database servers do shine on TPC-C, for what it's worth.

Which is mainly because Intel hardware is really cheap per MIP.  TPC
measures cost per transaction.


>>gcc generates correct code more often 
>
>I do not have extensive experience with MSVC, but I do have extensive
>experience with GCC (I use it on a daily basis, and do almost all of 
>my work in it), and there were several instances where it has 
>produced incorrect code, or was otherwise buggy.

The grass is always greener, eh?

I use gcc every day as well.  Yes, it has bugs (particularly the 68hc11
port, which of course is not supported by the FSF).  Yes, it is not as up to
date with the C++ standard as it could be (although that is changing).  But
the msvc optimizer has a terrible reputation for generating incorrect code
while gcc has a pretty good rep in that department.


>>msvc does and gcc is portable while msvc is not
>
>I know for a fact that MSVC was available for the Alpha platform, and 
>was likely also available for the MIPS and PPC when their NT ports
>were supported. 

Which ports are no longer supported of course, but I'll give you
the point.


>Anyways, if a portable compiler is inherently not as fast as a 
>non-portable compiler, that's more of an argument to abandon the
>concept of a portable assembler altogether...  

I disagree.  For some users it can be of great value to have a compiler
that can run on multiple platforms, and of even greater value to have a
compiler that can generate code for multiple targets.  Performance is not
the be-all for everyone.  There are lots of other reasons to pick a
particular compiler beyond how fast the code it generates is.  How good
its error messages are, licensing, support for the various targets you
want, standards compliance, stability, etc., all come into play when
deciding which compiler is "better".


>>Then there's the fact that one is developed by a giant corporation with
>>virtually unlimited resources, the other largely by volunteers who mostly
>>work on it part time.
>
>Cygwin wrote some parts of GCC, and is made up of anything but
>volunteers. 

Yes, but they don't do $20 billion a year at 93% gross margin, now do
they?  I would hope that msvc was at least competitive with gcc given the
resources available to Microsoft.


>$ gcc -c file1.c file2.c file3.c file4.c
>
>And there is not enough memory to compile file2.c, then operation would
>be terminated at that point (and only file1.c would be compiled), but
>in the ideal implementation file[134].c would all be compiled.

So what?  You still can't link it.  In spite of that, I'll go along with
the idea that it might be a good idea to try to continue with the other
files, but I certainly wouldn't spend a lot of time trying to make it so
if I were coding gcc.  There are bigger fish to fry.  If this is a big
issue, do:

for i in file1.c file2.c file3.c file4.c
do
        gcc -c $i
done


>This is not a showstopper, and I consider it to be very low priority.

Exactly.  So it was a bad example.


>>>Even the X Windows server does not properly handle failed allocations,

>>Again, what _should_ it do?  Kill a couple of apps at random?  

>Since it is a server, it should fail the request which caused it
>to exceed memory allocation. It is _extremely_ simple to handle this
>in the server. It is more difficult to handle this in the clients.

See my response to Erik Funkenbusch.  I don't think it is as easy as you
think it is.  X is a network protocol, after all.
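
To be fair, the *idea* of failing the request is easy enough to show in a
toy server (a hypothetical sketch of mine, not actual X server code); it's
the real server's allocation paths, spread across a network protocol,
where it gets hard:

/* fail_request.c - a toy request handler that refuses a request it cannot
 * satisfy instead of dying.  Purely illustrative; a real X server's
 * allocation paths are far more tangled than this. */
#include <stdio.h>
#include <stdlib.h>

struct request {
    size_t bytes_needed;        /* how much memory this request wants */
};

/* Return 0 on success, -1 if the request had to be refused. */
static int handle_request(const struct request *req)
{
    void *buf = malloc(req->bytes_needed);

    if (buf == NULL) {
        /* Fail only this request; the server itself keeps running. */
        fprintf(stderr, "request for %lu bytes refused: out of memory\n",
                (unsigned long) req->bytes_needed);
        return -1;
    }

    /* ... do the actual work here ... */
    free(buf);
    return 0;
}

int main(void)
{
    struct request small = { 1024 };
    struct request huge;

    huge.bytes_needed = (size_t) -1;    /* guaranteed to fail */

    handle_request(&small);
    handle_request(&huge);
    return 0;
}

The hard part is not the malloc check; it is deciding what the client on
the other end of the wire should see, and unwinding everything the
half-finished request already touched.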


>As I said before, I have in the past received "out of memory" errors
>from Windows programs, which is evidence enough from me that it is
>being handled in at least some of the software.

Yes, but it's not evidence for Windows programmers being more cognizant of
memory issues than Unix ones, which is what you seemed to be saying.

I think what's happening is that the commonly-used app frameworks provide
a mechanism to make handling errors easier.  I haven't used MFC much, but
Borland's app framework used to do a thing where it would keep its own
little pool of memory around so that if the system ran out there would be
enough to put up a dialog box and flush work out to disk before exiting.  
I would suppose that this was not a Borland-only idea.  Many Windows apps,
especially ones used for in-house development, are written in Delphi or
Visual Basic, both of which offer some memory-management features like
this.
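
The reserve-pool trick itself is simple enough to sketch (my own rough
illustration of the idea, not Borland's actual code):

/* reserve.c - keep a small emergency buffer so that running out of
 * memory still leaves enough room to warn the user and save work.
 * A rough sketch of the idea only. */
#include <stdio.h>
#include <stdlib.h>

#define RESERVE_SIZE (64 * 1024)

static void *reserve;     /* set aside at startup */

static void out_of_memory(void)
{
    /* Release the reserve so the "save and exit" path has memory. */
    free(reserve);
    reserve = NULL;

    fprintf(stderr, "Out of memory: saving work and exiting.\n");
    /* ... flush documents to disk, put up a dialog, etc. ... */
    exit(1);
}

static void *xmalloc(size_t n)
{
    void *p = malloc(n);

    if (p == NULL)
        out_of_memory();      /* never returns */
    return p;
}

int main(void)
{
    char *work;

    reserve = malloc(RESERVE_SIZE);   /* grab the cushion up front */

    work = xmalloc(1024);
    /* ... normal application work ... */
    free(work);
    free(reserve);
    return 0;
}

Nothing Windows-specific about it; the frameworks just do it for you so
nobody has to remember to.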

The key here is that the app framework helps out.  I don't think it is a
case of Windows programmers being smarter or more careful than Unix
programmers, but a matter of more effort being put into the tools for
building desktop apps on Windows.

-- 
 -| Bob Hauck
 -| To Whom You Are Speaking
 -| http://www.bobh.org/

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and comp.os.linux.advocacy) via:

    Internet: [EMAIL PROTECTED]

Linux may be obtained via one of these FTP sites:
    ftp.funet.fi                                pub/Linux
    tsx-11.mit.edu                              pub/linux
    sunsite.unc.edu                             pub/Linux

End of Linux-Advocacy Digest
******************************
