Linux-Advocacy Digest #609, Volume #27           Wed, 12 Jul 00 02:13:04 EDT

Contents:
  Re: Richard Stallman's Politics (was: Linux is awesome! (T. Max Devlin)
  Re: Linsux as a desktop platform ("Christopher Smith")
  Re: Richard Stallman's Politics (was: Linux is awesome! (T. Max Devlin)
  Re: Richard Stallman's Politics (was: Linux is awesome! (T. Max Devlin)
  Re: Certifications on the internet by Brainbench? (Paul E. Larson)
  Re: Would a M$ Voluntary Split Save It? (Leslie Mikesell)

----------------------------------------------------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: gnu.misc.discuss
Subject: Re: Richard Stallman's Politics (was: Linux is awesome!
Date: Wed, 12 Jul 2000 01:37:35 -0400
Reply-To: [EMAIL PROTECTED]

Quoting Jay Maynard from comp.os.linux.advocacy; 11 Jul 2000 09:42:28 
>On Tue, 11 Jul 2000 04:30:36 -0400, T. Max Devlin <[EMAIL PROTECTED]> wrote:
>>It seems reasonable to assume that no software would exist if it *had*
>>to be GPL'd.  Nevertheless, indications are strong that someday, almost
>>all software will be voluntarily GPL'd.
>
>I find this a truly terrifying prospect.
>
>Why? Because it implies the destruction of the software industry as we know
>it. That would throw millions of people out of work and damage the
>technology economy beyond any hope of recovery. After all, if you can't sell
>software, why pay people to create it?

Yes, if you don't need to buy it, why would you pay people to sell it to
you?  I can understand why a contemplation of a change in the status quo
might be frightening.  Who wants to live in interesting times?  But if
the software industry as we know it is "destroyed" by GPL, it will be
because of self-immolation.  Your missing assumption is that you won't
have the software you want if people can't profiteer off of it: that if
there are no exorbitant profits to be made by insisting that people sign
away their rights, to be repaid later at equally exorbitant rates, then
nobody will write software.

But the fact is that if it is valuable to pay someone $80 to provide you
with a box that says "word processor" which you can assume will install
and run successfully on your computer and includes some documentation,
what do you care if you could take that CD and burn a copy and write a
manual and charge people $80 for it?  Chances are you could do it a lot
cheaper, of course; you'd do it for $40.  But, see, the guys who charge
$80 have an advantage; they wrote the software.  If somebody has
problems, or wants enhancements, or needs other software that works the
same way, they could do it themselves, pay you, or pay the "commercial
software provider".  The only question that will remain is "is it worth
$80".  Not "do I need to have his trade secrets, or this other guy's
trade secrets, to continue to use the computer which has been running
the software I already bought two years ago?"

>Oh, I have no doubt that open source projects to replace some stuff would
>spring up. 

Everything that's needed or desired by sufficient users to be called "a
market".  I can understand why concern for "the software industry" would
be raised in a group of developers.  But, seriously, what *new* software
do you need, in comparison to revisions and extensions of the software
you already have?  I don't think the software industry as a model for
corporate welfare makes any sense; if they can't stay employed, then
they shouldn't be making money.

Which brings us to "will *new* types of software be created if the
originator of a very novel idea isn't able to profit from it?"  And I
say, "I don't know; let's go ask VisiCalc how they feel about it.  I'm
sure it will be enlightening."

Treating software as a trade secret helps no-one but profiteers, unless
trade secret is an equitable and appropriate business model for that
particular type or piece of software.  These will still remain, in vast
numbers, I'm sure, even after every bit of code a user generally
interacts with on a typical desktop or server system is covered by GPL.

>The experience of the Linux world is relevant here: stuff that
>interests hackers (term used properly, i.e. *NOT* as a synonym for
>"cracker") would be developed and of reasonably high quality and usability
>for hackers, but other software would be of varying and sometimes barely
>usable quality at the user interface level, and developed with little regard
>for making it usable for real people who aren't computer geeks.

The entire premise of the "PC Revolution" was that end users could do
their own programming.  Many were disillusioned to find that this wasn't
an automatic, or even simple, process.  But once you know how to do
simple "if...then" type processing, and have the tools provided by the
developers, you can go a surprising way to implementing whatever
functionality most interests and benefits you, just like the hackers
did.  We're not talking kernel level stuff here; the development is the
value there, not the distribution.  For applications, end user stuff?
There's plenty of market demand for people who are good at making that
stuff work well for a large enough group of customers to make a living
at it.
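To make the point concrete, here is a minimal sketch of the kind of
simple "if...then" processing the paragraph describes; the file
extensions and folder names are purely illustrative, not from any real
tool:

```python
# A toy "if...then" rule an end user might write: file incoming
# documents into folders by extension. Everything here is made up
# for illustration.
def route(filename):
    name = filename.lower()
    if name.endswith((".jpg", ".png", ".gif")):
        return "Pictures"
    elif name.endswith((".txt", ".doc")):
        return "Documents"
    else:
        return "Misc"

for f in ("photo.JPG", "notes.txt", "setup.exe"):
    print(f, "->", route(f))
```

Nothing kernel-level about it, but it is exactly the sort of
functionality a non-hacker can build for themselves with the tools the
developers provide.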

>What would those millions of people (not just programmers, by any means) who
>work for software companies do for a living?

I hate to seem conservative and Republican, because it is only the
context in which you ask this which forces the issue, but "I could give
a rat's ass" is the appropriate response.

> They can't all go to work
>supporting open source software, for two reasons: 1) there's not as much
>demand for that, 

?!?  Do you *know* what "assuming your argument" means?  You are
"begging the question".  If there is no demand for GPL software, they
won't be out of work.  If they are out of work, it will be because there
is demand for GPL software.  They can't go to work for a gigantic
corporation pretending to be a "software factory" while profiteering on
licenses.  They will have to get themselves a business plan and find a
large enough market demand for a particular type of enhancement to one
of the open source "products", and get busy writing software.  That is
what they want to do for a living, isn't it?  Why should they care if
the company is simply making profit distributing their work, instead of
profiteering on its ownership of their work?  Sure, the profiteer can
provide a better salary, but they're still going to have to keep writing
software.  I am as undaunted by the case that programmers may make less
money as I am by their prospect of being out of work altogether.  I
didn't buy a PC so that I could keep programmers employed.

>and 2) if there were, somehow, what would that say about
>the quality of the software? The folks at Troll Tech have a point worth
>careful consideration when they argue that companies who make open source
>software and sell support for it have little incentive to make the software
>robust and easy to use...case in point: Sendmail, Inc.

Thus the binding of software to the forces of capitalism that Richard
Stallman's politics are meant to unfasten.  Redistribution in binary-only
form, under a more restrictive license than the originator's, does allow
Sendmail, Inc. to benefit from failing to improve their product.  The
reason this is a point from the Troll Brigade is that it is an argument
for the GPL as much as against open source in general.  If others could
improve the product and sell it, Sendmail benefits, because the producer
isn't required to sell it as open source.  But if it is GPL, then that
improvement makes Sendmail's profiteering on the failure to improve
their own product impossible.  Granted, somebody's got to come up with a
better sendmail.  But it's not like that's tough; it just hasn't been
done because there's no market in it when you can profiteer along
instead.

>You describe the GPV zealot's utopia. To me, it's a nightmare world with
>poverty and misery for millions of people, and I want no part of it.

If all of those millions are just programmers, then that's not my
problem.  Boo-friggen'-hoo.  I *got* the software I need.  What I'm
looking for now is somebody who can *improve* it, without *replacing*
it.

--
T. Max Devlin
Manager of Research & Educational Services
Managed Services
[A corporation which does not wish to be identified]
[EMAIL PROTECTED]
-[Opinions expressed are my own; everyone else, including
   my employer, has to pay for them, subject to
    applicable licensing agreement]-


====== Posted via Newsfeeds.Com, Uncensored Usenet News ======
http://www.newsfeeds.com - The #1 Newsgroup Service in the World!
=======  Over 80,000 Newsgroups = 16 Different Servers! ======

------------------------------

From: "Christopher Smith" <[EMAIL PROTECTED]>
Crossposted-To: comp.sys.mac.advocacy,comp.os.ms-windows.advocacy,comp.unix.advocacy
Subject: Re: Linsux as a desktop platform
Date: Wed, 12 Jul 2000 15:46:47 +1000


"T. Max Devlin" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> Quoting [EMAIL PROTECTED] () from comp.os.linux.advocacy; 10 Jul 2000
> >Even Windows 95 has pre-emptive multitasking. Mac OS before
> >Mac OS X (in other words, the version of Mac OS that Mac
> >users are using today) does not.
>
> There is reason to believe that this is a good thing.

No, there is not.  CMT has no place and absolutely zero advantages on any
general purpose machine where the operating system developer does not have
absolute and total control over every instruction that is ever executed on
it.

> The method used
> by the Mac puts whatever program is running in the foreground in charge
> of yielding to background programs if it wants to, while pre-emptive
> multitasking allows Windows to have background processes take control
> without waiting for the foreground process to yield.

Yes.  Thus, your 900 page print job doesn't stop the rest of the system dead
so you have to take the next two hours off.  Additionally, it means that if
some background program gets the CPU and refuses to yield, you don't have to
reboot.

Most people consider this to be a *good* thing.

> This does seem a
> bit in the Mac's favor in terms of being appropriate for a system which
> is intended to be used as a user desktop.

How can you say a system which allows any arbitrary program to potentially
and *easily* hang the machine and require a reboot is appropriate for a user
desktop ?  IME, most users don't like having their last few hours work go
down the drain.

> Anyone who has been
> frustrated by a menu disappearing repeatedly because some dialog box
> wanted to pop up will recognize some of the trade-off.

This is a UI issue, totally independent of, and irrelevant to, CPU
scheduling.

> For a
> client-only system, the foreground *should* have to yield before any
> background processes can take control, by some reasoning.

No, it shouldn't.  It would impose needless program complexity and
programmer overhead.

You could certainly make an argument that on such a system any foreground
app receives a boosted priority to improve response time.  This is what
Windows does.  However, under no circumstances whatsoever should any
user-space application ever be able to wrest control from the Operating
System.
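The difference is easy to sketch.  Below is a toy model (not any real
scheduler; the task names and the two-element task tuples are invented
for illustration) in which the cooperative scheduler can only switch
when the running task volunteers, while the preemptive one switches on
every time slice:

```python
# Toy model of the two scheduling policies. Each task is (name, yields),
# where `yields` says whether it voluntarily gives up the CPU.
def run_cooperative(tasks, quanta):
    """Switch only when the current task chooses to yield."""
    log, current = [], 0
    for _ in range(quanta):
        name, yields = tasks[current]
        log.append(name)
        if yields:                      # a greedy task keeps the CPU forever
            current = (current + 1) % len(tasks)
    return log

def run_preemptive(tasks, quanta):
    """The OS switches on every time slice, whether the task likes it or not."""
    return [tasks[i % len(tasks)][0] for i in range(quanta)]

tasks = [("print_job", False), ("editor", True)]
print(run_cooperative(tasks, 4))  # the greedy print job hogs every slice
print(run_preemptive(tasks, 4))   # the editor still gets CPU time
```

Under the cooperative policy the misbehaving print job starves the
editor completely; under preemption the scheduler alternates regardless
of the print job's manners, which is the whole argument above in four
lines of output.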




------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: gnu.misc.discuss
Subject: Re: Richard Stallman's Politics (was: Linux is awesome!
Date: Wed, 12 Jul 2000 01:52:34 -0400
Reply-To: [EMAIL PROTECTED]

Quoting John S. Dyson from comp.os.linux.advocacy; 11 Jul 2000 10:48:15
>In article <[EMAIL PROTECTED]>,
>> 
>> You describe the GPV zealot's utopia. To me, it's a nightmare world with
>> poverty and misery for millions of people, and I want no part of it.
>>
>Remember, that you are arguing mostly with zealots who are already set
>up to be 'successful' in the miseryworld of a GPL universe :-). 

You bet your ass, buckaroo.  I'm gonna be SWIMMIN' in GRAVY.  Woo-hoo!!!

> They
>have very strong motivation to see it succeed. 

And that is a bad thing if you're not a ruthless competitor who is
willing to profiteer, right.  I forgot, you're only allowed to be
strongly motivated if you're greedy.

>(BTW, I don't necessarily
>agree with you about that miseryworld, but there would be a significant
>shakeup, and likely a significant decrease in programmers, yet an
>increase in lawyers, doctors and other professions -- be that good
>or bad.) Techno-emigration would also come to a halt, with expiration
>of visas.

Ha.  Let's cry for the poor disadvantaged foreign workers who will be
shipped back to their pathetic little countries; aren't they poor
downtrodden people yearning to be free?

Of course, they'd have all the source code to all the software, so I'd
guess they've been well compensated.  And I'm always for hiring more
doctors and others with real expertise, instead of program jockeys
working in cubicles, so I guess that's no problem.  In fact, it will
make doctors lower their fees in the face of greater competition, and
hey, that's always a good thing.

Say, I've got an idea.  Maybe all those out-of-work programmers could
open businesses driving doctors and lawyers back and forth to work?
It'd keep them off the streets, anyway, so to speak.

>Myself, I have only relatively neutral holdings.  Whatever happens
>in the GPL world (or in the free or commercial software worlds)
>doesn't affect my financial future.  Geesh, I moved out of
>predominantly tech stocks about 3mos ago right before the crash.
>
>Anyone, with common sense understands what is going on.

Anyone with common sense should know your prognostications are just
about as good as those of anyone else who doesn't have a clue but thinks
they're well informed.  Unfortunately, common sense isn't up to the
challenge, but at least we have your general ignorance to remind us.

I can't believe anyone is foolish enough to think that GPL software
could possibly change anything if there's no money in it.  I certainly
wouldn't consider their advice on financial markets, or their opinion on
the ramifications, that's for sure.

--
T. Max Devlin
Manager of Research & Educational Services
Managed Services
[A corporation which does not wish to be identified]
[EMAIL PROTECTED]
-[Opinions expressed are my own; everyone else, including
   my employer, has to pay for them, subject to
    applicable licensing agreement]-



------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: gnu.misc.discuss
Subject: Re: Richard Stallman's Politics (was: Linux is awesome!
Date: Wed, 12 Jul 2000 01:55:18 -0400
Reply-To: [EMAIL PROTECTED]

Quoting Austin Ziegler from comp.os.linux.advocacy; Tue, 11 Jul 2000 
>On Tue, 11 Jul 2000, T. Max Devlin wrote:
>> Quoting Leslie Mikesell from comp.os.linux.advocacy; 10 Jul 2000 
>>> X wouldn't exist at all if it had to be GPL'd. Nor would most
>>> of the things that use it.
>> It seems reasonable to assume that no software would exist if it *had*
>> to be GPL'd. Nevertheless, indications are strong that someday, almost
>> all software will be voluntarily GPL'd.
>
>I don't think you're right. If, instead, you say 'almost all software
>will be open sourced,' I can agree. I can't agree that they will be
>GPLed.

If almost all software is open source, then there's no reason for it not
to be GPLd.  And since the GPL does, indeed, have the effect of
extending the GPL, it stands to reason that unless specifically
prevented from doing so, software will be practically all open source
(because it's copyrighted) and GPL (because it's software, not
literature).  Last year's literature is still literature.  Last year's
source is useless bytes.  Which means there will always be a huge market
for production, distribution, and maintenance of (but no market
whatsoever for ownership of) software.

--
T. Max Devlin
Manager of Research & Educational Services
Managed Services
[A corporation which does not wish to be identified]
[EMAIL PROTECTED]
-[Opinions expressed are my own; everyone else, including
   my employer, has to pay for them, subject to
    applicable licensing agreement]-



------------------------------

From: [EMAIL PROTECTED] (Paul E. Larson)
Subject: Re: Certifications on the internet by Brainbench?
Date: Wed, 12 Jul 2000 05:54:50 GMT

In article <8kfe72$mgu$[EMAIL PROTECTED]>, "mmm007" <[EMAIL PROTECTED]> wrote:
>A friend of mine just took the Linux Administrator certification exam by
>Brainbench.  She said it was really hard.  have any of your taken these
>exams?  What do you think of them?  I'd rather not take it unless it is
>worthwhile.
>
>
Do they look nice on your resume? Doubtful at this time.

Are they a way of checking your skills? I would say probably, if taken without
the use of supplementary material (i.e., only using your mind). I have three of
the certificates if you want to know.

They may be a way of checking your knowledge if you are going for the more
recognized certificates.

Paul

--

"Mr. Rusk, you're not wearing your tie." -- Frenzy 1972

------------------------------

From: [EMAIL PROTECTED] (Leslie Mikesell)
Crossposted-To: 
comp.os.os2.advocacy,comp.os.ms-windows.nt.advocacy,comp.sys.mac.advocacy
Subject: Re: Would a M$ Voluntary Split Save It?
Date: 12 Jul 2000 00:58:45 -0500

In article <Jb1a5.2500$[EMAIL PROTECTED]>,
Daniel Johnson <[EMAIL PROTECTED]> wrote:

>[snip]
>> >> >> Exactly - and none of them should involve having to make
>> >> >> changes on the other end of the wire.
>> >> >
>> >> >Why not?
>> >>
>> >> Because doing so takes away your choice of ever using anything
>> >> (a) not under your control or
>> >
>> >What does this mean?
>>
>> Interoperating with anyone you can't force to install the
>> plug-in that happens to match your API-of-the-day.  Like
>> the rest of the world.
>
>I'm sorry, but I'm having difficulty parsing your comments
>here.
>
>I am guessing you did too much violence to that sentence
>in your effort to work the word "force" in. Perhaps you could
>try to say it without it?

Take your favorite flavor of API and plug-in that does not
involve a standard wire protocol.  Try to make it work
with an IBM 390 on the other end.  Or even a Sparc. 

>> If you follow standards, you are never in the position of
>> being unable to communicate so that comparison never happens.
>
>Sure you are: as soon as you try to install a product that
>does *not* follow the standards you happen to prefer.

So don't.  Why should you ever be locked into something
only available from a single vendor?

>> So, why make the inconvenient choice?
>
>Well, if you are willing and able to insist on a protocol
>that all your clients used, you can avoid it: but this is true
>of *any* protocol, standard or not. With plug-ins you get
>your choice of protocols.

But unless you use standard protocols you won't be able to
match them on the other end.

>[snip]
>> >Hmmm.. I sense a bit of waffling here. Will you admit that
>> >COM is a standard?
>> >
>> >Not "might make sense as a standard". Is it one?
>>
>> I didn't see any evidence that it was offered to or accepted
>> by the IETF, or ISO, which would determine if it is an
>> accepted standard or not.
>
>I see. So the IETF and ISO are the only standards making
>bodies that you accept, and apparently ANSI is out as is
>the open group.

ANSI is rather limited in scope.  Networking was already
worldwide back in the 20th century.

>Why those two, then?

Because they define the standards for public interaction.

>[snip]
>> >This isn't true of computers. Most of their value is
>> >rather more self contained. Don't get fixated on the
>> >Web.
>>
>> No, that value was provided by typewriters, where you only
>> get back what you put in.  Computers gain their added value
>> by being able to store data and exchange it.
>
>Computers, well, compute, and were very popular before
>the web came along. "Exchanging" data you can do with
>the post office; Computers can do much more.
>
>Don't get fixated on the web. Computers are used for
>much, much more.

Yes, and there are many types of computers doing it.  Together.
There is no reason to limit yourself to what will only
work on one type with one OS.

>>  It is standardization of exchange
>> formats and protocols that allows progress and prevents a
>> single vendor from destroying it.
>
>You keep saying that, and I keep not believing it.

Yet you haven't admitted to having any evidence to the
contrary.  MS has made a lot of non-standard stuff that
won't interoperate with anything else.  Which one would
you say constitutes 'progress' in a way that can't be
done following standards?

>[snip]
>> >> No, Netscape didn't claim to be inseperable from the OS,
>> >
>> >That has nothing to do with it.
>>
Yes it does.  If it didn't, the claim would be unnecessary (as well
as being untrue...).
>
>I did not *make* that claim; therefore I would suggest that
>the claim *is indeed* unnecessary.

Too bad you aren't a legal advisor for Microsoft...

>[snip]
>>   This happens when the developers are tricked
>> by the MS tools that generate non-conforming pages.
>
>You really, honestly, think that people who use IE's extensions
>don't mean to?

Yes I do think that most do not realize that their work will
not display correctly in browsers competing with IE.

>That it's all a trick, and that somehow nobody
>ever *notices* but you?

No, I think many people notice - they just don't know what
to do about it.  After all, if they were expert HTML
producers they wouldn't be using FrontPage.

>[snip]

>> >>  Command
>> >> driven programs generally take the same commands when
>> >> run non-interactively as when interactively.
>> >
>> >This is not a particularly good thing. It means that
>> >both UI forms are compromised.
>>
>> How so?
>
>Well, the command line must be made *terse* for interactive
>use and it must be able to do as much as possible implicitly.
>These are not usually good things for a reusable program.

There was a study done in the late 80s (or so) comparing
the GUIs of the day to typing terse commands for moderately
complex but repetitive work, and the CLI came out ahead.
Does anyone remember the details?  It might have been done
by AT&T.  GUIs have improved since then, of course, but
for jobs where short text commands describe the actions
the command approach may still be better, as well as more
conducive to automating groups of commands.

>However, it must remain *textual* so that it can be placed
>in a text file. Good interactive UIs are so rarely textual.
>
>>  Giving the computer commmands is often the
>> best way to accomplish a task.
>
>Why?
>
>This certainly isn't the conventional wisdom.

Conventional wisdom seems to involve people who don't do
one particular job often enough to remember the commands.
Picking from a menu or group of icons is easier the
first few times you do something.

>[snip]
>> >Sure, it was a user-level program. But you can't expect
>> >real users to alter it when something like this comes up;
>> >it's far better to be able to give them an installer that put
>> >a driver in.
>>
>> You can't expect a user to be able to obtain a driver for
>> every possible program/device interaction.
>
>Wanna bet? :D

Yes, this was an AT&T 3b2 with a WE32k CPU, 40-some serial
ports, and probably a dozen different modem brands.  It
is an absolutely safe bet that you could not have come
up with a driver for that machine for each of those modems.

>>  It would certainly
>> have been impossible in this case.
>
>No, had UUCP used a modem API a la Windows,
>it would have been quite ordinary to download
>a modem driver for that particular modem.

Download?  With the modem that didn't work
without the driver?  This is a joke, right?

>True, Unix does not support such things. But
>that's Unix's problem.

What problem?  Needing specific support for
each program/CPU/device combination would be
the problem.

[snip]
>>  If the software vendor won't allow using generic
>> hardware without specific support, change the software.
>
>This is out of the question on the desktop, and very expensive
>elsewhere.

Out of the question?  Why is that, and why should we put up
with it?

>[snip]
>> >You said you wouldn't be willing to touch each client; how
>> >are you going to support new version fo SMTP without
>> >doing that?
>>
>> SMTP announces its capabilities as it answers a connection.
>> The sender sends what the receiver can handle.
>
>Sounds like you aren't going to be able to support new
>capabilities, but you will be able to keep using the old
>ones. Is that a fair statement?

Yes, you can update the software or not as you like.  When
two updated versions connect you get the new capabilities.
However, things like the ability to carry attachments
do not require protocol changes and will work even across
the old transports.
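That negotiation can be sketched without a network.  The reply text
below is a fabricated EHLO response in the general format real servers
use (hostname and sizes are made up); the sender simply intersects the
extensions it wants with the ones the receiver advertised:

```python
# Parse the extension keywords out of a (fabricated) EHLO reply.
def ehlo_features(reply):
    """Return the set of ESMTP keywords the server advertised."""
    features = set()
    for line in reply.splitlines()[1:]:      # skip the greeting line
        body = line[4:]                      # drop the "250-"/"250 " prefix
        features.add(body.split()[0].upper())
    return features

reply = ("250-mail.example.com greets you\n"
         "250-8BITMIME\n"
         "250-SIZE 10485760\n"
         "250 PIPELINING")

wanted = {"8BITMIME", "DSN", "PIPELINING"}
usable = wanted & ehlo_features(reply)       # sender uses only what both support
print(sorted(usable))
```

The sender falls back gracefully on whatever the intersection turns out
to be, which is why two SMTP systems of different vintages can still
talk to each other.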

>> >Your strategy only avoids touching the clients when the
>> >protocol *doesn't* change.
>>
>> And when it does.  It might come as a surprise to someone
>> used to MS software, but when new capabilities were added
>> to SMTP, all the email systems in the world did *not* have
>> to be upgraded on the same day.
>
>Of course they didn't, of course not. It's not like you have to
>do that with *anyone's* systems.

Really?  Can you make any of the pre-exchange MS-Mail versions
work with anything else, letting some of the machines continue
to run that version while you update others?  Note that 
SMTP is much older than any version of MS-Mail.  In fact it
wouldn't surprise me if there are still machines on the
internet running versions that old.

>[snip]
>> >But seriously; why isn't this a standard? If you mean
>> >to say "its not a standard because its not open";
>> >then please tell me what it means to be "open", and
>> >why it matters.
>>
>> There are different types of standards.  One is mostly to
>> assure that consenting adults know what they are getting
>> into when they agree to do something privately. Another is
>> for public interaction.
>
>Which, if you have anything to say about it, will not involve
>consent!

Of course it does.  Sort of like consenting to use the
existing language if you want to communicate with others.
You can make up something no one else understands if you
want.

>>  The Open Group fits in the first
>> category.  However, something can be open without being a
>> standard by simply making a reference version available without
>> unacceptable restrictions on copying or use.
>
>Okay. So the Open Group is open but not standard then?

More standard than open, just not a public standard.

>You seem to be saying that somehow, something
>imbues real standards with *moral* force: It's *wrong*
>to disobey standards. What imbues them with this,
>that the open group hasn't got?

It is a matter of definition.  Standards define
interoperation.  And the standards group for the
particular media defines the standard.  The wire
level protocol seen at the software level is no
less critical than at the hardware level.  Do you
also question the need for standards at the hardware
level?  Do you think we would be better off if you
also had to buy every network component from one vendor
to make it work?

>[snip- Motif history lesson]
>> >"Correctly" for you, as always, means "standard conforming,
>> >no extensions".
>>
>> It is irrelevant if your browser has extensions, but publically
>> available web pages should not break standards-conforming
>> browsers.
>
>Do they have to be in English too, so you can read them?

Being readable doesn't have anything to do with displaying
correctly.

>> On a private MS-only net, anyone can do anything they want.
>> On the internet, having pages that do not display correctly
>> is bad, particularly when this is a side effect of using
>> a development tool and not intended by the author.  This
>> deception does not meet the 'consenting adults' test and comes
>> closer to outright fraud.
>
>What is your "consenting adults" test then? Why should
>I, or anyone, care about it?

The parties involved should know exactly what they are
doing.  In the case of producing non-conforming web pages
this is rarely the case.  How many people do you know
who intentionally build a web site to break Netscape?
I've seen it done unintentionally.

>[snip]
>>  Why
>> bother with the pretense of portability with byte-code
>> for something that will only run on one CPU/OS anyway?
>
>Byte-codes buy you CPU, not OS, portability. Perhaps they
>thought Windows 2000 would run on this 'Itanium'  thing
>or something.

What?  What OS has a JVM that won't accept the same
java byte-codes as all the others?

>> Also, if it is allowed to touch native methods not bounded
>> by the expected java sandbox security it shouldn't be
>> allowed in applets anyway.
>
>I would expect the browser to enforce that, though.

The same browser that permits ActiveX?  Fat chance.

>[snip]
>> >What you said was that they should keep their own product
>> >at the same level as LDAP- or else the other way around,
>> >and it's not clear to me what you meant.
>>
>> It is a simple lookup and there is no reason to make the LDAP
>> version take more user steps than local files or the exchange
>> directory service - unless perhaps it is to punish people for using
>> a product not sold by MS.
>
>I don't see any reason to take your word for there being "no reason";
>I expect that LDAP is somehow inadequate compared to the
>other mechanisms. MS, presumably, chose not to extend it.

OK, so they have their reason - but it has nothing to do
with programming.


  Les Mikesell
    [EMAIL PROTECTED]

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and comp.os.linux.advocacy) via:

    Internet: [EMAIL PROTECTED]

Linux may be obtained via one of these FTP sites:
    ftp.funet.fi                                pub/Linux
    tsx-11.mit.edu                              pub/linux
    sunsite.unc.edu                             pub/Linux

End of Linux-Advocacy Digest
******************************
