Linux-Advocacy Digest #346, Volume #34            Tue, 8 May 01 23:13:03 EDT

Contents:
  Re: Yet another IIS security bug (T. Max Devlin)
  Re: The long slow slide to Microsoft.NOT (T. Max Devlin)
  Re: The long slow slide to Microsoft.NOT (T. Max Devlin)
  Re: The long slow slide to Microsoft.NOT (T. Max Devlin)
  Re: How to hack with a crash, another Microsoft "feature" (T. Max Devlin)
  Re: How to hack with a crash, another Microsoft "feature" (T. Max Devlin)
  Re: Justice Department LOVES Microsoft! (T. Max Devlin)
  Re: Justice Department LOVES Microsoft! (T. Max Devlin)
  Re: Justice Department LOVES Microsoft! (T. Max Devlin)
  Re: Windows makes good coasters (T. Max Devlin)

----------------------------------------------------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: alt.destroy.microsoft
Subject: Re: Yet another IIS security bug
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:42 GMT

Said Giuliano Colla in alt.destroy.microsoft on Tue, 08 May 2001 
   [...]
>Well, I don't know if I've proven you wrong.

It was hyperbole.  You really only proved me mistaken.

>If you enlarge the concept of 'familiar' to encompass not
>only computer experience, but all the previous experiences,
>maybe your definition holds true. But it's very hard to
>tell.

My definition already included that idea; I make no distinctions
concerning how or why something is familiar.

>Consistency is certainly a big issue. And that's
>something related to how our brain works. But maybe there's
>something deeper, very hard to define. We're used to inferring
>conclusions from insufficient data (that's what 'intuition'
>means, in the end), but how it works is far from being
>clear. 

My theory is that it works like a hologram.  Did you know holograms can
be used for data processing?  They're best at pattern matching,
ironically one of the things that information processing theory (how
computers do data processing) is not very good at.  The basis of
hologrammatic processing is that it works by comparing things.

>An artist builds metaphors and analogies that make
>us 'feel' what he means. How it's done we don't know.
>Shakespeare is great, but no school can teach you to write
>in such a way as to convey the same emotions to readers or
>listeners.

No school is necessary, though, huh?

>If you take linguistics, you find a number of quite
>interesting facts. Why does the same word (to wait) mean two
>apparently unrelated things: to wait *on* someone and to
>wait *for* someone? What does 'to take care of something'
>have to do with waiting? Well, this double meaning is common
>in many other languages besides English. This is but one
>example of a myriad of those cases. Apparently it has to do
>with some basic concepts buried deeply in our mind.

Most of the 'dual meanings' you would wonder about come from the common
roots of Indo-European, the language of a people that slightly predated
the Germanic, Greek, and Latin civilizations; many words in our language
are derived from Indo-European.  According to my reference, "The Roots
of English" by Robert Claiborne, the word "wait" comes from the I-E
root "Weg-2", which meant "lively".  The word wait itself comes through
Germanic, and originally related to "lying in wait", as in an ambush.
So 'wait' actually means the lively attack at the end of the wait.
(Further etymology is my own speculation, not covered in my reference,
but seems correct.)  The term was first applied to the thing a "waiter"
does.  You may recall the servants in old fancy mansions who lurked in
concealing alcoves during dinner, ready to leap out at an instant's
notice.  And so of course the word soon became a verb meaning to stand
by idly until the time to "wait" occurs.

In most cases, such supposedly unrelated meanings are pretty easy to
worm out.  But then we are left with Weg-1, an I-E root that means
"weave" or "a web" (and both words are obviously derivative themselves).
Even I-E, it seems, had certain words with two different meanings.
Linguists are obviously unsure of whether they were once, tens of
thousands of years ago, the same word, which drifted and mutated just as
"wait" did.

>You rightly described the Desktop as a metaphor, but
>building the right metaphor, convincing for all listeners is
>something not at all so obvious.

A very valid point.  But bear in mind, the word 'metaphor' was itself
being used as a metaphor! It didn't mean "a word used to artistically
reference a dissimilar concept", or whatever definition you want to use
for metaphor.  It actually meant "a design of a user interface which
causes a program to mimic the actions of a physical arrangement of
objects".  The reason I used the term "metaphor", rather than, say,
"imitation", is to insinuate that there are rules about what makes a
"good" one or not, rules which we cannot easily extract in words.  Just
as you have noted.

>First of all it has to do with background and culture. A
>mailbox or a trash bin convey a very precise idea, until you
>go to a different country where mailboxes or trash bins have
>a completely different shape.

Good point; too easily forgotten.  But, still, as soon as you know that
silly shape is supposed to be a mailbox, it doesn't matter if it is
actually a 'mailbox' icon, does it?  You're just associating the
pictogram with the function; you don't actually need the mailbox to look
like a mailbox.  I don't believe anyone's necessarily measured this
scientifically, but I would suspect that there is no hampering of the
process of getting mail just because the mailbox on the screen does not
look like the mailbox on your doorstep.  It will take longer the first
time, certainly, but there is no ongoing cognitive dissonance, which
suggests the correspondence of icons to real world objects is
meaningful only for *initial* recognition, not for *use*.  After that, as
long as they are consistent, it doesn't matter at all how 'familiar'
they are.

>It has to do with the degree of literacy of the user. If you
>don't mind a digression (you may skip it if you do mind!) I
>may tell you the experience of the 'communication error'
>icon.
>
>Some twenty years ago, while designing the first graphical
>human interface for our control system, I decided to use
>graphical symbols (the word 'Icon' was unheard of at that
>time) to make things easier for operators, and also to avoid
>the translation burden. We had only a text mode display, so
>we used an alternate character generator to display
>graphics. Most of the symbols were quite obvious, but then I
>stumbled into a problem. Being a multiprocessor system, with
>processors communicating via serial lines, you could
>encounter some communication errors because of noise,
>hardware failures and such. How do you show that? I thought
>that a telephone symbol could be the right thing. But I
>discovered that technicians and engineers could understand
>the meaning of the icon without explanation, while normal
>operators could not. It's a metaphor which works only if you
>understand how the system works, with many processors
>communicating. If you don't, it becomes meaningless.

The root of the issue, I think, is whether this made any difference to
those who were not novices.  Sure, the techies can guess what it is de
novo more quickly.  But is the purpose of using icons to make people
guess?  I would suggest that the recognition can be quite problematic;
suppose you unwittingly chose an icon which is as quickly recognized by
the techs, but is recognized *incorrectly*.  The strong affinity between
the real world metaphor ('telecom') and the icon ('telephone') can
actually work AGAINST you, if the techies use it to presume that the
problem is in the phone system, when it is actually a local failure.  Do
you see what I'm saying?  Metaphors can only be as valuable as they can
be dangerous, and it is better to play it safe.  The normal operators
can learn just as quickly (and, most importantly, learn differently just
as quickly should the scenario change), while you've hooked into a
metaphysical concept in the techies' minds, and I wouldn't suggest too
much confidence that it was the right one.  Although in your example, it
turned out to be; thanks for bringing it up.

>Up to now I appear to be backing up your concept of
>'familiar', but I'm not convinced that it's all the truth.

And you have AGAIN proven me mistaken!  How could intuitive POSSIBLY be
'familiar' alone, if an expert is capable of getting it wrong, not
right, because of familiarity?  We should measure 'intuitive',
certainly, by how often that does not happen, and what does happen is
the thing you saw with 'normal operators'.  We expect that it is better
to be intuitive, but are we not simply saying 'intuitive is equal to
whatever is better'?

>It would be nice if it were as simple as a friend of mine (a
>biologist) was explaining the difference between hereditary
>vs environmental factors: if the newborn resembles the
>father, it's hereditary; if it resembles the neighbor
>next door, it's environmental....

I can't tell if you, or your friend, are the one that is joking.  ;-)

>I don't believe that there is a platonic model of
>computer/human interface, but I believe that there is a
>model of human behavior, built by millions of years of
>evolution, plus background, plus education, plus everyday
>experience, plus something else, which makes some solutions
>more 'intuitive' than others.

Doesn't sound much like a model of human behavior, Giuliano.  It sounds
more like an unfalsifiable hypothesis, to claim that such a model is
anything but a platonic object (non-existent).  You're saying that man
has behavior; only a catalog of all of it could 'explain' it.  That
isn't a "model".  That is lack of a model.  ;-)

>I also believe that the day a critic will be able to tell
>what makes the difference between a masterpiece and rubbish,
>we'll be able to pinpoint the factors not related with
>familiarity.

Believe it or not, I think critics have been doing this for thousands of
years.  But I understand your point; you want them to do it
mathematically, scientifically.  And then science will have progressed
to the point where we know what 'intuitive' means.  You think there is
some 'seed algorithm' in the human brain, that all thoughts, words, and
works of man are simply the interaction of this with sense data; our
minds are like computers, and we are processing the information we are
given and calling the results 'knowledge'.

In many ways, I think this is true, but for a few small things.  The
'seed algorithm' is simply the word "is", which makes abstraction
possible.  The mind does not work like a computer; it acts like a
hologram, as does the language (abstraction) that it uses.

It was no coincidence you mentioned Shakespeare in a discussion of GUIs,
I think.  In order to know how our minds work, we have to be able to
explain both of them with the same set of rules!  The challenge is as
ponderous as the convergence between quantum physics and general
relativity.

>Till then, we must rely on consistency, on trying to
>understand what the user is familiar with, and on using at
>best our 'intuition' to work out something 'intuitive'.

Accurate, consistent, and practical?  ;-)

>> >Coming to your question I'd say that alt-tab is connected to the idea of
>> >making first of all disappear the current app and then popping up the
>> >next one. But if current app has already disappeared, the scenario is
>> >different, so it's not 'intuitive' to use the same command. One would
>> >think that, as current app has disappeared, the next one should pop up
>> >by itself.
>> 
>> It honestly seems like, when looking suddenly at desktop, you should be
>> able to hit 'ENTER' and have an app pop up!  Somehow this seems more
>> intuitive, even, (at least to me, based on what *I* am familiar with)
>> than having the next app pop up automagically.
>
>Well, we have our administration and production management
>software which requires an F10 to confirm anything. I've
>been using it for at least five years, but I still feel that
>'ENTER' would be the right key, and that F10 is 'wrong'. 

Having used systems years and years ago that used F10 to confirm, I'd
have to suspect that this is a matter of consistency.  The trick,
obviously, is determining what you want to be consistent with; simply
trying to 'be consistent' itself is not enough.  I explained earlier
that your thinking about 'intuitive' makes me think that it is
consistency *with whatever rules are already indicated by the interface,
but not any other resemblance between the interface and the real world*,
which is the root of it.  'Accuracy', I think, is the possibility of being
familiar from the real world, as in your technical and normal operators
and their variation in understanding.  Practical, of course, is the
be-all-and-end-all, and why, if we are lucky, discussions like this
might one day be science, but they will never ever be technology.  :-)

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy,alt.destroy.microsoft
Subject: Re: The long slow slide to Microsoft.NOT
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:43 GMT

Said Chad Myers in alt.destroy.microsoft on Tue, 08 May 2001 01:55:47 
>"Chad Everett" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> On Mon, 07 May 2001 02:53:44 GMT, Chad Myers <[EMAIL PROTECTED]>
>wrote:
>> >
>> >"Pancho Villa" <[EMAIL PROTECTED]> wrote in message
>> >news:[EMAIL PROTECTED]...
>> >> "T. Max Devlin" wrote:
>> >> >
>> >>  COM is obviously a smoke-screen for combining that
>> >> > with CORBA-like functionality, as part of Bill Gates' "everybody will
>> >> > have to pay me money" campaign.
>> >> >
>> >> The fact of the matter is that COM and DCOM were MS ripoffs of IBM's
>> >> SOM and DSOM.  OLE is simply bloated, buggy, 2nd-rate technology.  To
>> >> this day, SOM and DSOM kick COM and DCOM's butt!  Tragically, along
>> >> with IBM's OpenDoc, another fantastic technology, SOM and DSOM have
>> >> been pretty much destroyed by a criminal monopoly, and we are all
>> >> suffering.  :(
>> >
>> >It's so amusing to watch people go to all lengths to ensure that Microsoft
>> >never gets any credit for anything.
>> >
>>
>> What do you mean?  Microsoft gets credit for stealing lots of innovative
>> ideas.
>
>COM, MTS, and COM+ were firsts in the industry. There may have been
>similar technologies, but there was no copying or stealing.

According to Microsoft.  Bwah-ha-ha-ha-ha.

Monopolies have lots of things they declare are "firsts in the
industry", Chad.  They generally correlate to whatever the monopoly
prevented from being widely implemented, years previously.

>CORBA is copying. 99% of anything GUI in Linux is stolen from Microsoft,
>so perhaps you shouldn't throw rocks...

Why?  99% of anything GUI in Microsoft was stolen from Macintosh?

>Besides, where is Linux's enterprise-grade transaction processor?
>
>Oh yeah, that's right...

...capital development is tight or non-existent when there's a criminal
monopoly, yeah.

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy,alt.linux,alt.destroy.microsoft
Subject: Re: The long slow slide to Microsoft.NOT
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:45 GMT

Said Tom Wilson in alt.destroy.microsoft on Tue, 08 May 2001 02:12:38 
   [...]
>Anything that makes you remember assembler wistfully,
>is a "Bad Thing".

LOL!  I can imagine.  :-D

   [...]

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy,alt.destroy.microsoft
Subject: Re: The long slow slide to Microsoft.NOT
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:45 GMT

Said Pancho Villa in alt.destroy.microsoft on Mon, 07 May 2001 11:21:49 
>"T. Max Devlin" wrote:
>> Said Steve Sheldon in alt.destroy.microsoft on Sun, 6 May 2001 22:08:54
>> >"Pancho Villa" <[EMAIL PROTECTED]> wrote in message
>> >news:[EMAIL PROTECTED]...
>> >> The fact of the matter is that COM and DCOM were MS ripoffs of IBM's
>> >> SOM and DSOM.  OLE is simply bloated, buggy, 2nd-rate technology.  To
>> >> this day, SOM and DSOM kick COM and DCOM's butt!  Tragically, along
>> >> with IBM's OpenDoc, another fantastic technology, SOM and DSOM have
>> >> been pretty much destroyed by a criminal monopoly, and we are all
>> >> suffering.  :(
>> >
>> >Yes, IBM has certainly destroyed a lot of their own technologies through
>> >bungled marketing.
>> 
>I really do not think that a lot of IBM's famed failures have to do
>with "marketing failures", "bungled marketing", etc.  After all, IBM
>is the biggest IT corporation in the world.  They sell more software
>than Microsoft!  Their profits are larger than MS' revenues!  They
>have 300,000 employees - 4th largest company on the planet.  Seen
>their stock price lately?  You would think that if IBM could not
>market at all, they would not be so successful.

Since when did Microsoft advocates ever let facts get in the way of a
good smoke screen?

>> > Although a lot of these marketing problems related back
>> >to the remedies that were imposed after they were found to be a criminal
>> >monopoly back 20 years ago.
>> 
>This is true.  But IBM and MS have acted very differently.  When IBM
>was found guilty, they followed all of the government's restrictions
>to the letter of the law.  And some of these restrictions were harsh,
>indeed!  IBM admitted it broke the law, and displayed contrition.  As
>a result of having gone round with the government a few times on
>antitrust, IBM's corporate culture has now completely changed, and
>they studiously avoid breaking antitrust law.  Compare the above
>behavior to MS!!!!!!!!!!!!!

Do you think MS might have a come-back in a couple of decades, like IBM
has with their Linux advocacy?

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Subject: Re: How to hack with a crash, another Microsoft "feature"
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:48 GMT

Said Erik Funkenbusch in comp.os.linux.advocacy on Mon, 7 May 2001 
>"Chad Everett" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> On Mon, 7 May 2001 13:56:28 -0500, Erik Funkenbusch <[EMAIL PROTECTED]>
>wrote:
>> >"Chad Everett" <[EMAIL PROTECTED]> wrote in message
>> >news:[EMAIL PROTECTED]...
>> >> On Mon, 7 May 2001 05:11:24 -0500, Erik Funkenbusch <[EMAIL PROTECTED]>
>> >wrote:
>> >> >
>> >> >You're forgetting.  I already offered that it would be quite possible
>to
>> >> >devise an encoding scheme where the same sequence of characters are
>not
>encoded the same way twice.  Thus, all your occurrences of A and O
>would
>> >be
>> >> >different for each time they occurred.  Suppose I used the 4 bit key as
>an
>> >> >index into a completely arbitrarily chosen letter translation table.
>> >>
>> >> How do you distribute the translation table?
>> >
>> >You don't need to.  Both sides have the table.  We're not talking about
>PKI
>> >here, we're talking about two sides (say the Pentagon and some foreign
>> >embassy).  You can have totally unique algorithms for each embassy, so
>that
>> >should one become compromised, your other embassies aren't.  It's only
>> >purpose is encryption between point a and point b, never point c, d, or
>e,
>> >etc...
>> >
>>
>> If that's the case then the translation table AND your 4-bit index are
>> "the key" and you're not really talking about a 4-bit key anymore.
>
>If a key unlocks a door, it's still the key that unlocks it, regardless of
>whether it's a 100 year old warded lock or a state of the art Medeco lock.
>The complexity is irrelevant.

The complexity actually gets very relevant very quickly, in terms of how
easy it is to pick the lock, whether you have the key or not.
Translation to Navajo worked because Navajo is a natural language, not
simply because it was an unknown type of mathematical encoding scheme.
Natural language is not, in any analytic respect, a simple encoding
scheme.

Again, all of your ideas about codes and encryption are quite valid,
Erik, up until the computer was invented.  Computers can munge through
ANY possible type of substitution/mangling encryption, with consummate
ease.  It is ONLY the mathematically based encryption schemes using
large primes as factors which are in any way 'secure' these days.
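To put a toy version of that large-primes idea on the table (my own
illustration, nothing from the thread; the primes here are laughably
small on purpose, since real keys use primes hundreds of digits long),
in Python:

```python
# Toy RSA with absurdly small primes -- purely illustrative; the security
# of the real thing rests on n being far too large to factor.
p, q = 61, 53
n = p * q                    # 3233, the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (needs Python 3.8+)

m = 42                       # a "message", must be smaller than n
c = pow(m, e, n)             # encrypt with the public pair (e, n)
assert pow(c, d, n) == m     # decrypting with the private d recovers it
```

The whole scheme survives publishing (e, n); only factoring n back into
p and q would reveal d, and with real-sized primes that is the part
computers cannot do with ease.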

Computers still would have had trouble with Navajo, actually, but then they
still have trouble with any natural language, don't they?  Yet more
proof that natural language is not simply an encoding scheme.  You'll
note that, even though it was successful, using unknown languages for
security was abandoned.  It isn't really very effective, except as a
temporary trick of misdirection.  It is simply more 'security through
obfuscation', but you can only fool humans; you can't fool computers.
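As a sketch of just how consummate that ease is (again my own toy
example, and the position-dependent shift below is only a stand-in for
Erik's translation-table idea), a scheme whose entire secret is a 4-bit
key falls to sixteen tries plus a crude letter-frequency score:

```python
from string import ascii_lowercase

# Crude stand-in for English letter-frequency analysis.
COMMON = set("etaoinshrdlu ")

def encrypt(plaintext, key):
    """Shift each letter by (key + position) mod 26, so the same letter
    encodes differently each time it occurs -- roughly Erik's proposal."""
    out = []
    for i, ch in enumerate(plaintext):
        if ch in ascii_lowercase:
            ch = ascii_lowercase[(ascii_lowercase.index(ch) + key + i) % 26]
        out.append(ch)
    return "".join(out)

def decrypt(ciphertext, key):
    out = []
    for i, ch in enumerate(ciphertext):
        if ch in ascii_lowercase:
            ch = ascii_lowercase[(ascii_lowercase.index(ch) - key - i) % 26]
        out.append(ch)
    return "".join(out)

def crack(ciphertext):
    """A 4-bit key means only 2**4 = 16 candidates: try them all and
    keep whichever decryption looks most like English."""
    score = lambda text: sum(ch in COMMON for ch in text)
    return max(range(16), key=lambda k: score(decrypt(ciphertext, k)))

msg = "the enemy will attack at dawn unless the weather turns against them"
assert crack(encrypt(msg, 11)) == 11   # key recovered instantly
```

Note that making repeated letters encode differently, as Erik suggests,
does nothing here: the keyspace, not the mangling, is what gets searched.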

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Subject: Re: How to hack with a crash, another Microsoft "feature"
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:49 GMT

Said GreyCloud in comp.os.linux.advocacy on Mon, 07 May 2001 20:25:59 
   [...]
>Let's put it this way... if Erik used a 4-bit key and did everything he
>says he would do, NSA would have it deciphered in less than a minute.

The NSA?  Sure, 'less than a minute' is accurate, but 'a few
milliseconds' is more precise.
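The arithmetic, for anyone counting (my own sketch, with a dummy
stand-in for the real decrypt-and-test step):

```python
import time

KEYSPACE = 2 ** 4            # a 4-bit key: just 16 possibilities

def try_key(k):
    """Stand-in for 'decrypt a sample and check for readable output'."""
    return k == 0b1011       # pretend key 11 happens to be correct

start = time.perf_counter()
found = next(k for k in range(KEYSPACE) if try_key(k))
elapsed = time.perf_counter() - start

print(found)                 # 11 -- and elapsed is microseconds, not minutes
```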

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: 
comp.os.ms-windows.nt.advocacy,comp.sys.mac.advocacy,comp.os.ms-windows.advocacy
Subject: Re: Justice Department LOVES Microsoft!
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:50 GMT

Said Daniel Johnson in comp.os.linux.advocacy on Tue, 08 May 2001 
>"Karel Jansens" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> Daniel Johnson wrote:
>> > MS did not exclude anyone from developing on
>> > Windows. They *encouraged* it.
>>
>> ... but only if the developers promised to phase out the OS/2
>> equivalents. Ask Borland (ObjectVision, anyone?)
>
>Had Microsoft been as foolish as you suggest,
>they would have failed. Windows needed developers
>and MS could not afford to drive them away.

And the fact that the most common applications used today are all
Microsoft products doesn't seem to register with you, does it?


-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Subject: Re: Justice Department LOVES Microsoft!
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:54 GMT

begin  

Said JamesW in comp.os.linux.advocacy on Tue, 8 May 2001 12:17:59 +0100;
>In article <[EMAIL PROTECTED]>,
[EMAIL PROTECTED] 
>says...
>> Note that this will not be readable just as is by the sucking newsreader 
>> JS PL is using.
>> 
>> Peter
>> 
>> end
>> 
>  begin
>^^
>You need to put two spaces in front of the begin to fool OE. The 
>braindead newsreader expects an attachment - which would be true if
>there were a matching space-space-end somewhere after the
>space-space-begin. Since there is no space-space-end, working
>newsreaders ignore the begin and display the text. 
>
>OE, thinking there is an attachment, will display a paperclip instead - 
>another M$ product that substitutes useless paperclips for 
>functionality...

You mean like this?  (I've already adjusted for your correction that it
is begin-space-space, not space-space-begin.)

end  
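For anyone who missed the joke, the convention being abused here is
uuencode's attachment framing.  A rough sketch of the scanning heuristic
in Python (my own guess at the general approach, with a hypothetical
`find_attachments` helper; it is not OE's actual logic):

```python
import re

# Classic uuencode framing: "begin <octal-mode> <filename>" ... "end".
# A strict reader requires the mode digits; an over-eager one (the bug
# being mocked above) fires on any line merely starting with "begin".
STRICT_BEGIN = re.compile(r"^begin [0-7]{3,4} \S")
SLOPPY_BEGIN = re.compile(r"^begin\b")

def find_attachments(lines, begin_re=STRICT_BEGIN):
    """Return (start, end) index pairs of apparent uuencoded blocks."""
    spans, start = [], None
    for i, line in enumerate(lines):
        if start is None and begin_re.match(line):
            start = i
        elif start is not None and line.rstrip() == "end":
            spans.append((start, i))
            start = None
    return spans

post = ["begin  ", "some ordinary text", "end  "]
print(find_attachments(post))                # [] -- strict: no mode digits
print(find_attachments(post, SLOPPY_BEGIN))  # [(0, 2)] -- paperclip time
```

A strict scanner ignores the bare "begin" line above; a sloppy one
swallows the whole post as a phantom attachment.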

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: 
comp.os.ms-windows.nt.advocacy,comp.sys.mac.advocacy,comp.os.ms-windows.advocacy
Subject: Re: Justice Department LOVES Microsoft!
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:55 GMT

Said Daniel Johnson in comp.os.linux.advocacy on Tue, 08 May 2001 
>"Rick" <[EMAIL PROTECTED]> wrote in message
>news:[EMAIL PROTECTED]...
>> Daniel Johnson wrote:
>> > Recall how much fun MS has had trying to kill
>> > DOS.
>> >
>> > I look forward to watching Apple, um, enjoy
>> > the same passtime. Don't you? :D
>>
>> One day, in the near future, Apple will simply cease to support OS 9. AS
>> they dont support serial ports, ADB, etc. As they moved to the PPC from
>> the 68K family. Apple has a history of being able to move forward, and
>> drag the rest of the industry with it.
>
>You may be right, but in all honesty Apple has had
>the *worst* trouble trying to deal with their software's
>backwards compatibility baggage.
>
>I hope they overcome it too, but history does not
>encourage me in this.

Bwah-ha-ha-ha-ha-ha.  Honestly, Apple has NEVER had the kind of trouble
with 'backwards compatibility' [sic] that Microsoft does!  Guffaw.

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------

From: T. Max Devlin <[EMAIL PROTECTED]>
Crossposted-To: alt.linux.sux,alt.linux,comp.os.ms-windows.nt.advocacy
Subject: Re: Windows makes good coasters
Reply-To: [EMAIL PROTECTED]
Date: Wed, 09 May 2001 02:52:56 GMT

Said Chad Myers in comp.os.linux.advocacy on Tue, 08 May 2001 01:52:19 
   [...]
>How so? Really, CD Burning isn't all that intensive. [...]

Typical Myers cluelessness about the subject he's pretending to discuss.
Guffaw.

-- 
T. Max Devlin
  *** The best way to convince another is
          to state your case moderately and
             accurately.   - Benjamin Franklin ***

------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list by posting to comp.os.linux.advocacy.

Linux may be obtained via one of these FTP sites:
    ftp.funet.fi                                pub/Linux
    tsx-11.mit.edu                              pub/linux
    sunsite.unc.edu                             pub/Linux

End of Linux-Advocacy Digest
******************************
