Linux-Advocacy Digest #725, Volume #25           Tue, 21 Mar 00 05:13:09 EST

Contents:
  Re: Disproving the lies. (R.E.Ballard ( Rex Ballard ))
  Re: Producing Quality Code ("Erik Funkenbusch")
  Re: Producing Quality Code ("Erik Funkenbusch")

----------------------------------------------------------------------------

From: R.E.Ballard ( Rex Ballard ) <[EMAIL PROTECTED]>
Crossposted-To: comp.os.ms-windows.nt.advocacy
Subject: Re: Disproving the lies.
Date: Tue, 21 Mar 2000 09:44:22 GMT

In article <rNqA4.65$%p6.258@client>,
"Nik Simpson" <[EMAIL PROTECTED]> wrote:
>
> "R.E.Ballard ( Rex Ballard )" <[EMAIL PROTECTED]> wrote in message
> news:8aru6p$feu$[EMAIL PROTECTED]...
> > > > When you get into complex back-end business integration,
> > > > clustering isn't as trivial.  You can do clustering with
> > > > DCOM, CORBA, RPC, or MQ, and you can add Tuxedo, CICS, CORBA
> > > > Transaction Services, or MQ Transaction services to integrate
> > > > with XA compliant databases and servers to provide
> > > > transactional integrity.
> > >
> > > And your point is?
> >
> > The methods Microsoft touts (DCOM, MTS, and MSMQ) are
> > Microsoft-only solutions.  You can only use them on NT or Win2K.
> > This means that when you are getting into large-scale solutions,
> > you are still stuck with Microsoft platforms.
> >
> Seems to work pretty well for some very large sites, Dell for
> instance.  What sites do you have in mind that are too big to be
> handled in this way?  Web sites tend to be pretty easy to balance
> across multiple machines, so the relative power of a single machine
> doesn't seem to be that big a deal.  How many web sites do you know
> that run on, say, an E-10K machine?

Any site that intends to do e-business, which involves integration
with mainframe, UNIX, and other legacy systems that aren't supported
by Microsoft.  In addition, there are the connections to suppliers,
agents, vendors, and market research information.

There are some sites that use multiple E-10K machines, especially
for interactive capabilities that involve more complex session
management and coordination.

Many Sun customers use multiple E-4500 machines or E-6K class machines,
because they can be pretty easily clustered.



> > > > At a lower level, you can use PVM and MPI to create distributed
> > > > calls that can anonymously be routed to other processors while
> > > > still supporting the context of the calling process/processor.
> > >
> > > Irrelevant to websites.
>
> > Actually, many e-business and e-commerce sites that do a great deal
> > of sophisticated customer relations management do rely on
> > data-warehousing techniques and very large engines such as SP/2 and
> > E-10k machines. Both make very effective use of these technologies.
>
> Name some.

AT&T Wireless, Prudential, Dun and Bradstreet...


> > > Are you saying that an Oracle licence for
> > > LINUX for use on a large-scale website
> > > is $120?  I somehow doubt that figure.
> > Progres comes with unlimited users, an SQL database, 2-phase commit,
> > and ODBC compatibility.  When you need to switch to Sybase or DB2,
> > you can get site licenses at reasonable prices.
>
> So when you need to switch to an
> industrial-strength DB, you end up paying
> for database licences.

> How many large-scale sites can you point to that are
> using Progres as the backend?

DejaNews.

> And anyway it doesn't invalidate my original
> point (which you seem to have snipped) that
> the FUD about needing CALs for
> an NT webserver is BS.

The definition of a Client Access License is something
Microsoft has played many interesting games with.
For example, the definition that was used prior to BackOffice
and Enterprise Edition specified no duration.  Typically, the
CALs were estimated based on concurrent connections.

Later, Microsoft decided that BackOffice connections, even
though they were web-based and involved trivial connections,
also involved uniquely identifiable users, and that each user
averaged 30 minutes per "session".  For some companies, Microsoft
redefined "Client Access" so that users of BackOffice
were counted based on the busiest 30-minute period, in terms of
the number of connections from different uniquely identifiable users
(based on cookies or VeriSign signatures).
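That "busiest 30-minute period" metric is easy to sketch in code.  The
following is a hypothetical illustration only - the function name and the
log format are invented for the example, not anything Microsoft published.
Given a log of (timestamp, user-id) pairs, it finds the largest number of
distinct users seen in any 30-minute window:

```python
from datetime import datetime, timedelta

def peak_cals(events, window=timedelta(minutes=30)):
    """Estimate CALs under a 'busiest 30-minute period' rule:
    the largest number of distinct users seen in any window of
    the given length.  `events` is a list of (timestamp, user_id)
    pairs, e.g. extracted from a web server log."""
    events = sorted(events)
    peak = 0
    for i, (start, _) in enumerate(events):
        # distinct users whose connections fall in [start, start + window)
        users = {uid for ts, uid in events[i:] if ts < start + window}
        peak = max(peak, len(users))
    return peak

log = [
    (datetime(2000, 3, 21, 9, 0), "alice"),
    (datetime(2000, 3, 21, 9, 10), "bob"),
    (datetime(2000, 3, 21, 9, 20), "alice"),   # repeat user, counted once
    (datetime(2000, 3, 21, 10, 30), "carol"),  # outside the busy window
]
print(peak_cals(log))  # 2: alice and bob share a 30-minute window
```

Note how different this is from a simple concurrent-connection count: a
repeat visitor still counts once, but every distinct cookie in the busy
window counts, whether or not the connections overlapped.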

Customers balked; many began switching to Linux or UNIX for these
functions.  Others began disabling signature-sensitive software
such as ActiveX controls.  Microsoft eventually backpedaled on
some of its licensing policy.  In areas where there is sufficient
competition, Microsoft has to tread lightly.  In areas where Linux
and UNIX compete effectively on a royalty-free basis, Microsoft
also has to be flexible on price.  In areas where Microsoft thinks
its monopoly will hold, they hit you hard.

> > > NT boxes can also do their own DNS, so the same rather crude DNS
> > > load balancing is possible.
> > > However CISCO Local Director and similar solutions
> > > are much more popular at web sites because they
> > > give finer granularity.
> > And Cisco runs BSD UNIX as its core operating system.
>
> So what, a router is a blackbox.
> The main reason they run things like a BSD
> kernel is historical; CISCO has been developing
> routing code on top of BSD
> kernels since the mid-80s, why would they change?

They've changed quite a bit since 1980.  Even the types of
UNIX being used, the user interfaces, and the management
functionality have changed radically.  Even the scheduling
has become more sophisticated, supporting multilevel cache,
SCSI multi-spindle scheduling, and RAM.

> However, a BSD kernel is
> not a requirement for Local Director like functionality,

> WLBS is a "shim"
> driver in the network stack on NT and does just the same.

Perhaps, if Win2K is sufficiently reliable, we may see the
use of W2K-based Local Director-like functionality.  I don't see
Cisco replacing millions of routers with NT or Win2K any time
soon.

> > > > Microsoft has tried to lock Linux out of USB, DVD, PCI-PnP, and
> > > > several of its other key technologies.  Eventually, the
> > > > proprietary content was either disclosed or hacked.
> >
> > Microsoft required each member of each organization to sign
> > comprehensive nondisclosure agreements before participating in their
> > standards bodies.
>
> Care to provide a reference for that assertion,
> or should we just take your word for it.

Review the minutes of the DVD-CSS, PCI, W3C, USB, and IETF committees.
Each of these standards bodies has wrestled with the question of
whether or not to accept Microsoft's demands.  In many cases, such
as the IETF, the demand is rejected, and Microsoft merely refuses
to play by the standards bodies' rules.

You may have to go through old printed copy in libraries, because
many of the disputes are brushed under the umbrella of nondisclosure
once the demand is accepted.  The print versions, however, do cover
the requests, some of the controversy, and the general nature of
the agreements.

Unfortunately, to get more details, you generally need to have
court orders.  If the DOJ needs to, they can go through the
entire process of yet another case, subpoenas, and disclosures
again.

Remember, until the court was on the verge of issuing a blanket
ruling, and possibly adding obstruction of justice to the charges,
government investigators could not question Microsoft allies, partners,
or others covered by nondisclosure agreements without having a
Microsoft lawyer present.  Even after issuing an announcement that
people should "cooperate", Microsoft tried to "coach" companies like
Netscape, IBM, Dell, and others in what they could say without risk
of reprisals.

> > Normally, if you are trying to establish industry standards, you
> > want the standards open and published.
> > Microsoft eventually saw that it was losing the market to CORBA
> > and offered to publish a brain-dead specification of DCOM that
> > allowed CORBA implementors to map DCOM IDL to CORBA IDL.  You
> > still can't implement DCOM clients with that specification.
>
> And what about all the shenanigans Sun has got up to with the Java
> standardisation process.

Sun's goofiness with Java has resulted in a countermovement in the
open-source community.  Linux was published with Guavac and Kaffe,
which provided open-source implementations that were on the verge
of extending "Java compatibility" - but away from Microsoft.
Eventually Sun began aggressively including Linux support for Java
to prevent total loss of control to Guavac and Kaffe.

> > > What evidence do you have that Microsoft
> > > was responsible for any of these?  If
> > > I recall correctly, the DVD issue was
> > > caused by the AAMP, something that MS
> > > is not even a member of.
> > Microsoft was a key member of DVD-CSS, and was the sole
> > provider of software to decode DVD-CSS. Interesting that
> > you can download the decoder software from Microsoft,
> > but you can't even show the link to Norway.
>
> Because Microsoft has paid its dues to AAMPA
> and can thus distribute software to decode DVDs;
> also the Microsoft code will not bypass the dumbass
> region encoding that the DVD standard got saddled with.

However, other software publishers were excluded.  Most of the
remaining members were peripheral makers and controller card makers.

Here is the current membership:
http://www.dvddemystified.com/dvdfaq.html#6.1

Note the software publishers.
Interesting that SuSE 6.3 is distributed on DVD (unencrypted?).

The DVD-CSS site has been "cleaned up" - and no longer lists
its membership in the public area.  I had viewed it previously,
and the only official member that wasn't a hardware maker was
Microsoft.  Furthermore, the only "endorsed" product on the old
site was the Microsoft decoder.

> The movie industry
> is the one driving the limitations of
> the DVD standard, not Microsoft, and
> the movie industry is doing it
> to protect copyright and movie release
> cycles, not Microsoft.

True, but Microsoft was happy to help set up the structure
to protect its monopoly, and was pressing very hard for
suppression of the DeCSS software.

> All somebody had to do was pay the licence fee and
> they could produce a DVD player for LINUX;
> the problem arose because some
> members of the LINUX community believe
> that everything should be free.

Actually, anyone who implements the software also
has to pay $4/player to the MPEG-2 group and 4 cents/copy
for each disk.

The fact is that SuSE had paid the money and currently
releases its software on DVD.  The DVD-CSS group protects
the encryption scheme and does not disclose even the
registration fee or the royalty fee - until after
you've sent in your form and signed the nondisclosure
agreement.  There is no mention of the DVD-CSS royalty
fees at their web site, only the application process.

From my reading of the DVD documents above, the manufacturer
of the DVD drive has already paid the $5000 fee, and all
DeCSS did was provide the decoder required to convert the
bits pulled from the manufacturer's drive into the unencrypted
MPEG-2 datastream required for display on the screen.
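Taking the per-unit figures above at face value ($4/player to the MPEG-2
group, 4 cents per disc, and the $5000 flat fee), the arithmetic is
straightforward.  The sketch below works in integer cents to keep the sums
exact; the function name and the shipment volumes are made-up examples,
purely for illustration:

```python
# Per-unit fees as quoted above, in integer cents to avoid float rounding.
MPEG2_PER_PLAYER_CENTS = 400  # $4.00 per software player shipped
PER_DISC_CENTS = 4            # $0.04 per disc pressed
FLAT_FEE_CENTS = 500_000      # $5000 one-time fee (paid by the drive maker)

def royalty_bill_cents(players_shipped, discs_pressed, pay_flat_fee=False):
    """Total royalty bill in cents under the fee structure quoted above."""
    total = (players_shipped * MPEG2_PER_PLAYER_CENTS
             + discs_pressed * PER_DISC_CENTS)
    if pay_flat_fee:
        total += FLAT_FEE_CENTS
    return total

# A hypothetical distributor shipping 100,000 software players and
# pressing 250,000 discs (the flat fee already paid by the drive maker):
print(royalty_bill_cents(100_000, 250_000) / 100)  # 410000.0 dollars
```

At those volumes the per-player royalty dominates: $400,000 of the
$410,000 total comes from the $4/player fee, not the per-disc charge.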

It turns out that the decoding is pretty trivial - a combination
of RC5, MD4, and DES encryption.

> > > PCI-PnP requires a licence, and that is controlled
> > > by the PCI folks, who are primarily run by the chipset
> > > and hardware vendors; same goes for USB.
> > Both of which were implemented and pushed by Microsoft, often
> > at the cost of open and published standards like FireWire.
>
> Intel were the big force behind PCI and USB, not Microsoft.

That's very interesting, since PCI was first used on the Mac and
the MicroVAX - neither of which were Intel-based.  USB was
also first introduced by Motorola.  It was Microsoft that insisted
on a comprehensive umbrella of nondisclosure agreements before
implementing the "extended" "standards" on Windows.  Only Microsoft
could disclose the API.

> > > What you mean is that the LINUX folks didn't want to pay
> > > to join these clubs
> > Actually, Bob Young stated at the Raleigh Linux Expo that he
> > would be happy to join the clubs and publish binaries where
> > necessary. There are several drivers that are published in
> > binary formats that are not GPL. When possible, these are
> > included as RPMs on the main distribution, when royalties
> > must be paid, these binaries are placed on the secondary
> > service.
>
> So what you are saying is that when
> Redhat decided to play by the same rules
> as the rest of the industry and licence
> the PCI-PnP specs they had full
> access just like everybody else.

No they didn't.  They had the top-level specs, and could get
chip information, but much of the interpretation of the chip
information was still protected by exclusive agreements with
Microsoft.  This is why Red Hat still has problems with some
Video Cards and many sound cards.

> > Red Hat DID pay and did receive key PCI-PnP information from
> > Adaptec, who was a member of the PCI standard.  Microsoft had
> > promised to give SCSI a much bigger role in future computers
> > and reneged when it was discovered that Linux ran faster on
> > SCSI than NT did.
>
> In what way can Microsoft give SCSI a bigger role?
> They ship drivers for Adaptec products on their CDs,
> it's up to the buying public and the hardware
> vendors to decide what disk I/F they buy.

Half true.  The vendors decide what the consumers will buy,
and Microsoft decides what the vendors will sell.  SCSI
makes Linux run faster with multiple drives while Windows NT 4.0
runs slower.  Win2K has a multithreaded disk driver that will
be capable of managing multiple outstanding drive requests.

Notice that most of the Microsoft-touted benchmarks use
single RAID-in-hardware arrays, which means that both systems
appear to have a single drive with very big partitions.  Microsoft
even had to tweak its journalling file system and swap areas while
not tweaking Linux.

> This is just more conspiracy theory bullshit.
>
> > > and in your deluded mind that equates
> > > to some vast Illuminati conspiracy from
> > > the evil Microsoft empire.
> > The last time someone made a comment like that was about 6 months
> > before the DOJ case. His name was Roger. Judge Jackson's findings
> > of fact confirmed all but two of my accusations.
>
> I don't recall Judge Jackson even discussing issues like
> USB, DVD, PCI-PnP etc.

You didn't read my previous postings.

I pointed out that Microsoft had held up shipments to at least
one PC manufacturer to the point of threatening their ability
to stay in the PC market (IBM).  I was called a nut - until
IBM testified in court.

I pointed out that Microsoft had used embrace and extend tactics
to prevent the proliferation of open standards and open source
APIs such as PERL, Java, and HTTP/HTML to prevent users from porting
UNIX software to Windows, and to prevent users from switching from
Windows to UNIX.

Even more important, I stated that Microsoft was using these
tactics to gain control of the Internet, to extend the monopoly
to include control of the Internet, the Media industry, and
even the web infrastructure.

All of these things were first exposed in 1997.  The final findings
weren't formally published until 1999.

> He has more sense than that.

But you don't have the sense to research my posting history
to see what I was talking about.  Instead you chose to take
a quote out of context - and assume that I was talking about
these things.

There are well documented cases of Microsoft trying to manipulate
the IETF into accepting DCOM as an IETF standard without disclosing
the protocol in sufficient detail to create a reference implementation
to prove that the specification is complete and correct (all formal
IETF standards must be subjected to this regimen).

Microsoft has tried to use this loophole to promote ActiveX as
an "Internet Standard" in spite of the fact that the proprietary
nature of the content makes it possible for hackers to deliver
executable code onto a computer while giving the system and network
administrators no ability to detect or monitor it.

Microsoft has also been trying to have XML "extended" with the
ability for Microsoft to embed ActiveX and other Microsoft binaries
directly into the XML messages and have them executed.  IBM and others
have countered with open standards like CGM which are platform
independent, auditable, and verifiable.

Microsoft eventually began literally talking out of both sides of
its mouth.  In the pages discussing the Sun suit, Microsoft openly
argued for the right to protect its monopoly, stating that it
wanted complete control of the PC environment.  In the pages
discussing the Caldera suit, it openly admitted to practices that
kept DRI out of the market, and Microsoft even admitted that Service
Pack 2 contained code that was designed to crash on Cyrix systems
(Cyrix being owned by IBM at the time).  And Microsoft argued that
these malicious acts, including the willful sabotage of hundreds of
thousands of computers, were its "right" - to protect Microsoft's
investors from competition and to provide "what the customer needed"
(which was great comfort to anyone who had purchased a Cyrix-powered
computer).

Microsoft is not above rendering a computer completely useless to
protect a market it wants to control.  The most dramatic example
was when Microsoft published MS-DOS 6.0 with DoubleSpace, which
rendered computer systems equipped with Stacker completely useless.
Stacker users couldn't even read Stacker-equipped disks for the
purpose of backing them up.

If Mitnick had caused 20% of the world's PCs to become completely
useless, he would have spent years in jail.  When Bill Gates did it,
he was merely ordered to put out a "fix release".

Service Pack 4 also had malicious code which was fixed in SP 5.

Third-party vendors are traumatised with every service pack and
fixpack Microsoft puts out.  In many cases, companies like Netscape
have to put out new software to accommodate DLLs replaced by Service
Packs, B releases, and application fix-packs.

> --
> Nik Simpson
>
>
--
Rex Ballard - Open Source Advocate, Internet
I/T Architect, MIS Director
http://www.open4success.com
Linux - 60 million satisfied users worldwide
and growing at over 1%/week!


Sent via Deja.com http://www.deja.com/
Before you buy.

------------------------------

From: "Erik Funkenbusch" <[EMAIL PROTECTED]>
Subject: Re: Producing Quality Code
Date: Tue, 21 Mar 2000 03:58:40 -0600

<[EMAIL PROTECTED]> wrote in message
news:DSCB4.4323$[EMAIL PROTECTED]...
> > Nope, marketing is only part of the process.  Yes, it can be
> > critical to this business model in the early stages, but it
> > is no more the determining factor in closed source software
> > development than is the match that starts a forest fire --
> > critical to getting things started, but not nearly so
> > important once the whole process is fully engaged.  Regardless
> > of marketing, most Windows users couldn't switch OSes now if
> > they wanted to, and neither can many people change from
> > proprietary software tools and data formats.
> >
>
> What's stopping them, apart from simple inertia?  Granted, it
> would be a hassle to convert document formats and retrain
> for new applications, but corporations did it once, and will do
> it again; why not now?  This is an empty point, but everyone
> seems to have bought into it.  Dropping Windows is not an
> impossibility; it simply takes imagination and foresight,
> along with a willingness to be flexible.

What's stopping them?  The fact that non-Windows software doesn't do 1/10th
of what the Windows software does, if it exists at all.  Show me
non-Windows contact management software that can hold a candle to GoldMine
or Act.  StarOffice simply cannot do what many business users need it to do
(the old argument that people only need 10% of what Office does fails to
understand that just about everyone needs a different 10%, which all adds
up to the 100%).

Data formats are the easy part.

> > Engineers with inadequate understandings of economic realities
> > make poor agents provocateur in the software marketplace.
> > Those who understand and respect the realities stand the best
> > chance of making a difference.
>
> Crap.  And crap again.  Business majors make crappy software
> engineers, and vice versa.  Arguments like yours are part of the
> reason we're in the mess we're in vis a vis software.  If
> "economic realities" mandate shitty code, then we are all in a
> world of hurt.

Do you think computers would have become anywhere near as ubiquitous as
they are today if they were not marketed at specific needs?

We wouldn't be "in this mess" without marketing, simply because the only
uses of computers would still be military-oriented.

You fail to realize that you owe your job to the fact that the mess exists
in the first place.

> I made no judgement about you personally (although I must
> have hit a nerve if you took it that way).  I will say that your
> "realities of economy" fooferaw is the same kind of fatalist
> B.S. that has gotten software into this sorry state to begin
> with.  As a group, computer people seem to be hobbled by
> greed -- we express noble intentions, but lose our backbone
> when presented with fat stock options or the threat of loss of
> pay.

When a client says "I need this, and I need it in this time-frame" you
either say yes, we can do that no matter how unreasonable it is, or the
client goes to your competitor that WILL say yes to them.  Starvation is not
the way to effect change, since all you do is die off while the people that
cut corners feast.

> How many software companies would survive if no programmers
> would work for them?  That was my whole point; programmers are
> the most important part of the process, and yet they have no
> sense of their own power.  We still are mired in the belief that
> we can lose our jobs and be destitute at any moment (or lose
> millions in vested stock options).  Your argument seems to boil
> down to nothing more than "status quo" -- leave things alone and
> maybe it'll get better.  Well, there's no evidence of that; in fact the
> situation is worsening at an alarming rate.

All of them would survive.  Why?  Because they would simply tell their
accountants, janitors, clerks, whatever to start writing software.  This is
how software began.  The concept of a "computer science" degree didn't even
come into being until about 20 years ago.  Software developers were all
educated in other fields.

The thing you have to realize is that they do not NEED you.  They can get by
without you, and probably will.  Sticking to your guns will not force
anyone's hand.

> You persist in seeing things in terms of how they are -- I speak of
> how things *can be*, given willingness of people to effect change.
> If all this dialogue garners is ill-will, wouldn't you say that there
> is a larger problem among the ranks of programmers?  After all,
> all I'm advocating is rigor and quality in a field that desperately
> needs it.

Your lofty theories fail to take into account human nature.  All endeavors
that fail to take this into account fail.  Relying on the good will of your
neighbor gets you stabbed in the back.  That's the reality, and it will not
change.

> What's so frightening about that?

Your rose-colored glasses.





------------------------------

From: "Erik Funkenbusch" <[EMAIL PROTECTED]>
Subject: Re: Producing Quality Code
Date: Tue, 21 Mar 2000 04:07:54 -0600

<[EMAIL PROTECTED]> wrote in message
news:eiCB4.1843$[EMAIL PROTECTED]...
> In article <vtBB4.2222$[EMAIL PROTECTED]>,
> > Engineers that protest become unemployed.  Companies are already
> > moving to hiring off-shore labor (H1-B visas are 4x the amount they
> > were just a few years ago) and cheap entry-level workforces.  The
> > only way to effect change is to keep yourself employed.  You can't
> > change things if you get fired for trying.
>
> *snort*
> I can't say for sure on software, but every company I consult with
> is begging for engineers; the demand is huge.  Provided you avoid
> wandering down the halls with a shotgun, getting fired is not a
> problem.  The reason the companies are hiring as many H1-B's as
> they can is because of that demand.

This is patently false.  It's a myth fostered by the IT industry to get more
H1-B visas.  There are literally hundreds of thousands of unemployed
software engineers out there working as janitors, cooks, paper deliverers -
any job they can get - because companies refuse to hire them.

Why do they refuse to hire them?  Cost.  Someone with 25 years of experience
won't work for entry-level pay.  There is serious age discrimination in this
industry.  Rather than retrain older workers, they would much rather hire
cheap young blood that will work 60 hours a week for no overtime and churn
out tons of code.  It doesn't matter if that code is poor or not because
they know the code will be obsolete in 6 months or a year.

H1-B's are even further in demand because they're slave labor.  H1-B's are
prevented from switching jobs.  If they quit, they get deported, regardless
of whether they get a different job or not.  That means they can pay them
nothing and force them to stay, work them long hours and burn them out.

That's the real reason they hire H1-B's.  Not because of a lack of
engineers, but because of a lack of cheap, entry-level engineers that will
work long hours without overtime and cannot quit their jobs, no matter how
poorly they're treated, without being deported.

As an example: Microsoft has over 6,000 open positions waiting to be
filled, yet they hire less than 1% of those that apply.  If they really
were that desperate, they would hire anyone that walked in the door.





------------------------------


** FOR YOUR REFERENCE **

The service address, to which questions about the list itself and requests
to be added to or deleted from it should be directed, is:

    Internet: [EMAIL PROTECTED]

You can send mail to the entire list (and comp.os.linux.advocacy) via:

    Internet: [EMAIL PROTECTED]

Linux may be obtained via one of these FTP sites:
    ftp.funet.fi                                pub/Linux
    tsx-11.mit.edu                              pub/linux
    sunsite.unc.edu                             pub/Linux

End of Linux-Advocacy Digest
******************************
