On Jul 20, 2005, at 11:24 AM, Hawaii Linux Institute wrote:

Tim Newsham wrote:


This could be a win for linux, but could turn out to be a loss for
other platforms.  Many vendors are inclined to provide binary-only
drivers.  This would definitely aid the vast linux community but
might make it harder for other operating system communities to
convince vendors to provide the information needed to write drivers
for their platform.  Let's hope the pressure encourages the release
of technical information that enables open source driver development.

Tim Newsham
http://www.lava.net/~newsham/


Apple took BSD and turned it into arguably the most user-friendly desktop OS (at the very least, better than Microsoft Windows).

I happen to agree (and I type this on an Apple notebook). Apple's not perfect, of course, and the whole "GUI" thing is just rocks in a can. (Lots more noise than effect.)

Sun's Solaris desktop, while it has received very little attention locally, runs more smoothly than any Linux-based desktop I have seen for business use.

There is no good reason for this, of course. You're either talking about CDE or Gnome (I can't tell which, and Sun supports both), but in either case, it's all code that Linux *could* use, but doesn't.

Both of these examples involve the ability of specific vendors to provide optimized drivers and fuse them to the kernel.

This was a choice they made, not a pre-requisite.

Intel's move should remove one of the most serious handicaps of Linux.

Or it could serve to "legitimize" binary LKMs (loadable kernel modules), and destroy what makes linux *free* in the process.
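To make that concrete: inside the kernel the line between free and binary-only modules is drawn by the module's license tag. Here is a rough sketch of a trivial LKM (purely illustrative, nothing to do with anything Intel ships): declare anything other than "GPL" (or a compatible string) here and the kernel marks itself tainted and refuses to hand the module the symbols exported with EXPORT_SYMBOL_GPL.

/* hello_lkm.c -- a minimal, hypothetical example module */
#include <linux/init.h>
#include <linux/kernel.h>
#include <linux/module.h>

/* The license string is the whole argument in miniature: a non-GPL
   string taints the kernel and locks the module out of the
   EXPORT_SYMBOL_GPL-only interfaces. */
MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Trivial example LKM");

static int __init hello_init(void)
{
        printk(KERN_INFO "hello_lkm: loaded\n");
        return 0;
}

static void __exit hello_exit(void)
{
        printk(KERN_INFO "hello_lkm: unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);

Build it against the running kernel's headers with a one-line kbuild Makefile (obj-m += hello_lkm.o), insmod it, watch dmesg, rmmod to unload.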

(Of course Intel will not be able to include its proprietary drivers in the kernel,

why not?

but you can bet that most distros will be happy to include Intel's drivers.) A new issue then arises: will this cause Linux to be turned into an Intel-dominated or even Intel-monopolized platform? I share your concern.

What makes you think it's not Intel-dominated now? Show of hands, please: how many in the audience here run linux on anything other than an x86 processor?

The August issue of Linux Format has a special section on the recent revolutionary changes made in X.

Oh phleze.... X must die.
http://pepper.idge.net/disaster.html

Seriously, if linux had managed to carry gnome onto raw hardware, rather than surfing the packets through an "X server", then they might have had something. Better still, if *nix had aligned around something like NeWS, then Windows would seem completely creaky in the GUI department.

But no, we got a designed-by-committee crapfest, and called it "X".
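For anyone who hasn't poked at what "surfing the packets through an X server" actually means, here is a rough sketch of an ordinary Xlib client (plain Xlib, assumed standard; none of this comes from the magazine article). Every window created and every line drawn is a protocol request sent down a connection to whatever server $DISPLAY names -- a local socket or a machine across the network, the client can't tell the difference.

/* x_sketch.c -- compile with something like: cc x_sketch.c -lX11 */
#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
        /* NULL means "use $DISPLAY"; this opens the client/server connection. */
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) {
                fprintf(stderr, "cannot open display\n");
                return EXIT_FAILURE;
        }

        int screen = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                         10, 10, 200, 100, 1,
                                         BlackPixel(dpy, screen),
                                         WhitePixel(dpy, screen));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        /* Even this one line is marshalled into the protocol stream and
           handled by the server, not written directly to the hardware. */
        XEvent ev;
        for (;;) {
                XNextEvent(dpy, &ev);
                if (ev.type == Expose)
                        XDrawLine(dpy, win, DefaultGC(dpy, screen),
                                  10, 50, 190, 50);
                if (ev.type == KeyPress)
                        break;
        }

        XCloseDisplay(dpy);
        return EXIT_SUCCESS;
}

Whether that round trip matters in practice is exactly the argument here.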

We could have re-created the Lisp Machine environment by now, but instead we have desktops that look like the product of Soviet airport architects who've been on a drinking binge with a bunch of flower children. All that flash and glitter. Ooooh pretty! Paisley!

What does it do?

For most people, computers have become expensive toys which are constantly tweaked like some over-grown Tamagotchi. Let's download the latest patches! Let's rebuild the kernel! Twice! Watch me run benchmarks! OOOOhh, look at how fast my machine is! I've got 57 fans in my case!

Any computer architecture that needs "anti-virus" software has failed.

And all of this in the service of writing documents (typically in some proprietary binary format, i.e. Word), reading email, and surfing the web.

I have not had time to digest it, but it is highly recommended reading. From my own experience, I don't think the hardware makers in Taiwan will be willing to share their specs (and they really shouldn't).

Most of Taiwan (absent VIA) doesn't make the chipsets; they just "use" them to build interesting boards. Taiwan, Inc. doesn't get to make the decision about releasing specs; that's up to the chipset vendor (Nvidia, VIA, Intel, etc.). The Taiwanese vendors signed a contract that includes terms about not disclosing the "trade secrets" and "Intellectual Property" of the chipset vendor.

But Intel's move is going to force them to face the Linux issue. Finally!

Wayne

This is what Intel wants you to believe. They've been talking the talk for over a year and a half.
http://news.zdnet.com/2100-3513_22-5161041.html
The proof is still 'out there'.

jim
