James Miller wrote:

>OSNews: The Fast-Food Syndrome: The Linux Platform is Getting Fat
>Jun 10, 2004,
>
>"Consider these memory requirements for Fedora Core 2, as specified by Red
>Hat: Minimum for graphical: 192MB and Recommended for graphical: 256MB
>Does that sound any alarm bells with you? 192MB minimum? I've been running
>Linux for five years (and am a huge supporter), and have plenty of
>experience with Windows, Mac OS X and others. And those numbers are
>shocking--severely so. No other general-purpose OS in existence has such
>high requirements. Linux is getting very fat.
>
>"I appreciate that there are other distros; however, this is symptomatic
>of what's happening to Linux in general. The other mainstream desktop
>distros are equally demanding (even if not as much as Fedora, for example
>Arch Linux or Slackware run Gnome on 128 MB, but not very comfortably when
>you load 2-3 apps at the same time), desktops and apps are bloating beyond
>control, and it's starting to put Linux in a troublesome situation. Allow
>me to elaborate..."
>
>http://www.osnews.com/story.php?news_id=7324 for more of the story
>
Yeah, I noticed that with Mandrake 8 and RH 9. But I wouldn't expect it to
bother guys setting up servers on, say, SuSE 6.4. Red Hat I found
particularly slow; they have always had the most heavily advertised, but
worst, performance. Xandros needs 128 MB, although with memory prices
nowadays it's no biggie to have 256, and most of the old motherboards
will take three 128 MB SIMMs for 384 MB total. However, for most users,
Xandros includes something Red Hat does not: automatic mounting of Windows
or other drives, so that a newbie can easily transfer work from the old
Windows drive (even if the original Windows OS has crashed)... plus the
ability to run most of the popular Win 9x applications and read all the
Windows file formats.
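
For reference, what Xandros automates there is basically one /etc/fstab
entry. A sketch, assuming the old Windows partition is FAT32; the device
name /dev/hda1 and the mount point are guesses, so substitute your own:

    # Mount the old Windows FAT32 partition so a regular user can read it.
    # /dev/hda1 is hypothetical; run 'fdisk -l' to find the real device.
    /dev/hda1  /mnt/windows  vfat  defaults,uid=1000,umask=022  0  0

With that in place the old Windows files show up under /mnt/windows
whether or not the Windows install itself still boots, which is the whole
point for a newbie rescuing work from a dead system.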

And to some considerable extent, it is the bloat and eye candy on the
web that is bogging things down. But of course, that same overhead is
why people are looking at long-distance wireless high-speed data in the
first place. I wonder as well how much of the bloat comes from routines
to cope with sabotage software. And then too, it seems that Microsoft has
been sending out 'upgrades' to their customers' servers which favor
Windows platforms but use stuff like MS-CHAP to deny access to platforms
that don't run Windows. I've seen a Mac customer give up on her Mac
because she couldn't get it to stay online, and it took so many tries
just to get it to log on.
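
On the Linux side, coping with a dial-up server that insists on MS-CHAP
usually means a pppd built with MS-CHAP support plus a peers file along
these lines. A sketch only; the peer name "isp" and the account name are
made up:

    # /etc/ppp/peers/isp : dial-up peer that only accepts MS-CHAPv2
    user "jane@isp.example"   # made-up account; secret goes in chap-secrets
    refuse-pap                # don't fall back to plaintext PAP
    refuse-chap               # refuse plain MD5 CHAP, so MS-CHAPv2 gets
                              # negotiated (needs pppd with MS-CHAP built in)
    noauth                    # don't require the ISP to authenticate to us

The matching credentials line in /etc/ppp/chap-secrets would look like
"jane@isp.example" * "password" *. A Windows box does all of this behind
the dialer, which is exactly why the non-Windows platforms get stuck.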

I ran Corel/Debian for a few years and noticed the same problem at that
time. Now, I realize that the distro programmers figure out what
Microsoft is up to, and benefit when customers like me go looking for a
newer distro... like anything with the 2.4 kernel. I also saw that the
Netscape 4.7 which came with Corel was having problems, and web pages
were continually telling me I had to download some Flash plugin or
whatever, which wouldn't work with a browser that old. But if you try to
get the newer browser, you find out you need the newer kernel to run
it.... "There's a hole in the bucket, dear Liza, dear Liza...."

And yeah, that's alarm bells. Exponentially increasing complexity
continues until some unforeseen factor (Murphy's Law) destroys the
system. The Mandarins were always China's smartest people, but they
never knew there was a problem until the bricks started flying over the
palace wall. Wireless technologies are inherently diffuse, and could
continue to function during a breakdown of the central system. If,
however, they can offer their customers backup *longer range* wireless
communications during a natural disaster or whatever, they will have an
attractive edge over the other ISP sources, which are all much more
centralized.
