Re: On Bugs and Linux Quality

2008-06-25 Thread Daniel Mons

Karl Goetz wrote:
| Surely you're not saying Debian stable has "faster adoption" than Red Hat
| EL? Or were you referring to release turnaround time?

I know very few people who use Debian Stable at a desktop level.  Most
(including myself) use Debian Testing or Unstable, as it moves much
faster and provides much quicker downstream updates.

About the only place I've seen Debian Stable used in abundance is on
firewalls, servers, and "install and forget" systems like high
performance clusters that are not connected to the internet, where a
solid, reliable base is needed to build on, and where no "unknown factor"
updates or patches should be installed at a later date, lest they break
things or force service restarts.

Debian Stable is certainly more akin to RHEL.  But it should come as no
surprise that RedHat have announced publicly on many occasions that the
desktop doesn't interest them in the slightest, and that they are still
focussed on the server.  Compare and contrast that with SuSE and Ubuntu,
who have a much higher share of the Linux desktop market, and who both
move much more quickly with their official releases.

-Dan



Re: On Bugs and Linux Quality

2008-06-25 Thread Karl Goetz
On Sun, 2008-06-22 at 21:19 +1000, Dave Hall wrote:
> On Sun, 2008-06-22 at 20:35 +1000, Null Ack wrote:
> > Daniel, with respect, I did not mean to suggest that the solution to
> > improving the quality of GNU/Linux is centralised control.

trim
 
> > I don't see proper release management stifling any freedoms in FOSS
> > projects. It just means having a proper quality standard before bits
> > are declared stable and ready for production. I greatly enjoy Ubuntu
> > over all other distros I've tried (Arch, OpenSuse, Fedora), but I am
> > certainly not the only person I've seen sharing the view that
> > arbitrary time-based releases aren't conducive to good software.
> 
> I have been watching this thread, and many like it over the years.  Yes
> it would be nice if the quality of GNU/Linux distros improved, but I
> don't demand that.
> 
> Let's take a look at the situation.  You are getting a complete operating
> system for free (as in liberty and beer).  It comes with a warranty -
> see clause 15 of the GPL [1].  Vendors (including Canonical/Ubuntu)
> honour the warranty offered by upstream.
> 

For those playing at home, that refers to GPLv3. It's clause 11 in GPLv2
(which Linux and a huge chunk of the free software world are still using).

> This is the free software movement at work.  No one makes you use the
> code we produce (yes I am a FOSS developer).  No one can make us fix our
> bugs.  This is the risk you take when you use our code.  I don't lose
> any sleep if someone does or doesn't use my code.  If someone demands
> that I fix a bug or else , I mentally put it to the
> bottom of the pile.

I suspect that unless you have a good SLA with your proprietary software
vendor, you get the same treatment :)

> 
> For the flip side, let's look at a proprietary development model.  I have
> picked the easiest one - Windows.  Windows 98 didn't support USB mass
> storage, and support for it was never included.  Last I checked, you
> couldn't install onto a SATA drive without a _boot floppy_, and that looks

I assume you were referring to Win9x/NT5.x?

> unlikely to ever be fixed.  It took until SP2 for XP to come anywhere
> close to getting half decent security.  Many vendors took months to get
> their drivers right for Vista.  The list of fundamental flaws with
> various versions of Windows is extensive.  This is a product shipped by
> the biggest software company on the planet.

> 
> I must say that Dapper was the highlight for me in terms of stable
> desktop releases.  I have found that recently the rush to include the

I'm willing to bet the extra 2 months had a good deal to do with it.
(Personally, I found Dapper to be the last of the rock-solid Ubuntu
releases).

> latest and greatest while still hitting a target date isn't the best
> approach.  I hope that the next LTS release is more an attempt to polish

It does allow you to go LOOKATUSBLING!, which seems to be "where it's at"
for promoting desktops at the moment.
> Cheers
> 
> Dave
> 
> [1] http://www.gnu.org/licenses/gpl.html
> [2] http://preview.tinyurl.com/dnqgs
> 
> 

kk

> 
-- 
Karl Goetz <[EMAIL PROTECTED]>




Re: On Bugs and Linux Quality

2008-06-25 Thread Karl Goetz
On Sun, 2008-06-22 at 20:35 +1000, Null Ack wrote:
> Daniel, with respect, I did not mean to suggest that the solution to
> improving the quality of GNU/Linux is centralised control.
> 
> However, people are in control of aspects of Linux - such as release

The way you start out with "control aspects of Linux" makes it hard to
take this seriously, because we have already covered that it's not one
big group/org.
Some free software projects release with bugs - that's life. So does
proprietary software.

>  decisions about key subsystems, or release decisions as they relate
> to distros. These decision makers have the power to conform, or not to
> conform as some unfortunately choose, to decades-old principles about
> what constitutes an alpha, beta or production release.

[trim the rest]
kk

-- 
Karl Goetz <[EMAIL PROTECTED]>




Re: On Bugs and Linux Quality

2008-06-25 Thread Karl Goetz
On Sun, 2008-06-22 at 19:53 +1000, Daniel Mons wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
> 
> Blindraven wrote:
> | *shrug*
> |
> | I agree fully with the op.

trim

> It should be mentioned that a common criticism of RedHat Enterprise
> Linux is that it moves quite slowly.  They are slow to adopt new
> features, and are extremely conservative when it comes to making
> changes.  The upside of course is that upgrades are predictable and that
> sudden changes in the system that will break things are unlikely.  I
> prefer Ubuntu (and Debian) simply for their faster adoption of new
> features, and their brilliant "APT" package manager.  But again, the
> downside of a fast-moving system is the occasional software hiccup.

Surely you're not saying Debian stable has "faster adoption" than Red Hat
EL? Or were you referring to release turnaround time?
kk

> 
-- 
Karl Goetz <[EMAIL PROTECTED]>




Re: On Bugs and Linux Quality

2008-06-25 Thread Karl Goetz
On Sun, 2008-06-22 at 18:22 +1000, Steve Lindsay wrote:
> On Sun, Jun 22, 2008 at 2:27 PM, Null Ack <[EMAIL PROTECTED]> wrote:
> >
> > Linux needs to have less scatterbrain behaviour where half done things are
> > left and the chaos moves forward to the next semi complete feature. It needs
> > to consolidate and have a unified effort to really work on stability and bug
> > fixing.
> >
> 
> The word "needs" needs to be banned.

Stallman has suggested "GNU/Linux" as an alternative ;)
kk

> 
> Steve
> 
-- 
Karl Goetz <[EMAIL PROTECTED]>




Re: On Bugs and Linux Quality

2008-06-25 Thread Karl Goetz
On Mon, 2008-06-23 at 14:03 +1000, Null Ack wrote:
> Slawek, having been through the tender process for numerous Government
> contracts (both inside Government and outside with vendors), the key
> pros/cons for Linux I see are:
> 
> 1. Pro - reduced TCO
> 2. Pro - easy sell for servers
> 3. Con - hard sell for desktops. I did not see anything particularly
> solid preventing this - it's more a lack of understanding. I'm sure
> some areas really could not do without Office, but most that make this

I find it less confusing to call "Office" "Microsoft Office". It helps
distinguish it from [star,open,gnome,kde]office :)
kk

>  claim are, in my experience, wrong about OpenOffice's capabilities.
> Some sites have custom .NET apps running, so it would be critical that
> Mono or some equivalent really worked. Actually I don't really
> understand all the whining about Mono, as I understand it is now an
> open standard and not an MS standard? There's probably going to be the
> occasional legacy app written on the Win32 platform that doesn't play
> nice with Linux. What we did on one project where all the
> infrastructure was replaced was to have a few Citrix sessions running
> legacy apps - for some reason they didn't want virtualisation for
> desktop apps.

I don't remember who said it, but I find the quote "The biggest cost of
proprietary software is migrating away from it" (something like that
anyway) quite relevant. Not very helpful when "selling" GNU/Linux, but
still relevant.
kk

> 
-- 
Karl Goetz <[EMAIL PROTECTED]>




Re: On Bugs and Linux Quality

2008-06-25 Thread Daniel Mons

Slawek Drabot wrote:

| Choice is great. However, we need to ask ourselves whether we really
| need 20 different media players, for example, instead of 3 or 4 really
| good ones.
|
| The nature of open source means that there are always more bugs than
| resources to fix them, hence the tendency to spawn new projects in
| response to shortfalls in existing ones. I'm quite happy to live with
| this arrangement when using open source at home, but I see this as a
| big obstacle to commercial/corporate adoption.

I don't agree.  Open source is very organic and very Darwinian: the
strong survive, and the weak fail.  Yes, there are dozens of spreadsheet
and word processing options, but only OpenOffice is making serious
inroads into corporate adoption.

Similarly, there are hundreds of distros available.  Speaking purely
from the desktop perspective, I've only ever seen 2 in operation in the
corporate world (RedHat and SuSE), and one other in use in the
commercial/non-corporate world (Ubuntu).  The dozens of fly-by-night
distros out there don't make it, because they don't have the backing
needed by industry.


|  Let's face it, the pointy-haired bosses struggle to make a decision
| when faced with 2 or 3 options. Expand that to 20 and beyond and we
| have decision deadlock.

Speaking as someone who makes a living implementing free software in
businesses, I have not once in my entire career given a PHB-type 20
different options.  Anyone who has even the slightest clue understands
that of the 20 different options, only 1 or 2 are serious contenders for
the particular need at hand.  Sure, there are hundreds of network file
systems.  Only Samba is a real contender in a network full of Microsoft
desktops.  Sure, there are hundreds of email clients.  Only one of them
(Evolution) is a contender if there's an MS Exchange server in house.
And if you're using generic IMAP, you'd be mad to use Claws, Mutt, or
any of the other dozens of mail clients.  Only Evolution and Thunderbird
are worth your time.

Don't make the mistake of assuming that the 20 different options at
your disposal are ALL contenders.  If you start pushing hobbyist
homebrew software in front of corporate types, you dig your own grave.
Open source covers an enormous range and quality of software - some of
it professional and polished, some of it rubbish.

What irks me about this mindset is that the same thing happens in the
proprietary world.  I can name for you dozens of image editing
applications, all of which are proprietary.  When architecting a graphic
design studio, would I give all of them as options to the business
owner?  Hell no.  They'd get the top 3 in the market (Photoshop, Corel,
Xara) as options, and they could pick from there.  No need for them to
see every single low-end, niche or hobbyist bit of crap on the market.

Ditto for open source.  The above makes the assumption that "just
because it's in the package manager, it's a viable option on the
desktop".  Tell me: when was the last time you installed all 27,000
packages available to Ubuntu users via APT?  Answer: never.  You pick
and choose the ones that work.  The same goes for the proprietary world.
Software is near infinite in choice and range, yet nobody claims that
PHBs freak out when faced with hundreds of choices of proprietary
software.

Just because something exists doesn't mean you have to install, use, or
even consider it.

-Dan


Re: On Bugs and Linux Quality

2008-06-25 Thread Slawek Drabot
Daniel Morrison:
"I have had similiar experiences to what you describe but surely this 
is what free software is all about! Choice.

If you are concerned about stability and showstopper bugs, choose 
Debian, RHEL, etc.

If you are concerned about bleeding edge functionality and hardware 
support, choose Ubuntu, Fedora, etc.

And don't forget to file those bug reports :-)

cheers

danm"


Choice is great. However, we need to ask ourselves whether we really need
20 different media players, for example, instead of 3 or 4 really good ones.

The nature of open source means that there are always more bugs than
resources to fix them, hence the tendency to spawn new projects in response
to shortfalls in existing ones. I'm quite happy to live with this
arrangement when using open source at home, but I see this as a big
obstacle to commercial/corporate adoption.

Let's face it, the pointy-haired bosses struggle to make a decision when
faced with 2 or 3 options. Expand that to 20 and beyond and we have
decision deadlock.


  



Re: On Bugs and Linux Quality

2008-06-24 Thread Daniel Morrison
Null Ack wrote:
>
> I don't see proper release management stifling any freedoms in FOSS
> projects. It just means having a proper quality standard before bits
> are declared stable and ready for production. I greatly enjoy Ubuntu
> over all other distros I've tried (Arch, OpenSuse, Fedora), but I am
> certainly not the only person I've seen sharing the view that
> arbitrary time-based releases aren't conducive to good software.


I have had similar experiences to what you describe, but surely this 
is what free software is all about! Choice.

If you are concerned about stability and showstopper bugs, choose 
Debian, RHEL, etc.

If you are concerned about bleeding edge functionality and hardware 
support, choose Ubuntu, Fedora, etc.

And don't forget to file those bug reports :-)

cheers

danm




Re: On Bugs and Linux Quality

2008-06-22 Thread Daniel Mons

Slawek Drabot wrote:
| Based on the observations made, what do people here see as the biggest
| pros and cons of using Linux, and specifically Ubuntu in a commercial,
| corporate environment?

For me personally, speed and reliability are the biggest pros.  What do
I mean by "speed"?

I can roll out fairly complex networks on a Linux base in a very small
amount of time.  Take, for example, a client of mine that was a startup
visual effects studio.  They needed a centralised authentication system
providing authentication services for workstations running MacOSX,
Windows, Linux and SGI/IRIX, plus high-speed file serving, cross-OS
permissions control, VPN access, routing and firewalling, a secure
wireless network, and a host of other services (databases galore, wikis
for knowledge bases, etc.) for a network of 20 users/workstations, 4 very
large servers, and a decent-sized renderfarm with management nodes.

Under a proprietary OS, this would have taken ages.  I've seen smaller
networks with fewer features and a single OS base take a month or more to
roll out, particularly when you consider the huge amount of time wasted
on license procurement, software asset/licensing/serial management, and
auditing of licenses to ensure you are not over- or under-licensed.

I managed all of the above by myself in under 5 days.  The whole studio
was up and running on a production film within 30 days of the green
light, which included hardware purchasing and installation, the software
work above, and the rest of the administrative activities that normally
happen.

I used free software like LDAP, SASL, Samba, BIND, OpenVPN, NFS and
others to build the core services, which allowed all of the systems to
plug in without worrying about OS compatibility.  The testing required
was minimal, and most things "just worked" regardless of the attached OS.
In the long term, it allows the studio owner to scale his network well
over 100 times in size with no extra software cost on the server side.
It also meant that we could build in redundancy early on without the
need to worry about further licensing restrictions and time delays.

I've since worked on similar projects where there has always been an
insistence on using proprietary software as the base.  Typically this
means Microsoft Windows Server and Active Directory.  Invariably the
result is the same: permissions and user mapping fail when dealing with
cross-platform networks (MacOSX and Linux systems don't play as nicely
with it as they do with a standard LDAP setup), DNS systems are more
difficult to manage, systems like Exchange are limited in their
scalability, performance and extensibility, and the time to roll out
these networks is always blown out by software license management.  One
I was a part of recently took several months to complete, and had fewer
features than the 5-day affair I mention above.

And I'm not even going to start talking about the dollar cost (I hate
terms like "TCO", as they are far too hand-wavy and belong in the realm
of pointy haired bosses only).

The "reliability" part has been fantastic.  The network in question has
been functional for 1.5 years now, and we are planning various upgrades
and extensions that are going to be equally as trivial to implement
thanks to the redundant nature of the network, which in turn is thanks
to the free software driving it.

-Dan



Re: On Bugs and Linux Quality

2008-06-22 Thread Null Ack
Slawek, having been through the tender process for numerous Government
contracts (both inside Government and outside with vendors), the key
pros/cons for Linux I see are:

1. Pro - reduced TCO
2. Pro - easy sell for servers
3. Con - hard sell for desktops. I did not see anything particularly solid
preventing this - it's more a lack of understanding. I'm sure some areas
really could not do without Office, but most that make this claim are, in
my experience, wrong about OpenOffice's capabilities. Some sites have
custom .NET apps running, so it would be critical that Mono or some
equivalent really worked. Actually I don't really understand all the
whining about Mono, as I understand it is now an open standard and not an
MS standard? There's probably going to be the occasional legacy app
written on the Win32 platform that doesn't play nice with Linux. What we
did on one project where all the infrastructure was replaced was to have a
few Citrix sessions running legacy apps - for some reason they didn't want
virtualisation for desktop apps.

In my experience even getting OpenOffice into departments was difficult.
The one place where it happened was on a Java developer build, where the
users were all developers working on Java projects.


Re: On Bugs and Linux Quality

2008-06-22 Thread Slawek Drabot
I enjoyed this debate. 

Based on the observations made, what do people here see as the biggest pros and 
cons of using Linux, and specifically Ubuntu in a commercial, corporate 
environment?


  



Re: On Bugs and Linux Quality

2008-06-22 Thread David Symons
Hi all,

It may not contribute much to this thread but I'm reminded of a quote
that I first saw in this slide from Greg Kroah-Hartman's keynote at
OLS 2006:
http://www.kroah.com/log/images/ols_2006_keynote_08.jpg

The whole presentation is a very worthwhile read (imho) and has some
great slides in it (my favourite is the one beginning with "Wow, for
such a small file...").  While it's about the kernel, much of its
sentiment applies to the (all too entrenched, it seems) wider usage of
"Linux".

http://www.kroah.com/log/linux/ols_2006_keynote.html

Cheers, Dave.
-- 
David Symons
Armidale NSW Australia
http://www.liberatedcomputing.net



Re: On Bugs and Linux Quality

2008-06-22 Thread Dave Hall
On Sun, 2008-06-22 at 20:35 +1000, Null Ack wrote:
> Daniel, with respect, I did not mean to suggest that the solution to
> improving the quality of GNU/Linux is centralised control.
> 
> However, people are in control of aspects of Linux - such as release
> decisions about key subsystems, or release decisions as they relate to
> distros. These decision makers have the power to conform, or not to
> conform as some unfortunately choose, to decades-old principles about
> what constitutes an alpha, beta or production release.
> 
> Clearly, there are a lot of problems when parties who are in control
> declare a release stable when it's not. With the kernel, I gave the
> example where Andrew Morton shared with us that he often sees
> regression bugs go without fixes, and sees developers ignore bug
> reports. There are other examples too in other key subsystems of just
> about any Linux distro. Take, for example, all the problems with X
> releases, and how most recently a new release of X was made with a
> blocker bug and other serious bugs.
> 
> If more focus and discipline were put into what constitutes a
> production release, I think that would be a very good direction to
> take. Who cares if there are more release candidates for kernels or
> more betas for X; if it's not ready, it's not ready. Some bugs can be
> tricky for a developer to replicate and resolve. It's human nature not
> to see the severity of an issue the same way if it's not happening on
> your machine.
> 
> I don't see proper release management stifling any freedoms in FOSS
> projects. It just means having a proper quality standard before bits
> are declared stable and ready for production. I greatly enjoy Ubuntu
> over all other distros I've tried (Arch, OpenSuse, Fedora), but I am
> certainly not the only person I've seen sharing the view that
> arbitrary time-based releases aren't conducive to good software.

I have been watching this thread, and many like it over the years.  Yes
it would be nice if the quality of GNU/Linux distros improved, but I
don't demand that.

Let's take a look at the situation.  You are getting a complete operating
system for free (as in liberty and beer).  It comes with a warranty -
see clause 15 of the GPL [1].  Vendors (including Canonical/Ubuntu)
honour the warranty offered by upstream.

Now let's look at the upstream projects.  They are usually run by
informal groupings of people.  Even with all the corporate resources
thrown at Linux (as in the real Linux - the kernel), it is down to Linus
and the advice of his lieutenants as to what is in and what is out of a
release.  No one is able to make a volunteer fix a bug.  The same goes if
Ubuntu/Canonical pushes a patch upstream to the kernel and it is included
in a release but later turns out to be broken: there is nothing which
compels anyone, let alone Ubuntu, to fix it.

This is the free software movement at work.  No one makes you use the
code we produce (yes I am a FOSS developer).  No one can make us fix our
bugs.  This is the risk you take when you use our code.  I don't lose
any sleep if someone does or doesn't use my code.  If someone demands
that I fix a bug or else , I mentally put it to the
bottom of the pile.

For the flip side, let's look at a proprietary development model.  I have
picked the easiest one - Windows.  Windows 98 didn't support USB mass
storage, and support for it was never included.  Last I checked, you
couldn't install onto a SATA drive without a _boot floppy_, and that looks
unlikely to ever be fixed.  It took until SP2 for XP to come anywhere
close to getting half decent security.  Many vendors took months to get
their drivers right for Vista.  The list of fundamental flaws with
various versions of Windows is extensive.  This is a product shipped by
the biggest software company on the planet.

My experiences with the official support channels for Windows, as a
legitimate, paying customer, have been extremely disappointing.  As for
the product warranty offered in clause 11 of the EULA (XP SP2 [2]), I
think you will have very little chance of ever seeing it honoured, as
there are too many loopholes for MS to use to get out of it.

I currently have 2 virtual machines running Windows (one Vista and one
XP); both are legitimate copies.  I am currently running about 30
different GNU/Linux (physical and virtual) servers, and my laptop only
runs Hardy.  I have given up on running Windows for anything serious.

When I need bullet-proof servers, I run Debian stable and I double-check
the specs before paying for new kit.  When I need "pretty good" servers,
I check whether Ubuntu LTS or Debian stable is a better fit for the task.
On the desktop I generally run Ubuntu alphas (which I don't advocate for
others); other people get the LTS or current stable Ubuntu, depending on
their needs.

Do I get annoyed/frustrated with GNU/Linux?  Yes.  Do I complain?  Yes,
via bug reports and my blog.  Do I threaten people to get thing

Re: On Bugs and Linux Quality

2008-06-22 Thread Null Ack
Daniel, with respect, I did not mean to suggest that the solution to
improving the quality of GNU/Linux is centralised control.

However, people are in control of aspects of Linux - such as release
decisions about key subsystems, or release decisions as they relate to
distros. These decision makers have the power to conform, or not to
conform as some unfortunately choose, to decades-old principles about
what constitutes an alpha, beta or production release.

Clearly, there are a lot of problems when parties who are in control
declare a release stable when it's not. With the kernel, I gave the
example where Andrew Morton shared with us that he often sees regression
bugs go without fixes, and sees developers ignore bug reports. There are
other examples too in other key subsystems of just about any Linux
distro. Take, for example, all the problems with X releases, and how most
recently a new release of X was made with a blocker bug and other serious
bugs.

If more focus and discipline were put into what constitutes a production
release, I think that would be a very good direction to take. Who cares
if there are more release candidates for kernels or more betas for X; if
it's not ready, it's not ready. Some bugs can be tricky for a developer
to replicate and resolve. It's human nature not to see the severity of an
issue the same way if it's not happening on your machine.

I don't see proper release management stifling any freedoms in FOSS
projects. It just means having a proper quality standard before bits are
declared stable and ready for production. I greatly enjoy Ubuntu over all
other distros I've tried (Arch, OpenSuse, Fedora), but I am certainly not
the only person I've seen sharing the view that arbitrary time-based
releases aren't conducive to good software.


Re: On Bugs and Linux Quality

2008-06-22 Thread Daniel Mons

Blindraven wrote:
| *shrug*
|
| I agree fully with the op.

As per my previous post, the very fact that "Linux" is open source means
that the moment you try to centralise/control its development, someone
else will fork the code and do their own thing.

If you desire a more controlled, formalised Linux system with support,
I'd suggest you migrate to RedHat Enterprise Linux Desktop.  It is
available for purchase with 1 year support for US$80, or 3 years of
support for US$228.

https://www.redhat.com/apps/store/desktop/

If you don't want the formalised support, you can download CentOS and
get essentially the same software, but minus any support.

It should be mentioned that a common criticism of RedHat Enterprise
Linux is that it moves quite slowly.  They are slow to adopt new
features, and are extremely conservative when it comes to making
changes.  The upside of course is that upgrades are predictable and that
sudden changes in the system that will break things are unlikely.  I
prefer Ubuntu (and Debian) simply for their faster adoption of new
features, and their brilliant "APT" package manager.  But again, the
downside of a fast-moving system is the occasional software hiccup.

If you want to further guarantee the hardware you purchase is "Linux"
compatible, then buy hardware from a certified vendor.  HP, IBM/Lenovo,
Dell and others all sell high-end hardware certified for Linux use.  It
will cost you slightly more, but if it's the guaranteed stability you're
after, then pay the price for the peace of mind.

The options are there for those who want them.  They do involve either
spending some money or sacrificing some of the bleeding-edge features of
faster-moving distros.  By all means, try out the more conservative
distros and see if they fit your needs better.  But it's ill-advised to
judge all Linux systems by one distro, and similarly to make sweeping
claims about how "Linux" should be "fixed" without understanding the true
nature of how it works and where it has come from.

-Dan


Re: On Bugs and Linux Quality

2008-06-22 Thread Blindraven
*shrug*

I agree fully with the op.

On Sun, Jun 22, 2008 at 6:22 PM, Steve Lindsay <[EMAIL PROTECTED]>
wrote:

> On Sun, Jun 22, 2008 at 2:27 PM, Null Ack <[EMAIL PROTECTED]> wrote:
> >
> > Linux needs to have less scatterbrain behaviour where half done things
> > are left and the chaos moves forward to the next semi complete feature.
> > It needs to consolidate and have a unified effort to really work on
> > stability and bug fixing.
> >
>
> The word "needs" needs to be banned.
>
> Steve
>



-- 
When one burns one's bridges, what a very nice fire it makes.


Re: On Bugs and Linux Quality

2008-06-22 Thread Steve Lindsay
On Sun, Jun 22, 2008 at 2:27 PM, Null Ack <[EMAIL PROTECTED]> wrote:
>
> Linux needs to have less scatterbrain behaviour where half done things are
> left and the chaos moves forward to the next semi complete feature. It needs
> to consolidate and have a unified effort to really work on stability and bug
> fixing.
>

The word "needs" needs to be banned.

Steve



Re: On Bugs and Linux Quality

2008-06-21 Thread Daniel Mons

Null Ack wrote:
| Linux needs to have less scatterbrain behaviour where half done things

I'll snip the post there, as that line contains the wording I am
interested in discussing.

One thing I find very common among people migrating to Linux is that they
treat "Linux" as a single entity, or a single product - almost as if it
were a corporate being, a la Microsoft, Oracle, etc.

"Linux", as the name is used en mass, is not a single being.  It is a
collection of literally tens of thousands of programs, all working
together.  Indeed, no two Linux distros are alike, simply because the
people providing them choose different collections of software to forge
together to offer.

I make no apologies for saying this, but "Linux" will never be "less
scatterbrain", as you put it.  The distributed nature of the coding
processes that go into Linux may at times be one of its weaknesses, but
on the whole it is also its single greatest asset.

As someone who is obviously new to "Linux", what you need to understand
is that the "Linux" community understands the distributed, evolving
nature of software far better than the slower-moving proprietary world.
In software, the concept of "the perfect release" is simply impossible
to achieve - software is a process, not an object, something that moves
and flows with the needs of its users over time.  To consider it a
single release, or a one-off product, is the wrong way to approach it.

Yes, there are tens of thousands of bugs in "Ubuntu".  Again, you need
to realise that this encompasses code written by tens of thousands of
human beings, from thousands of different companies, all working
together to release their code in a common pool for anyone to take from.
Unlike a lot of proprietary software vendors, who have a vested financial
interest in proclaiming how perfect their product is on release (and then
proceeding to release dozens of service packs and fixes anyway), "Linux"
makes no such claims.  Complete transparency and full disclosure are the
name of the game.

And what you also need to realise is that "Linux" is no better or worse
than its proprietary counterparts.  I can assure you, Microsoft have just
as many bugs across their entire suite of software.  The difference is
that you, as the end user, don't get to see their internal bug trackers.
To their marketing departments and shareholders, admitting fault like
that would be financial suicide.  But just because it's not out there for
all to see doesn't make it any less real.

Linux's distributed nature can be frustrating to people new to it.  But
again, you need to understand that despite the shortcomings of the
approach, it is the single biggest reason why Linux is alive and
thriving today.  I work in professional IT (as a Linux sysadmin and
systems architect), and time and time again hear the same cry from
people with little exposure to Linux: "They just need to stop making
dozens of distros and all work together to make one killer distro", or
"they need to stop making 5 different word processors and just make one
killer app".  What's obvious about this is that the people saying it are
grossly unaware of who "they" are, and how many people that encompasses.
More to the point, what constitutes "the perfect app"?  By whose
definition are we quantifying perfection?

"Linux" is a massive collection of small programs that each focus on
doing one thing well.  A Linux distro is one
individual's/group's/company's idea of which of these programs should be
tied together to make an operating system.  By virtue of the fact that
there are so many large-scale distros doing so well in the market
(RedHat, SuSE, Debian and Ubuntu are generally the "big 4" that people
talk about, with other popular ones like Fedora, CentOS, Gnetoo and
others close behind).  In a perfectly free market, if something is not
good enough it will disappear through obscurity and lack of interest.
So again, by virtue of all of these distros existing and being popular,
it means that they all have users who find them interesting and usable
for a whole gamut of reasons, personal and professional.

I understand your frustration.  You have used the product, found a bug,
and found that it was not fixed in a manner you considered timely.  As
an end-user, that is frustrating.  Speaking from the point of view of a
sysadmin who deals with literally thousands of machines on a daily
basis, I can tell you that this is not limited to the Linux world by any
means.  I personally find even more frustration when I'm forced to
administer proprietary software, and find the support no better or more
satisfying despite the enormous financial outlay.

I'm not trying to downplay your frustrations, but in my 11 years of free
software use, I've consistently found free software faster to respond to
known bugs, from both an acknowledgement point of view and a fix-delivery
point of view.  Additionally, I find it easier to get support for free
software from a wi

On Bugs and Linux Quality

2008-06-21 Thread Null Ack
I would like to respond to Dan's comments about Linux/Ubuntu quality. So
as not to dilute the discussion on my specific bug, I've moved it over
here.

Like all of you, I share the enthusiasm for Linux and open source
software. To me, the greatest feature is the ability to see the inner
mechanisms of the software, not just for security reasons (which is a
compelling reason all in itself) but also for the ability to understand
and change core components. However, it is vitally important that we
remain constructively critical, in order to improve the software or, more
importantly, the user experience.

I disagree with the notion that one bug should not affect the global
perception of Ubuntu quality. The point is that, for software to be
generally useful, a certain level of robustness and capability is
expected in core features. If a bug prevents this, then the software is
not generally useful, or at the very least is not very usable without
protracted workarounds and other annoyances.

Ubuntu has a number of flawed approaches to release management and
support. For example, not updating the Nvidia 169.12 driver in Hardy.
169.12 has numerous oopses and other bugs, and there have subsequently
been three revisions of the driver. Not to mention that the old driver
revision does not support new Nvidia cards that have been released. Or
let's look at the many, many other device driver fixes in the vanilla
kernel tree that have not been backported into Hardy's old kernel
revision. It's wishful thinking to somehow arbitrarily declare that a
release is "stable" and then do hardly any device driver updates during
the "support" phase. Clearly the word "support" isn't actual support - I
think it's driven more by a lack of resources to upgrade the baseline, as
is commonly done in Vista and Leopard.

Ubuntu has tens of thousands of bugs in the bug database. Reporting a bug
does not ensure that the bug is actually fixed, or indeed investigated at
all. Of those that are, many are sent upstream, where they again sit like
a statue and never seem to be resolved.

It doesn't take much of a look at, say, X to see the countless people
complaining about X bugs that have not been fixed, or how X was
"released" and declared "stable" with a known blocker and other bugs that
could be argued to be blockers as well.

Let's look at what the insiders are saying. A kernel hacker recently
said: "I'll typically read the linux-kernel list in 1000-email batches
once every few days and each time I will come across multiple bug reports
which are one to three days old and which nobody has done anything about!
And sometimes I *know* that the person who is responsible for that part
of the kernel has read the report." He continues, answering a question
about the declining quality of Linux: "I used to think it was in decline,
and I think that I might think that it still is. I see so many
regressions which we never fix. Obviously we fix bugs as well as add
them, but it is very hard to determine what the overall result of this
is. When I'm out and about I will very often hear from people whose
machines we broke in ways which I'd never heard about before. I ask them
to send a bug report (expecting that nothing will end up being done about
it) but they rarely do. So I don't know where we are and I don't know
what to do. All I can do is to encourage testers to report bugs and to be
persistent with them, and I continue to stick my thumb in developers'
ribs to get something done about them. I do think that it would be nice
to have a bugfix-only kernel release. One which is loudly publicised and
during which we encourage everyone to send us their bug reports and we'll
spend a couple of months doing nothing else but try to fix them. I
haven't pushed this much at all, but it would be interesting to try it
once. If it is beneficial, we can do it again some other time." His name
is Andrew Morton...

Linux needs to have less scatterbrain behaviour where half done things are
left and the chaos moves forward to the next semi complete feature. It needs
to consolidate and have a unified effort to really work on stability and bug
fixing.