Re: Why does Valdis trust UL?

2002-02-03 Thread Graham Klyne

Stef,

I'm doing some work in a W3C working group where one of the deliverables is 
a set of test cases, i.e. a set of machine-processable files that give 
some kind of before-and-after indication of how certain constructs may be 
processed.

These are used (a) as discussion points for building consensus about 
exactly what is intended in a given circumstance, (b) as a way to find 
out what existing code actually does, and (c) as a secondary document 
backing up what the primary specification is trying to state.
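(As a purely illustrative sketch: the kind of before-and-after test case I mean might be driven like this. The `normalize` transform and the case data below are invented for the example, not taken from any actual W3C suite.)

```python
# Illustrative driver for machine-processable before-and-after test
# cases.  "normalize" is an invented stand-in for whatever construct
# a specification describes; a real suite would load the before/after
# pairs from files rather than a list.

def normalize(text: str) -> str:
    """Example transform under test: collapse runs of whitespace."""
    return " ".join(text.split())

TEST_CASES = [
    # (name, before, expected-after)
    ("collapse-spaces", "a  b   c", "a b c"),
    ("strip-edges", "  hello  ", "hello"),
]

def run_tests(transform, cases):
    """Return a name -> pass/fail map by comparing actual vs. expected."""
    return {name: transform(before) == after for name, before, after in cases}
```

The before/after pairs double as the consensus-building artifact: disagreement about what `after` should contain is exactly the discussion point (a) above.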

#g
--



Re: Why does Valdis trust UL?

2002-02-03 Thread Einar Stefferud

Hello Graham --

Given your ideas and information, it seems to me that someone may be 
able to make a business out of marketing testing software that 
customers can use to evaluate vendors' software, so that individual 
customers do not each need to develop the testing software themselves.

This might well be an add-on business to those firms that now sell 
virus detection and system maintenance tools.

Cheers...\Stef


Re: Why does Valdis trust UL?

2002-02-02 Thread Einar Stefferud

I keep working on Keeping It Simple in honor of Stupid;-)...  (KISS)

In keeping with this, and still seeking some progress, you might note 
that my position is reasonably fluid, since the solution(s) do not 
seem to be obvious from the beginning.

It is extremely difficult to do what is needed in the form of 
Enforcement, which requires Punishment Consequences and trial courts 
and all such, all of which, we agree, should not be mounted or 
provided by the IETF.

But, let's suppose that someone assembled some documented test cases 
for Interoperability, such as were used for the first pair of 
implementations that justified moving a standard from Proposed 
Standard to Draft Standard.

At levels above IP/TCP I suspect that there is very little code 
required to do the testing.  What is required instead of code is 
scenarios for sending this, that, and the other thing in both 
directions between interworking systems.

I am assuming that such a test was performed at least once, whether 
documented or not.  I further assume that it could plausibly be used 
as an initial Public Standard for testing.  This is the specification 
of the test, not the code for the test: what kinds of objects are to 
be exchanged successfully before that first pair can be accepted as 
proof of interworking between that first justifying pair of 
independent applications.
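For illustration only (the object names below are invented, not drawn from any actual IETF test suite), such a test specification could be plain data that either party in a pairing can read and act on:

```python
# A test *specification* as data, not code: which objects must be
# exchanged, in which directions, before a pair of independent
# implementations counts as interworking.  All names here are
# invented for illustration.

SCENARIO = {
    "standard": "example-protocol",
    "steps": [
        {"from": "A", "to": "B", "object": "plain-text message"},
        {"from": "B", "to": "A", "object": "plain-text message"},
        {"from": "A", "to": "B", "object": "message with attachment"},
        {"from": "B", "to": "A", "object": "message with attachment"},
    ],
}

def required_exchanges(scenario):
    """Tally how many exchanges the spec demands in each direction."""
    counts = {}
    for step in scenario["steps"]:
        key = (step["from"], step["to"])
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Because the specification is data, a customer can run the same scenario against any pair of bidding vendors' systems without writing protocol code of their own.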

I suggest that the first thing to do is stop tossing those test specs 
in the trash after they are used, as though they have no further 
value.
They in fact have the value of a seed that can grow into a valuable 
long-term testing protocol for all who care about interworking, such 
that any customer seeking to buy the most interoperable systems can 
use the published test suite protocol to do in-house testing on the 
systems offered by bidding vendors.

So, what I propose is to do something that will give the customers a 
tool for protecting themselves from careless or heedless or even 
dishonest vendors.

As things are now, we, the end users and customers, are basically 
defenseless in the face of what appear to be hostile vendors, with no 
checks and balances in the hands of the crippled customers.

If nothing else, our customer community should be interested in 
founding an operation that will supply interoperability test 
scenarios for themselves.  To hell with expecting the vendors to 
protect the customers.  If the testing tools are not in the hands of 
the customers, whom can you trust?

Don't tell me that we should trust the Marketing Droids;-)...
How much testing do those Droids do?

I suspect they mostly test market savvy, not product reliability.

But, being suspicious is not a useful thing without some tools to use 
for seeking truth.

I prefer to Trust, but Verify!
This is the power in customer emPOWERment.

BTW, I do not expect much help from vendors in this strategy.
Though one or two might find some advantage in helping out.

Especially if they offer real interworking systems;-)...

Cheers..Stef



Re: Why does Valdis trust UL?

2002-01-31 Thread Mark Adam

Since interoperability on a one-to-many scale would be a problem,
perhaps approaching it from the many-to-one point of view would be
better.

Einar's ideas are good, but still difficult to implement. What happens
when a company fails to find every device it should be tested against?
It almost seems that what we need is the concept of a reference
platform.

Having a reference platform allows for a single point of contact for
everyone wanting IETF Certification.

I would also suggest that the task of implementing such a platform
should be up to the WGs creating the standards or the companies
authoring the standard. This would also give you a group that could
administer the platform. Of course there would have to be some rules of
conduct so that nobody could be excluded from performing their
interoperability testing. (Do I smell a BOF here?) I'm sure groups
holding reference platforms could find some way to make money off of
this without breaking the rules.

I'm not saying this would be easy to implement, but it might be worth a
thought. 

mark---


Re: Why does Valdis trust UL?

2002-01-30 Thread Tony Dal Santo

Does UL go after companies that produce unsafe devices?  My guess would
be no.  As far as UL is concerned, companies voluntarily bring their
products to them for certification.  It is the consumers and legal
authorities that give UL such a big stick.  And with this model, UL
seems to be fairly free of legal hassles since (from their perspective)
seeking certification is voluntary.

It seems to me a similar model could be followed by IETF or anyone with
the business sense to start a company (e.g. the InterOperability Lab
at UNH for Ethernet).  Actually, the task should be very simple.  The
IETF is supposed to require two independent interoperable implementations
before something can become a standard (and ideally even before becoming
a draft standard).  The Internet Interoperability Lab could just test
voluntary submissions against the two versions used when approving the
standard.
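A toy sketch of that idea follows (the implementations below are trivial invented stand-ins, not real protocol code): a voluntary submission passes only if it agrees with both reference implementations on every probe.

```python
# Toy model of an "Internet Interoperability Lab": a candidate is
# exercised against the two reference implementations that justified
# the standard.  The references here are trivial invented stand-ins;
# real testing would exchange actual protocol messages.

def reference_impl_1(msg: str) -> str:
    return msg.upper()

def reference_impl_2(msg: str) -> str:
    return msg.upper()

def certify(candidate, references, probes):
    """Pass only if the candidate matches every reference on every probe."""
    return all(candidate(p) == ref(p) for ref in references for p in probes)

def conforming(msg: str) -> str:
    return msg.upper()

def deviant(msg: str) -> str:
    return msg.lower()
```

The point of the model: the lab only ever maintains the two reference versions, so each new submission costs one test run rather than a fresh pairing against every product on the market.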

Is the practice of requiring two independent implementations still being
followed?  Or is the problem that a generic RFC is being treated as an
approved standard before it has actually become so (see RFC 2600)?

Tony




Re: Why does Valdis trust UL?

2002-01-29 Thread Einar Stefferud

Well now, an idea blinked on here;-)...

As Paul Hoffman noted, it costs a small fortune for an entire set of 
vendor products to be tested against all other interworking products 
(N**2 pairs is the estimate), and there is no proffered business 
model for doing this for the entire industry involved.
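The arithmetic behind that estimate is easy to sketch (an illustration, not anyone's actual cost model): pairwise testing grows quadratically with the number of products, while testing each product against a single common reference grows only linearly.

```python
# Why full pairwise interoperability testing is so expensive:
# distinct pairs grow quadratically (the "N**2" estimate), while
# testing each product against one shared reference grows linearly
# in the number of products N.

def pairwise_tests(n: int) -> int:
    """Distinct unordered pairs among n products: n*(n-1)/2."""
    return n * (n - 1) // 2

def reference_tests(n: int) -> int:
    """One test per product against a shared reference platform."""
    return n

for n in (10, 100):
    print(f"N={n}: pairwise={pairwise_tests(n)}, reference={reference_tests(n)}")
```

At a hundred products that is nearly five thousand pairings versus a hundred reference runs, which is why no vendor volunteers to fund the full matrix.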

But, maybe someone can devise a business model for testing one 
product against all the others that claim to conform to the standard 
under test.

I know that HP did this once for their Internet products by hiring a 
person to do it from one of their customer's sites on the Internet. 
It does not matter here who or where it was done.

But, this puts the burden on the vendors that wish to be able to 
claim inter-workability with all others, or with some subset of their 
choice.

Or they can identify those that do not interwork for the benefit of 
those that want to know such stuff.

This then becomes an individual company decision, and does not 
require mass agreement or synchronized work schedules. 
Just put your system on the net and find someone out there to test 
against.  Doing it on the real net is just fine for this testing 
model.

Of course, the vendors that do this can brag or not, as they wish.

And there is no great concern for whether every vendor does it or not.

And the market can make up its mind by itself.

For my part, I have trouble believing that all those vendors are not 
vitally interested in inter-working among their products.

And, in addition, I would hope that someone might mount an open 
discussion mailing list for people to use to post their private 
experiences with what does or does not work.

And last:  This is no longer a useful IETF discussion;-)...\Stef





Re: Why does Valdis trust UL?

2002-01-29 Thread John C Klensin

John,

One addition to your description -- a small, but important,
point...

ANSI (in which UL, and the bodies whose normative standards
underlie UL's more detailed testing and evaluation standards, are
members and accredited SDOs) makes a careful distinction between
safety standards and other sorts of things.  By the definitions
they use, the IETF has never done a safety standard.  That is
probably A Good Thing.

The safety standards tend to be rigidly normative, specifying
exactly what is permitted and what is not.  There is no need for
our sort of interoperability testing, because things are
required to conform to an explicit set of specifications and
requirements.  And, where that isn't done, there are usually
requirements for approval by the local authority or the use of
"approved" equipment.  In both cases, the term "approve"
implies some sort of inspection or certification entity.

The characteristic that these things have in common is that they
are designed to be incorporated into legislation.  The National
Electrical Code, with which many readers of this list are
probably familiar (my apologies to those, especially out of the
US, who are not -- you already knew that we do things in odd
ways here), is a good example.  The Code itself is nothing more
than an ANSI Standard.  Conformance is voluntary, right?   Well,
the text is full of references to "approved" devices and "approval
by local authority".  

Then various jurisdictions come along, take that voluntary
standard, and pass laws saying that it is illegal to do
electrical things any other way.  In the process, they specify
the local approval authority (your friendly neighborhood
electrical inspector in most cases) and the list of bodies that
can approve "approved" devices.  The latter is a list that
usually has only one entry on it, and that entry is UL.  If
Valdis can buy a non-UL-certified night light in Vermont, he
gets a choice.  In many jurisdictions, it is illegal to sell
such things, or household fire insurance is scrap paper if there
is a fire traced to a device without UL certification.  And, for
some devices, the codes themselves require that only certified
devices get installed.

Now, in our business, partially because we don't do safety
standards, we rely on external certification processes,
including a lot of self-certification, rather than these
elaborate drills that prevent selling or installing things that,
in the judgement of some organization, are non-conforming.
Stef's most recent notion of people doing their own
interoperability testing and announcing what they find if they
want to is exactly self-certification.  And it has been around
for years.  

But that brings us right back to where this series of threads
started: the company in question has never, to my knowledge,
made a loophole-free public claim that it conforms to anything
the IETF has produced, especially at the applications level.  If
they had made such a claim, and obviously didn't conform, then
someone might have a reasonable cause of action against them,
with or without a public announcement process.  But they are
doing exactly what they claim to be doing (read the licenses) --
delivering software that may or may not work and may or may not
be good for anything.  If one doesn't like that, one should
presumably go elsewhere or figure out why there isn't an
elsewhere and do something about it.

john





Why does Valdis trust UL?

2002-01-28 Thread John W Noerenberg II

At 10:19 PM -0500 1/26/02, [EMAIL PROTECTED] wrote:

I have in my bedroom a night light, which I purchased at a local
grocery store.  It has a UL logo on it, which doesn't tell me much
about its suitability as a night light (I can't tell if it's bright
enough, or if it's too bright, or what its power consumption is),
but it *does* tell me 2 things:

1) It has been *tested* and found free of any known safety design problems.
It may not *work* as a night light, but it won't shock me when I go to
throw it in the trash can because it's not suitable.

2) A high enough percentage of night light manufacturers get UL listed
that I can afford to be suspicious of any company that doesn't have
the logo on their product.

Underwriters Laboratories, Inc. is a non-profit corporation that was
founded in 1894.  This article
(http://www.ul.com/about/otm/otmv3n2/labdata.htm) describes the
process UL uses for developing their standards.  Many UL standards
receive ANSI certification.  According to the article, UL relies on
information from a number of sources while developing a standard.

UL tests products submitted by its customers for *conformance* to
its standards.  UL's reputation depends on the rigor and
independence of their testing.  I don't know how much it costs to
submit a product for testing, but obtaining UL certification isn't
free.  UL's certification program is successful because when
consumers like Valdis (and me) see a UL label, they believe in its
value.  As Valdis points out, the value of the label has limits.

Certification isn't the work of a volunteer organization like the
IETF.  It could be the work of an organization like Underwriters
Labs.  This would be a good thing for Internet standards, imho.

One idea proposed multiple times in this meandering discussion is
that those advocating testing should put up or shut up -- create a
testing organization or move on to other topics.  I concur with
both those suggestions.  I'm sure you'll all be pleased this is my
last word on the topic.

best,
--

john noerenberg
[EMAIL PROTECTED]
   --
   While the belief we have found the Answer can separate us
   and make us forget our humanity, it is the seeking that continues
   to bring us together, that makes and keeps us human.
   -- Daniel J. Boorstin, The Seekers, 1998
   --