Hi Ran, and thanks for your reply.

There are two separate issues that we need to distill.  First, what to
do about draft-housley-two-maturity-levels-00, and second, how do we take
input to improve the overall process?

I have not really come down on one side or the other on this draft
(yet).  To be sure, two maturity levels seem better than three, and as
you know, I've proposed a single maturity level in the past, so to me,
the draft goes in the right direction.  However, I do not know how many
times we get to change this sort of procedure, and I believe the choice
facing the community and the IESG could be better informed than it is today.
Having been involved in NEWTRK, and having produced what I think was the
only output from that group in terms of RFCs or processes, I believe I
know whereof I write when I say that this community can become a bit of
an echo chamber and could use a bit of formal academic input.
Conveniently, there are researchers in this area.  This is an even
stronger reason for me not to state an opinion about whether to
advance the draft.

As to the questions I asked, you and I obviously hold very different
views.  In the end, it is of course for researchers (who are the real
target of my questions) to decide which questions they think might be
telling about our process.  I hope this discussion informs them, if and
when they review it.

You claim I have a vendor bias.  Guilty, but I am also concerned that we
do the right things for the right reasons, and that motivations are
reasonably aligned so that we have some reason to believe that what is
proposed will work and not have perverse impact.  Absent some serious
analysis, we are also assuming that the logic of decisions made over
twenty years ago holds today, when in fact we don't really even know
whether it held then.

And now as to your specifics: you have placed a lot of weight on one
example, the Joint Interoperability Test Command, and seem to be
extrapolating from it.  I have no experience working with them, and
defer to yours.  However, when you say,

>       As examples, the JITC and TIC requirements pay a great
>       deal of attention to whether some technology is past PS.
>       Various IPv6 Profile documents around the world also pay
>       much attention to whether a particular specification is
>       past PS.

it leads me to the following questions:

    * Would the vendors have implemented the functionality ANYWAY? 
      Specifically, would other RFPs have already driven vendors in this
      direction?  Can you cite a counterexample where that was not the
      case?
    * Is the defense industry at all representative of the broader
      market?  My own experience leads to an answer of “barely at all”,
      and this has assuredly been the case with the Internet, where a
      huge portion has run on PS, Internet-Drafts, and proprietary
      standards rather than waiting for advancement.  Examples include
      BGP, MPLS-VPNs, HTTP, SSL, and Netflow, just to name a few.

But again, I would like to see a rigorous analysis, rather than simply
rely on either of our personal experiences.

> The IETF already has a tendency to be very vendor-focused &
> vendor-driven.  It is best, however, if the IETF keeps the 
> interests of both communities balanced (rather than tilting 
> towards commercial vendors).

While this is perhaps a laudable idea, someone has to do the work to get
specifications to the next standards level.  The whole point of my
questions is to determine what motivations that someone might have for
actually performing that work.

>> If we look at the 1694
>> Proposed Standards, are we seeing a lack of implementation due to lack
>> of stability?  I would claim that there are quite a number of examples
>> to the contrary (but see below).
> Wrong question.  How clever to knock down the wrong strawman.

There's no need to be rude or snarky with me, even if you disagree.  You
are looking at this from the angle of the customers, and that's
perfectly reasonable.  I'm looking at it from the developers' point of
view, and from the supply side of your equation.  Both seem reasonably
valid, and so I have no qualms with the question part of your (A),
although as I mentioned above, I question your answer.

>       B) whether that signal has a feedback loop to implementers/
>          vendors that still works.
>       The answer to this is also clearly YES.  Technologies that
>       appear in RFPs or Tender Requirements have a stronger
>       business case for vendors/implementers, hence are more
>       likely to be widely implemented.

Certainly so, but I don't understand how you made the leap of logic from
your question to your answer.  Do we have situations, for instance,
where a proposed standard is compared to a draft standard, or a draft
standard is compared to a full standard, and one is chosen over the
other?  If so, are they the norm, and are they likely to drive
implementation?  Also, if all this gets you is interoperability, but
doesn't actually solve your problem, is THAT a good thing for the
customer?  IMHO this was precisely the case for SNMPv3.

>> Is there any reason for a developer to believe that the day after
>> a "mature" standard is announced, a new Internet Draft won't
>> in some way obsolete that work? 
>
> By definition, Internet-Drafts cannot obsolete any 
> standards-track document while they remain Internet-Drafts

I think you misunderstand what I mean by "obsolete".  I don't mean
"obsolete" in the sense that you see it in the RFC directory, but
whether a new idea will overtake what has just been standardized.  Put
another way, if a developer is going to spend money ossifying work, what
is the risk they will lose their investment to something new?  I'll
grant you it's almost impossible to calculate, but perhaps history can
be of some use.  That analysis should be done.


>> Question #3: What does such a signal say to the IETF?  
> It is a positive feedback loop, indicating that work is
> stable and interoperable.  It also says that gratuitous
> changes are very unlikely to happen.  By contrast, 
> technologies at Proposed Standard very frequently have 
> substantial changes, often re-cycling back to PS with
> those major changes.

Where do we see cases where gratuitous changes take place at Proposed?

> Further, the new approach will have the effect of making
> it easier to publish technologies at Proposed Standard,
> which would be good all around.

Why do you come to this conclusion, given that PS isn't changing?

>> I know of at least one case where work was not permitted
>> in the IETF precisely because a FULL STANDARD was said
>> to need soak time.  It was SNMP, and the work that was
>> not permitted at the time was what would later become
>> ISMS.
> That is history from long ago, under very different
> process rules from now, so is totally irrelevant.  
> It isn't even a good example of how things generally 
> worked at that time.

I'll clarify.  This was with SNMPv3 – and while "long ago" is a matter
of perspective, I am unaware of any rule changes that would have
impacted that case.  And since there are few cases of full standards in
any event, the sample size is a problem ;-)

>> Question #4:  Is there a market advantage gained by an implementer
>> working to advance a specification's maturity?
> Again, wrong question.

Why?  Keep in mind that we're looking at reasons for people to do the work.

> That noted, the answer is clearly Yes.  Early implementers who show
> interoperability are well positioned to win RFPs that require a
> technology that has moved beyond Proposed Standard, while trailing
> implementers often end up unqualified to bid/tender due to the
> absence of such a feature.

Again, there seems to be a leap of logic here: that a specification's
having attained Draft or Full Standard will somehow cause work to be
done that wasn't done before.  Those are facts not in evidence.

>> If there *is* a market advantage, is that something a standards
>> organization wants?  
> Yes, because it encourages broad implementation, broad interoperability,
> and broad adoption of openly specified standards.
>
>> Might ossification of a standard retard innovation
>> by discouraging extensions or changes?
> This is wildly less likely under 2-track than it already
> is today, partly because it will be much easier for a
> sensible revision to move to Proposed Standard.
 
I don't understand what you're saying.  Are you arguing that recycling
back down to Proposed will be easier?

Regards,

Eliot