> On Fri, May 23, 2008 at 8:42 PM, Allen Brown <[EMAIL PROTECTED]> wrote:
>> It's tough reading.  I can only read a few minutes at a time
>> before my eyes glaze over.  And I haven't finished.  But I
>> don't think I will be able to consider the big picture when
>> I do finish, because of the aforementioned glazing.
>
> I'm assuming the glazing is largely due to phrasing and fairly arcane
> subject matter with its own unique vocabulary -- the overlap of law
> and a specialized area of technology -- rather than typography.  But I
> would appreciate any clues you might contribute on how to lessen that
> problem. I am aware that the Drupal theme I'm using formats footnotes
> in a small type size and am working toward improving that. (I've found
> a theme more suitable for customization.)

Yes, I have a hard time reading legalese. I remember the first time
I applied for a patent. The legal document was opaque to me.  Over
time I have gotten less hopeless about this.  Occasionally I can
even read small sections of patents and delude myself into thinking
that I understand.

I don't hesitate to increase the size of text. My eyes are too old
for tiny fonts.

> I'm aiming toward both something like what you're looking at and a
> condensed version that omits the commentary. (I realize that the
> commentary will put a lot of people to sleep.) At the same time, it's
> necessary in creating a document that has a prayer of withstanding a
> concerted legal challenge to use precise terminology and to be
> sufficiently detailed to avoid ambiguity that could lead to different
> interpretations. Condensing is part of the polishing process and I
> have a fair distance to go in that department.

At the expense of increasing the amount of work involved, it might
be possible to make a legal document and a "readable" document in
parallel, with hyperlinks (in the form of footnotes) between each
corresponding part.  In this way you wouldn't have to compromise
either readability or rigor.  (Hyperlinks are one of the greatest
inventions of the 20th century.)

I am also reminded of how patents are written. There is an
introductory part, which is *supposed* to be readable. And then
there are the claims, which are rigorous.

>> I think the problem of determining whether a program or dataset
>> is conforming is difficult, and I don't know if you defined this.
>
> It is not defined as such, but legally defensible tests are framed by
> the clause in paragraph 10 requiring specification of, "conformity
> requirements essential to achieve interoperability" and in paragraph
> 11 by "fully specified conformity assessment procedures adequate to
> ensure interoperability."
>
> It's difficult to go much beyond establishing the legal tests in this
> area because the methodology necessary for conformity assessment in
> regard to a particular specification is variable. I agree
> wholeheartedly that conformity assessment is not a simple process.
>
>> But it seems to me that in a practical sense only a parser
>> can say if a dataset is conforming.  How would you address
>> this?
>
> As to the specific language of the text, right now I'm inclined not to
> go much beyond what is already said in the body, although I may add
> some information in the corresponding notes.

Yes.  My worry is that the big guys will game the system and
gut the value of an interoperability spec.  On the other hand,
you don't want to overspecify the requirements for a spec.
That risks quashing innovation.

> Back in the real world, parsing and comparison to a reference is about
> as far as you can go in terms of real-time on-the-fly data set
> validation. In XML markup languages, validation against the grammar
> specified by a reference schema or DTD is the normal method. See
> <http://en.wikipedia.org/wiki/XML_schema>.
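
As a minimal illustration of the parsing approach: the cheapest conformity
check is whether a parser can build a tree at all. (Full validation against
a reference schema or DTD is stricter and needs a schema-aware library such
as lxml; the sketch below uses only the standard library, so it checks
well-formedness only.)

```python
import xml.etree.ElementTree as ET

def is_well_formed(data: str) -> bool:
    """Cheapest possible conformity check: can a parser build a tree?
    Validation against a reference schema or DTD is a stricter test
    and requires a schema-aware library such as lxml."""
    try:
        ET.fromstring(data)
        return True
    except ET.ParseError:
        return False

print(is_well_formed("<doc><item/></doc>"))  # True
print(is_well_formed("<doc><item></doc>"))   # False: mismatched tags
```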
>
> But many IT standards have normative requirements that cannot be
> parsed, e.g., data presentation aspects. Commonly used conformance
> assessment methods for presentation aspects include creation or
> designation of a reference implementing application or creation of
> test suites. Under the reference application approach, developers can
> compare their implementations' output to that of the reference
> application, using a standard battery of data sets.
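
The reference-application approach can be sketched as a simple harness;
`impl` and `reference_app` here are hypothetical stand-ins for the
implementation under test and the designated reference application:

```python
def compare_to_reference(impl, reference, datasets):
    """Run the same battery of data sets through the implementation
    under test and the reference application, reporting any data set
    on which their outputs diverge."""
    mismatches = []
    for ds in datasets:
        got, expected = impl(ds), reference(ds)
        if got != expected:
            mismatches.append((ds, got, expected))
    return mismatches

# Toy stand-ins: the "standard" says output is the input, uppercased.
reference_app = str.upper
buggy_impl = str.title  # conforms only on single-character data sets

print(compare_to_reference(buggy_impl, reference_app, ["a", "ab"]))
# [('ab', 'Ab', 'AB')]
```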
>
> Test suites normally establish reference displays of how particular
> small data sets should appear visually when rendered. The small data
> sets and their reference displays are commonly developed concurrently
> with a standard's development and implementation, allowing
> implementing developers to compare results as they go, renegotiate
> normative requirements as necessary, etc.
>
> There's a good example of a test suite in active development and use
> here.
> <http://www.w3.org/2004/CDF/TestSuite/WICD_CDR_WP1/wicdmatrix.xhtml>.
> It's the test suite for the WICD profiles for creating W3C Compound
> Document Formats. The link is to the summary matrix, showing the
> conformance/implementation state of several participating browser
> developers' implementations. If you click on the links in the column
> to the left of the visible table, you'll be taken to the text of the
> relevant normative requirement, and beneath the text of each normative
> requirement, there is a link to the test for that requirement.
>
> The test displays one or more graphic images or formatted text, a
> written description of what you should be seeing, and a link to a
> screenshot of what you should be seeing. If what the developer sees is
> what s/he is supposed to see, then the browser has passed that test.
> The matrix lets everyone see how far along everyone is in
> implementation and conformance.

This is brushing up against what I worry about. It relies on human
assessment of whether the code is doing the right thing. In the
examples I looked at the answer seemed clear. But to one intent on
gaming the system I think there could be loopholes available.

> There are similar test suites around for other data types, of course,
> e.g., audio and video.  But generally speaking, what the conformity
> assessment procedure is largely depends on what the relevant normative
> requirement is. I.e., if it's not presentation but a different aspect
> of implementation, then a different kind of procedure will be
> necessary. E.g., if you are dealing with a standard for spreadsheet
> formulas, then the assessment procedure might call for entering
> particular data and verifying that the implementation arrives at a
> specified mathematical result.
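
A conformity assessment of that kind is essentially table-driven: a battery
of (input, required result) pairs drawn from the standard's normative
requirements. A hedged sketch, with Python's built-in `sum` standing in for
the product under test and made-up test vectors:

```python
def run_conformance(impl, cases, tol=1e-9):
    """Feed each specified input to the implementation and check that
    it arrives at the specified mathematical result, within a numeric
    tolerance (the standard would have to specify that, too)."""
    failures = []
    for name, args, expected in cases:
        got = impl(args)
        if abs(got - expected) > tol:
            failures.append((name, got, expected))
    return failures

# Hypothetical test vectors for a spreadsheet-formula standard's SUM.
CASES = [
    ("SUM of integers", [2, 3], 5),
    ("SUM of floats",   [0.1, 0.2], 0.3),
    ("SUM of nothing",  [], 0),
]

print(run_conformance(sum, CASES))  # [] -- the built-in passes
```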

Audio and video are *so* subjective.  How do you create a standard
that can't be subverted?

> I don't think I'm pushing the envelope too far with the legal tests
> I've boiled down in regard to conformity assessment procedures.
> ISO/IEC JTC 1 Directives, which govern the preparation of
> international standards in the IT sector, require:
>
> "It is the responsibility of each JTC 1 Subcommittee to ensure that
> any conformity requirements in its standards or ISPs for
> implementation in products are unambiguous and that conformity to
> those requirements is verifiable.
> ...

Desirable goals.  And as one who has never done this, they appear
difficult to achieve.  Consider the C++ and C standards.  Revised
several times.  And in that case I don't know that anyone was
actually trying to game the system; I assume not.  If there had
been, would we have a standard today?

> "A conformity assessment methodology may include the specification of
> some or all of the following: terminology, basic concepts,
> requirements and guidance concerning test methods, test specification
> and means of testing, and requirements and guidance concerning the
> operation of conformity assessment services and the presentation of
> results.
> ...
>
> "This policy statement specifies the JTC 1 position on
> interoperability and clarifies the relationship between
> interoperability and conformity. ... For the purpose of this policy
> statement, interoperability is understood to be the ability of two or
> more IT systems to exchange information at one or more standardised
> interfaces and to make mutual use of the information that has been
> exchanged. An IT system is a set of IT resources providing services at
> one or more interfaces.
> ...
>
> "Standards designed to facilitate interoperability need to specify
> clearly and unambiguously the conformity requirements that are
> essential to achieve the interoperability. Complexity and the number
> of options should be kept to a minimum and the implementability of the
> standards should be demonstrable. Verification of conformity to those
> standards should then give a high degree of confidence in the
> interoperability of IT systems using those standards. However, the
> confidence in interoperability given by conformity to one or more
> standards is not always sufficient and there may be need to use an
> interoperability assessment methodology in demonstrating
> interoperability between two or more IT systems in practice.

What is an "interoperability assessment methodology"?

USB has regular meetings for testing, which are informally called
"plugfests".  Vendors test their products on the sponsored OSs.
At the time I was involved with this that included Windoze and
Mac.  Linux was not represented because you had to pay a lot
to be an OS tester.  As a result very few products claim Linux
compatibility.

These plugfests were mediocre for assessing electrical compatibility.
(Some of the specs were strictly on a trust basis, unverified
except by honest vendors.)

And for protocol verification they were also mediocre. There
are standards for various device types, such as mass storage.
But the only assessment is that the device functions on
Windoze. Working on the Mac is a plus, but can be waived.
And the device driver is allowed to do any number of
protocol work-arounds to deal with errors in the device firmware.
Plus the USB spec is only approximated by the Windoze USB code.
The result is a very large number of USB devices which violate
the USB specs, but are licensed to use the USB logo.  The Linux
device driver folks are well aware of this.  You can see this
in the number of code exceptions and blacklists to deal with
non-functioning products.
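
Those work-arounds typically take the form of per-device quirk tables keyed
on USB vendor/product IDs, in the spirit of the Linux kernel's unusual_devs
list. The IDs and flag names below are made up for illustration:

```python
# Hypothetical quirk table: (vendor_id, product_id) -> the protocol
# work-arounds a driver must apply because the device's firmware
# violates the spec, despite the device carrying the USB logo.
QUIRKS = {
    (0x1234, 0x5678): {"ignore_residue"},
    (0x1234, 0x9abc): {"ignore_residue", "fix_capacity"},
}

def quirks_for(vendor_id: int, product_id: int) -> set:
    """Return the work-arounds needed for a device; empty if it
    conforms (or its bugs just haven't been catalogued yet)."""
    return QUIRKS.get((vendor_id, product_id), set())

print(sorted(quirks_for(0x1234, 0x9abc)))  # ['fix_capacity', 'ignore_residue']
print(quirks_for(0xffff, 0x0001))          # set()
```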

And the vendor attitude is "it works on Windoze, what's your
problem?"

I would hope to see something better than that.  Of course,
USB is not an example of a good open standard.  It was written
primarily by Micro$haft, for Micro$haft.

> "An assessment methodology for interoperability may include the
> specification of some or all of the following: terminology, basic
> concepts, requirements and guidance concerning test methods, the
> appropriate depth of testing, test specification and means of testing,
> and requirements and guidance concerning the operation of assessment
> services and the presentation of results. In technical areas where
> there is a conformity assessment methodology and an interoperability
> assessment methodology, the relationship between them must be
> specified."
>
> It all sounds great, but it's routinely ignored in an era when the
> international IT standards development scene has been captured by
> big vendors. I'm trying to take a rather huge body of international
> law, antitrust law, and case decision precedents involving
> interoperability and accessibility down to fundamental principles, a
> set of criteria that eGovernment and government procurement folk,
> including their lawyers, can use as a meta-standard for evaluation and
> selection of IT standards. E.g., just the European Community Court of
> First Instance decision in the Microsoft case last September runs
> nearly 1,400 paragraphs.

Yes, and given how widely this is ignored there may be quite a number
of people who don't know how to do this right.  You have pointed
to some examples of careful standards.  Ones I was not aware of.
If I were tasked with coming up with a standard I would greatly
appreciate examples to look at. Perhaps such examples could be
collected (and commented upon) in an appendix of best practices.

> As nearly as I can tell there are no published efforts to summarize
> the law governing interoperability and accessibility in standards
> work, which is scattered from here to Kingdom Come. I believe
> fervently that universal interoperability and accessibility must of
> necessity be the glue points in the software infrastructure of the
> Information Society unless we are aiming for such a society only for
> the wealthy.

Has anyone written a document explaining how to write a standard?
A sort of "Standard for Standards"?  Seems like a desirable thing.

> Since no one else had summarized the law in this area, I decided it
> was a worthy task. It may take many more drafts to achieve a
> stable release. But I'm sensitive to the fact that universal
> interoperability and accessibility can bring about economic growth in
> developing nations that can dramatically improve health care, tackle
> the starvation problem, and raise much of humanity from other shackles
> of poverty. So I'm highly motivated to get this document right and
> really appreciate any feedback.

Aye, worthy indeed!  And I agree about the reasons this is needed.
I hope it isn't too late.  The super wealthy are entrenched.
They are doing a good job of protecting their own interests
by convincing Joe-sixpack to vote against his own advantage.
But I won't name any names.  We've already had too much of that
on this list.

> Thank you for the helpful criticism,
>
> Marbux.

I'm not convinced my comments are helpful.  But at least I am trying.
At times, very trying.  :-)
-- 
Allen Brown  [EMAIL PROTECTED]  http://brown.armoredpenguin.com/~abrown/
  It is not necessary to understand things in order to argue about them.
  --- Pierre de Beaumarchais


_______________________________________________
EUGLUG mailing list
euglug@euglug.org
http://www.euglug.org/mailman/listinfo/euglug