Actually -- I'd disagree, because that is a very narrow view of the
specification.  When validating MARC, I'd take the approach of validating
structure (which then allows you to read any MARC format) -- then use a
separate process for validating the content of fields, which, in my opinion,
is more open to interpretation based on how a system uses the data.
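For what it's worth, that two-pass split can be sketched roughly like this (a minimal illustration only, assuming ISO 2709 framing: a 24-byte leader, a 0x1E field terminator, a 0x1D record terminator; `check_structure` is a name I made up, not from any MARC library):

```python
# Structural pass only: checks record framing, not field content.
FIELD_TERMINATOR = b"\x1e"
RECORD_TERMINATOR = b"\x1d"

def check_structure(record: bytes) -> list:
    """Return a list of structural problems. An empty list means the
    record looks parseable; validating field content is a separate pass."""
    errors = []
    if len(record) < 24:
        return ["record is shorter than the 24-byte leader"]
    if not record[0:5].isdigit():
        errors.append("leader bytes 0-4 (record length) are not digits")
    elif int(record[0:5]) != len(record):
        errors.append("leader record length does not match actual length")
    if not record.endswith(RECORD_TERMINATOR):
        errors.append("record does not end with the 0x1D record terminator")
    return errors
```

A content pass (known tags, indicator values, fixed-field codes) would then run only on records that survive this stage.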

Wait -- so is there any formal specification of "validity" that you can look at to determine your definition of "validity", or is it just "well, if I can recover it into useful data, using my own algorithms"?

I think we computer programmers are really better served by reserving the notion of "validity" for things specified by formal specifications -- as we normally do when talking about any other data format. And the only formal specification I can find for MARC21 says that leader bytes 20-23 should be "4500". (Not true of MARC in general, just MARC21.)
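And that specific claim is mechanically checkable, which is exactly what a formal spec buys you. A minimal sketch (the function name is mine, not from any MARC library):

```python
def leader_entry_map_is_4500(record: bytes) -> bool:
    # MARC21 fixes the leader's entry map (bytes 20-23) at "4500":
    #   4 = length of the length-of-field part of each directory entry,
    #   5 = length of the starting-character-position part,
    #   0, 0 = implementation-defined / undefined.
    return len(record) >= 24 and record[20:24] == b"4500"
```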

Now it may very well be (is!) true that the library community has been in the practice of tolerating "working" MARC that is NOT valid according to any specification. So, sure, we may need to write software that takes account of that sordid history. But I think it IS a sordid history -- not having a specification to ensure validity makes it VERY hard to write any new software that recognizes what you expect it to recognize, because what you expect it to recognize isn't formally specified anywhere. It's a problem. We shouldn't try to hide the problem in our discussions by using the word "valid" to mean something different from what we use it for with any modern data format. "Valid" only has a meaning when you're talking about validity according to some specific specification.
