Hi Seth:

On 8/28/2013 12:59 PM, Seth McGinnis wrote:
Hi Sean,

Personally, I would regard that as suspect behavior.

I'm of the opinion that it's best practice for a data service to
affect the data it operates on in a very targeted and transparent
way, and that it should pass everything else through untouched.
If it doesn't do that, it's harder to trust the service, because
the more ways that it modifies the data, the more opportunities
there are for it to get something wrong, and the more likely it
is that the error will go unnoticed because it's hiding amongst a
bunch of other changes.

We take this philosophy also.


Case in point, I have recently discovered that when the THREDDS
server subsets CF data that uses a coordinate system other than
lat-lon, it mangles a bunch of the ancillary data in the file.  I
suspect this happens, ironically enough, precisely because it's
trying to make the data CF-1.0 conformant instead of just leaving
it as-is...

Let us know what the specifics are and we can try to make it better.


So yes, I think it's significant that the data coming out of
THREDDS is in a different convention than the source files, and
that it's cause for concern.

A famous story from the early days of the NCAR data portal is that some users added the Conventions = "CF-1.0" attribute to their data because they were required to make their data CF compliant. But that's all they did. Hey, it passed the compliance checker! ;)
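For illustration, that one-line "fix" can be done with NCO's ncatted; the file name below is hypothetical, and this is a sketch of the shortcut those users took, not a recommendation:

```shell
# Stamp a file as CF-1.0 without changing anything else about it.
# "-O" overwrites the output file; the "-a" spec is
# attribute,variable,mode,type,value — "global" targets the global
# attributes, mode "o" creates or overwrites, type "c" is character.
ncatted -O -a Conventions,global,o,c,"CF-1.0" mydata.nc
```

A checker that only looks for the Conventions attribute will happily pass the result, even though none of the variables carry any actual CF metadata.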

Regards,
John
_______________________________________________
CF-metadata mailing list
CF-metadata@cgd.ucar.edu
http://mailman.cgd.ucar.edu/mailman/listinfo/cf-metadata
