> Pedro Vicente
> pedro.vice...@space-research.org
> https://twitter.com/_pedro__vicente
> http://www.space-research.org/
>
> - Original Message -
> *From:* John Caron <jcaron1...@gmail.com>
> *To:* Pedro Vicente <pedro.vice...@space-research.org>
> *Cc:* cf-m
Hi David:
At the risk of giving more useful answers to the wrong question, I will say
that we could do something other than require ancillary or coordinate
variables to only have dimensions that the parent variable has. There just
must be a simple and explicit rule for mapping between parent and
oops, sorry misread the question
On Mon, Dec 21, 2015 at 1:06 PM, Jim Biard <jbi...@cicsnc.org> wrote:
> John,
>
> But ancillary variables aren't coordinates, are they?
>
> Jim
>
>
> On 12/21/15 1:08 PM, John Caron wrote:
>
> i think a crucial property of
I think a crucial property of coordinates is that their dimensions are a
subset of the variable's dimensions. Otherwise one can't assign the correct
coordinate.
The length of a char-valued coordinate is an exception; really, char(n) is
logically the same as string.
while a dim=1 coordinate could
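The char(n)-is-logically-a-string point can be sketched in a few lines of numpy (the station names here are made up for illustration):

```python
import numpy as np

# A char(station, n) coordinate: fixed-width character arrays,
# one row per station, n chars per name.
names = np.array([list("stn1"), list("stn2")], dtype="S1")

# Logically each row is just a string; joining the chars recovers it,
# so the char dimension n carries no coordinate meaning of its own.
strings = [b"".join(row).decode("ascii") for row in names]
```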
I think it's just that the CF 1.7 document has been stuck for going on 3 years
now.
-- Forwarded message --
From: Ethan Davis eda...@ucar.edu
Date: Wed, May 6, 2015 at 3:29 PM
Subject: Re: [CF-metadata] Geos grid mapping
To: Randy Horne rho...@excaliburlabs.com
Cc: John Caron ca
, at 3:19 PM, John Caron ca...@ucar.edu wrote:
CF TRAC ticket 72 defines geostationary projection, accepted for 1.7.
1) I _think_ that Vertical Perspective is an earlier version of this,
and it should be deprecated in favor of geostationary. Does anyone have a
different opinion?
2) The draft 1.7 does not yet include it, so it's been hard to
I have both
cf-metadata@cgd.ucar.edu
and
cf-metad...@lists.llnl.gov
it appears that many (most?) still use cf-metadata@cgd.ucar.edu; I
guess it's being forwarded to cf-metad...@lists.llnl.gov
___
CF-metadata mailing list
CF-metadata@cgd.ucar.edu
of the system for connecting the Trac
issue-tracker to the main mailing list at ucar.edu.
- Jeff
On 11/18/14, 11:52 AM, John Caron wrote:
My preference is that one explicitly puts in the units. For dimensionless,
"1" (or an empty string) is OK for udunits. If the units attribute isn't there, I assume
that the user forgot to specify it, so the units are unknown.
I'm not sure what CF actually says, but it would be good to clarify.
John
On Thu, Oct 30,
I understand that netCDF coordinate variables have to be strictly
monotonic, and no-one wants to define what this means for the general case
of strings; that is fine.
in CDM, monotonicity is required to make the 1D coordinate maps
invertible. For string valued coordinates, the equivalent
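The invertibility point can be illustrated with a small sketch: a strictly monotonic 1-D coordinate lets you recover an index from a coordinate value with a binary search (the coordinate values below are hypothetical):

```python
import numpy as np

# A strictly monotonic increasing 1-D coordinate.
lon = np.array([0.0, 2.5, 5.0, 7.5, 10.0])

def index_of(coord, value):
    """Invert the coordinate map: return the index of the nearest coordinate value."""
    i = int(np.searchsorted(coord, value))
    # searchsorted gives the insertion point; snap to the closer neighbor.
    if i > 0 and (i == len(coord) or value - coord[i - 1] < coord[i] - value):
        i -= 1
    return i
```

Without monotonicity a binary search like this is not valid, and the value-to-index map may not even be one-to-one.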
Hi all:
Yes, 1.10 would follow 1.9, if needed, on that branch.
In software, we are used to major version increments meaning the API may have
changed. Since any CF file using features of the extended model will not
be fully visible to (or may even break) software that only understands the classic
model, it
As I recall, the original proposal was for station_altitude. We decided to
change station to platform. At the same time it was thought that the
existing standard name of surface altitude would be synonymous. I at
least was thinking of ground stations. So I think we made a mistake there
and
Hi Timothy:
From my POV, groups are a way to, um, group things (variables, coord
systems, etc) together. They should be used for the same reason we use
directories in a file system, to prevent a big mess of things in one
namespace. On the surface, your layout above seems reasonable for those
Hi Karl and all:
NetCDF-4 compression and chunking are transparent to the user, and are
compatible with the classic data model.
I think we should be gathering experiences with the enhanced data model,
and start a CF-2.X convention draft document that uses the enhanced model.
It would also be a
A true count of seconds since an epoch matches the approach of TAI, not
UTC.
On 7/16/14, 5:58 PM, John Caron wrote:
Hi Maik:
Unfortunately, CF references the udunits package which is no longer being
developed, at least for date/time coordinates.
udunits does not support leap seconds.
your best bet is to add an auxiliary time coordinate which uses leap
seconds, eg TAI. your specialized software can use that
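An auxiliary TAI coordinate boils down to applying the cumulative leap-second offset at each UTC instant. A minimal sketch, using a deliberately partial offset table (a real implementation needs the full IERS leap-second list):

```python
import bisect
from datetime import datetime, timezone

# Partial table: (UTC instant a leap second took effect, cumulative TAI-UTC seconds).
LEAP = [
    (datetime(2012, 7, 1, tzinfo=timezone.utc), 35),
    (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
    (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
]

def tai_minus_utc(t):
    """Cumulative TAI-UTC offset in seconds at UTC time t (sketch only)."""
    i = bisect.bisect_right([d for d, _ in LEAP], t)
    return LEAP[i - 1][1] if i else 34  # 34 s was in effect from 2009 to mid-2012
```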
On 9/19/2013 3:58 PM, Schultz, Martin wrote:
Now, here is another use case, which we haven't implemented yet - partly
because we didn't see how it can be done in a CF consistent way:
While there has been a definition of a standard file layout for data from multiple
stations (a contribution
Hi all:
A few thoughts from my (possibly limited) POV.
1. Best strategy for storing very large collections of data in flat
files like netcdf?
- store coherent chunks of the dataset in each file. A good file size
these days seems to be 100 MB - 1 GB.
- choose coherence by deciding on the
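A back-of-the-envelope sizing sketch along these lines (the array shape and the 500 MB target are hypothetical, not from the thread):

```python
# Split a large float32 (time, y, x) array into files of roughly 500 MB,
# keeping each file a coherent slab along the time axis.
shape = (8760, 1800, 3600)   # e.g. hourly, 0.1-degree global grid (hypothetical)
itemsize = 4                 # float32 bytes

total_bytes = itemsize
for n in shape:
    total_bytes *= n

target = 500 * 2**20                    # ~500 MB, inside the 100 MB - 1 GB band
n_files = max(1, total_bytes // target)
times_per_file = shape[0] // n_files    # contiguous time steps per file
```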
Hi Sean:
What feature of CF are you using that you need to preserve the version?
John
If you torture data long enough, it will confess.
-- Economist Ronald Coase
On 8/28/2013 3:28 AM, Gaffney, Sean P. wrote:
Hi,
Here at the British Oceanographic Data Centre we use THREDDS to deliver
and
Hi Seth:
On 8/28/2013 12:59 PM, Seth McGinnis wrote:
Hi Sean,
Personally, I would regard that as suspect behavior.
I'm of the opinion that it's best practice for a data service to
affect the data it operates on in a very targeted and transparent
way, and that it should pass everything else
Hi Mark and Jonathan:
I will say straight up that I don't understand the implications of this
particular discussion. But in case it's helpful, I have found it
important to separate the data model and the encoding of the model in
the netCDF file. Probably you are both already doing that, but I
Hi Jonathan:
On 6/4/2013 4:17 AM, Jonathan Gregory wrote:
Dear John Caron and John Maurer
I agree with John C that the problem arises when the coordinate variables
are not size one. John M's example
float lon(lon) ;
float lat(lat) ;
float alt(alt) ;
float temp
from John Caron ca...@unidata.ucar.edu -
Date: Tue, 4 Jun 2013 10:54:45 -0600
To: cf-metadata@cgd.ucar.edu
Subject: Re: [CF-metadata] CF-1.6 DSG
Hi John:
1) The coordinates attribute is not new to DSG; it has been around since
the beginning of CF. This defines the auxiliary coordinate variables,
that is, coordinate variables that do not follow the lat(lat) template. I
think it would be good if GrADS et al could be upgraded to use them.
Hi Derrick:
Some first reactions:
1. the aggregation or collection will have two trajectories, the first
along a time axis "time" and the second along a time axis "time_avg".
since they are both trajectories, I think you don't have a problem
with more than one feature type.
however,
On 3/21/2013 9:41 AM, Steve Hankin wrote:
On 3/21/2013 8:25 AM, John Caron wrote:
Probably my proposal comes down to 2 parts, which are separable:
1. Find a suitable replacement for udunits as a reference library for
CF calendar dates. Unfortunately, udunits used a slightly
non-standard
Hi all:
Another thing to consider is the relationship to the current udunits/CF
standard for specifying dates in the unit string:
"period since date"
The udunits documentation
http://www.unidata.ucar.edu/software/udunits/udunits-2/udunits2lib.html#Grammar
not being very clear, I wrote up my
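Decoding a "period since date" unit string amounts to epoch plus value times period. A stdlib-only sketch (it ignores non-Gregorian calendars and udunits' full period vocabulary, which the real grammar covers):

```python
from datetime import datetime, timedelta

def decode_time(value, units):
    """Decode a coordinate value against a 'period since timestamp' unit string."""
    period, _, epoch = units.partition(" since ")
    # Minimal period table; udunits accepts many more spellings.
    seconds = {"seconds": 1, "minutes": 60, "hours": 3600, "days": 86400}[period]
    return datetime.fromisoformat(epoch) + timedelta(seconds=value * seconds)
```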
On 3/20/2013 9:41 AM, David Hassell wrote:
Hello,
My beer/coffee/Percocet levels are too low to want to comment broadly on
this, so I'll just make one comment ...
Really the main advantage is that data writers are less likely to
make a mistake specifying
1952-08-15T00:00:00Z
than
2434567
://www.metoffice.gov.uk/
*From:* CF-metadata [mailto:cf-metadata-boun...@cgd.ucar.edu] *On Behalf
Of *Steve Hankin
*Sent:* 24 February 2013 19:07
*To:* John Caron
*Cc:* cf-metadata@cgd.ucar.edu
*Subject:* Re: [CF-metadata] New standard name
On 3/14/2013 6:39 AM, Izidor Pelajic wrote:
Hi all,
I'm new at netCDF, new at CF, and moderate MATLAB user. I'm new at
netCDF in a sense of days or week.
So my problem (beyond the fact that I do not have general knowledge)
is this: I was provided with a .nc file. In it there are 3 pieces of information
of
On 2013-03-06 15:18, John Caron wrote:
The Rotated Pole projection here:
http://cf-pcmdi.llnl.gov/documents/cf-conventions/1.6/cf-conventions.html#appendix-grid-mappings
Rotated pole
grid_mapping_name = rotated_latitude_longitude
Map parameters
---
Am 06.03.2013 um 17:09 schrieb John Caron ca...@unidata.ucar.edu:
Hi Heiko:
So
grid_north_pole_longitude =
normalizeLongitude180(longitudeOfSouthernPoleInDegrees)
grid_north_pole_latitude = -1 * latitudeOfSouthernPoleInDegrees
?
where does one find documentation on proj4's
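The normalizeLongitude180 helper named above is not spelled out in the thread; a plausible one-liner maps any longitude into [-180, 180):

```python
def normalize_longitude_180(lon):
    """Map an arbitrary longitude in degrees into the interval [-180, 180)."""
    return (lon + 180.0) % 360.0 - 180.0
```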
Hi Seth:
Thanks for the description, it's really helpful.
I'm just wondering how we document this in the CF convention, so that
implementers have something to check against. Is there a reference
implementation we can use ?
John
On 3/6/2013 1:17 PM, Seth McGinnis wrote:
So there are an
Hi Jonathan:
On 12/17/2012 10:51 AM, Jonathan Gregory wrote:
Dear John
I have prototyped similar functionality in the CDM, documented here:
http://www.unidata.ucar.edu/software/netcdf-java/CDM/CalendarDateTime.html
Thanks for this. I recall the earlier discussion and I think what you have
Hi Cathy:
I think that you are using backwards compatible in a different way.
The current proposal(s) would not change files that are written with
:Conventions = "CF-1.x", where x <= 6. Files with x > 6 could still use
the (ill-advised) old way if they want to, by putting in an explicit
calendar
Hi Chris:
On 12/17/2012 4:50 PM, Chris Barker - NOAA Federal wrote:
I think a solution shouldn't break current files which followed what had
been a standard for a long time (however ill-advised the standard was). I
don't have a good sense of what might break if the standard changed in terms
of
Hi all:
Here's what I understand of the conversation:
1. There's nothing to do about existing files, CF-1.6 and before. We are
stuck with the udunits mixed calendar.
2. Starting with the next version after acceptance, (1.7 currently), we
can do something different. I agree that forcing people
that
in the late winter/spring timeframe, with some guidance on what
functions were
desired and a pass through our prioritization board.
Best,
-- Cecelia
On 12/5/2012 12:25 PM, John Caron wrote:
Hi all:
Its probably the right thing to do to make gregorian (Mixed
Gregorian/Julian calendar) the default calendar for COARDS/CF, for
backwards compatibility. However, CDM may leave proleptic_gregorian
(ISO8601) as its default.
And I would strongly suggest
files because the Java interface is not
backwards compatible.
Cathy Smith
NOAA/ESRL PSD
On 10/10/2012 6:36 AM, Jim Biard wrote:
John,
I agree that there is no ambiguity in the original dataset. I was just
pointing out a scenario in which the difficulty mentioned by Phil in his
original post would manifest itself. The methodology referenced by Russ
won't handle it. If you (for
Hi Jim:
_FillValue/missing_value refers to the packed value, so there's no
ambiguity in the original dataset. It is best to make sure it's outside
the range of real values, but even if not, one just has to search for
that exact bit pattern.
If someone rewrites the data, it's their
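A minimal packing sketch of that point (the scale/offset values are invented): the fill is a packed-domain bit pattern, chosen outside the range real data can pack to, and unpacking tests for it before scaling.

```python
import numpy as np

scale_factor, add_offset = 0.01, 300.0
fill = np.int16(-32768)   # packed _FillValue, outside the valid packed range

# Pack some data with one missing element.
data = np.ma.array([299.5, 300.0, 301.2], mask=[False, True, False])
packed = np.where(data.mask, fill,
                  np.round((data.data - add_offset) / scale_factor)).astype(np.int16)

# Unpack: match the exact fill bit pattern first, then apply scale/offset.
unpacked = np.ma.masked_equal(packed, fill) * scale_factor + add_offset
```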
:
Heiko,
We worked with John Caron several years ago to get these into the CDM
in the NetCDF-Java library, so if you have g1 or g2 coordinates they
will work with codes that use NetCDF-java (like the Matlab NCTOOLBOX
and Unidata's IDV), and I remember we drafted up some documentation
to submit to CF
Yes, I typically get 3 copies. It would be nice not to.
On 6/25/2012 4:29 PM, Seth McGinnis wrote:
I do want to see Trac messages, but since this update happened I'm getting two
or three copies of each one. Is anyone else having this problem?
What do I need to do to get only one copy?
If no one has any objections, I move that we accept this ticket:
Adding the geostationary projection.
https://cf-pcmdi.llnl.gov/trac/ticket/72
the changes needed to the spec are at the bottom of the ticket; we didn't
know how to change the top part.
On 4/30/2012 8:40 PM, andrew walsh wrote:
Hi John and CF-Metadata list,
Based on your earlier advice I decided to use the scalar way to represent
the coordinates lat, lon and time, rather than the vector way, i.e. lat(lat),
lon(lon), time(time),
mainly for reasons of simplicity.
the correct other
a reason for increasing; I assume it's just
because sect 9 was conceived for observations in the first place. However, John
Caron may well have a comment. I don't think anything prevents your storing
the data in the orthogonal multidimensional representation, which existed
before sect 9 did and doesn't
On 4/29/2012 5:33 PM, andrew walsh wrote:
Hi John,
My responses inline below.
Andrew
- Original Message - From: John Caron ca...@unidata.ucar.edu
To: cf-metadata@cgd.ucar.edu
Sent: Saturday, April 28, 2012 2:39 AM
Subject: Re: [CF-metadata] Ocean CTD data following CF Conventions
mailto:jim.bi...@noaa.gov
828-271-4900
On Apr 30, 2012, at 8:46 AM, John Caron wrote:
On 4/29/2012 5:33 PM, andrew walsh wrote:
Hi John,
My responses inline below.
Andrew
- Original Message - From: John Caron
ca...@unidata.ucar.edu mailto:ca...@unidata.ucar.edu
To: cf-metadata
Variables:
time
lat
lon
temperature(pressure)
The scalar approach you suggest, as in section H.3.3 "Single profile"
of the CF v1.6 standard,
is simpler than the vector approach, so we will take your advice.
Regards,
Andrew
- Original Message - From: John Caron ca...@unidata.ucar.edu
and a document
specifying the CDL.
Thanks and Regards,
Andrew Walsh
Ref.
(1) Konovalov et. al (March 2012), Proposal to adopt a quality flag
scheme standard
for oceanographic and marine meteorological data, Version 1.2.
- Original Message - From: John Caron ca...@unidata.ucar.edu
On 4/24/2012 4:16 AM, Hedley, Mark wrote:
There are particular issues here with format interoperability and conversion
with respect to phenomenon. In GRIB2, for example, there are codes which
identify vector components:
Wind direction (from which blowing) degree true
Wind speed ms
On 4/20/2012 4:54 AM, Gaffney, Sean P. wrote:
Hi everyone, Thanks for all your feedback.
It's made things clearer for me now. Just to summarise then, a suite of
attributes that would provide a precise numeric description of the data within
a variable could then be (ignoring things such as
Here's my two cents:
The meaning of the x_coordinate and y_coordinate is actually well
defined. But it does not mean x=east and y=north. It means the input to
the projection function proj(x,y) -> (lat,lon), which is defined in
appendix F, with pointers to reference software. AFAIU, these
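A toy illustration of that proj(x,y) -> (lat,lon) view, using a spherical Mercator pair (the radius is an assumed spherical-earth value, and this is not one of the CF appendix F mappings, just the simplest invertible example):

```python
import math

R = 6370000.0  # assumed spherical earth radius, metres

def to_latlon(x, y):
    """Inverse spherical Mercator: projection coords -> (lat, lon) in degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2.0 * math.atan(math.exp(y / R)) - math.pi / 2.0)
    return lat, lon

def from_latlon(lat, lon):
    """Forward spherical Mercator: (lat, lon) in degrees -> projection coords."""
    x = R * math.radians(lon)
    y = R * math.log(math.tan(math.pi / 4.0 + math.radians(lat) / 2.0))
    return x, y
```

The point is that x and y only have meaning as arguments to the mapping's forward/inverse functions, not as generic "east" and "north" axes.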
On 4/19/2012 9:13 AM, Gaffney, Sean P. wrote:
Hi all,
My name is Sean Gaffney, from the British Oceanographic Data Centre, and I'm
working on a project dealing with numerical model data that are in CF compliant
NetCDF, so I thought I'd sign up to the community.
The project I am working on
On 4/11/2012 3:59 PM, Jonathan Gregory wrote:
Dear Randy
While I understand why this would be convenient for you, I'm inclined to think
that your convenience would be outweighed by the inconvenience to the users of
the data in this case. I don't expect there is any existing CF-aware analysis
Hi all:
Let's see, I haven't followed the entire conversation, but:
1) Andrew if you can send me a sample file (not just the CDL) I can
check if it works in the CDM with the new 1.6 conventions, and maybe
give you some advice from my POV.
2) Aggregation in the CDM comes in 2 flavors. 1) The
On 3/28/2012 6:26 AM, Jonathan Gregory wrote:
Dear all
Brian has a good point that sect 5.3 allows that there might be missing data
in aux coord variables. Appendix A - which is equally ancient in the story of
CF :-) - is not consistent with this, because it didn't allow _FillValue or
On 3/28/2012 10:49 AM, John Caron wrote:
I think we have a number of valid use cases for missing data in aux
coordinates, and I would vote to allow that.
sorry, I didn't really answer Jonathan's concern. I would suggest this
wording:
Auxiliary coordinates do not have to be monotonic or have
On 3/26/2012 2:24 AM, Jonathan Gregory wrote:
Dear Ros
Regarding this requirement:
9.6 Where any auxiliary coordinate variable contains a missing value, all
other coordinate, auxiliary coordinate and data values corresponding to that
element should also contain missing values.
Appendix A
Hi all:
From a CDM developer perspective, an auxiliary coordinate is just as
good as a regular coordinate variable. The extra requirements on
coordinate variables are helpful in knowing when to optimize, eg
monotonicity allows one to efficiently find the index given the
coordinate value.
On 3/23/2012 1:59 PM, Jim Biard wrote:
Hi.
Jonathan's reply contained the section:
9.6 Where any auxiliary coordinate variable contains a missing
value, all
other coordinate, auxiliary coordinate and data values
corresponding to that
element should also contain missing
Example 5.10. British National Grid
int crs ;
crs:grid_mapping_name = "transverse_mercator" ;
crs:semi_major_axis = 6377563.396 ;
crs:semi_minor_axis = 6356256.910 ;
crs:inverse_flattening = 299.3249646 ;
crs:latitude_of_projection_origin = 49.0 ;
hi michael
1) There are a few typos (see attached) that made it invalid NcML. Also,
the .ncml suffix is preferred over .xml.
2) for cdm_data_type, I might not call this a grid; maybe image is
better. the satellite group should probably come up with a feature type
name for geostationary data.
3)
Hi all:
If the checker can read from opendap, then putting the files on an
opendap server might be helpful, in
that only the metadata is read instead of the entire file.
There are a few differences among servers (Hyrax, TDS, GDS, etc) but I'm
pretty sure this would work well for
netcdf files
Possible defects:
1) Example 7.11. Extreme statistics and spell-lengths
climatology_bounds = "2007-12-1 6:00", "2000-8-2 6:00" ;
should be
climatology_bounds = "2007-12-1 6:00", "2008-3-1 6:00" ;
perhaps clearer as
climatology_bounds = "2007-12-1", "2008-3-1" ;
2) flip order of 7.11 and 7.12, since 7.12
Thanks, Jonathan.
On 2/21/2012 11:18 AM, Jonathan Gregory wrote:
Dear John
However, im not sure what this sentence in 7.4 means: When
considering intervals within days, if the earlier time of day is
equal to the later time of day, then the method is applied to a full
24 hour day.
It doesn't
I'm working with NOAA's CFSR dataset, reading it into the CDM, which
essentially means converting GRIB into CF/netCDF.
ref:
http://nomads.ncdc.noaa.gov/data.php#cfs-reanal-data
http://dss.ucar.edu/datasets/ds093.2/
from NCAR's DSS page:
CFSR monthly atmospheric, oceanic and land surface
On 12/22/2011 2:11 AM, Jonathan Gregory wrote:
Dear all
The existing Unidata recommendation is OK and we could incorporate it into
CF but it would help to be more precise, for instance: If the Conventions att
includes no commas, it is interpreted as a blank-separated list of conventions;
if it
Hi Etienne:
Generally, CF has followed proj4 conventions as much as we could, and
John Snyder's reference book is often underneath:
http://pubs.usgs.gov/pp/1395/report.pdf
OGC specs tended to come from experts using the EPSG, and generally we
are less familiar with that.
some specific
hi:
I think I would also advocate adding another standard attribute,
something like units_label which would be a label for the units in a
plot, not necessarily udunit compliant.
eg:
var:units = "" ;
var:units_label = "milligrams per gram of dry sediment" ;
john
On 12/9/2011 2:09 AM,
On 12/9/2011 11:37 AM, Jonathan Gregory wrote:
Dear John
I prefer the idea that Thomas has put forward of an umbrella, rather than
containing the vector/tensor components in one data variable, because
* I really don't like the idea of units being mixed within a data variable.
I think things
On 11/24/2011 2:53 PM, Thomas Lavergne wrote:
Dear all,
This email is a proposal to strengthen the storage and exploitation of
vector/tensor data in CF. Thanks to Jonathan for commenting on an early version of
this note.
As far as I can tell, vectors are not handled as such by CF, only their
Hi Matthias:
The CDM/TDS does not support Compression by Gathering, nor does any
other package I know of. There was some internal talk in CF of
deprecating this. I am cc'ing the CF list in case anyone there wants to
comment.
Have you looked at using netcdf-4 with compression?
John
On
Hi Ute:
On 11/25/2011 6:01 AM, Ute Brönner wrote:
Hi folks,
I kind of lost track of our latest discussions and had the feeling that this
was partly outside the mailing group; so I will try to sum up what we were
discussing.
My latest try was to produce NetCDF for particle trajectory trying
Original Message
Subject: Re: [CF-metadata] Proposed addition to CF principles: outside
conventions
Date: Mon, 21 Nov 2011 14:39:13 -0700
From: Seth McGinnis mcgin...@ucar.edu
To: John Caron ca...@unidata.ucar.edu
Hi John,
I'm generally in favor of this proposal
Hi Vegard:
I see some of these kinds of things from NCEP, encoded in GRIB, and Im
still trying to understand what they are. So, some questions from a
non-modeler:
On 11/15/2011 2:10 AM, Vegard Bønes wrote:
Hi!
I am trying to create a document containing various probability values for
Here are a few comments on this discussion from my POV:
1) to summarize what's already in CF 1.6:
section A9.2:
It is strongly recommended that there should be a station variable
(which may be of any type) with the attribute cf_role="timeseries_id",
whose values uniquely identify the stations.
It
On 8/24/2011 6:23 AM, Jonathan Gregory wrote:
Dear John
It seems to me it would be better to somehow denote the epoch
separately, because it's kind of silly keeping track of # millisecs
between two dates separated by 50 million years. plus it's hard.
what about:
01-01-01 12:00 epoch 50m BCE
The problem is that udunits ends up computing times for -64 Myear as:
63998634-12-14 00:00:00 BCE
so you lose precision on the year.
Don
On 8/19/11 10:45 AM, John Caron wrote:
Regarding paleoclimate, a point I forgot is that some modellers may
wish to
have years which are very large negative
On 8/23/2011 6:13 AM, Lynnes, Christopher S. (GSFC-6102) wrote:
On Aug 22, 2011, at 6:36 PM, John Caron wrote:
On 8/22/2011 6:37 AM, Jonathan Gregory wrote:
Dear Chris
Perhaps there could be an attribute we could set that says whether we have
accounted for leap seconds? With the absence
On 8/19/2011 7:48 AM, Lynnes, Christopher S. (GSFC-6102) wrote:
On Aug 19, 2011, at 7:54 AM, Jon Blower wrote:
Hi Chris,
Does the calendar system usually define whether leap-seconds are taken into
account or not?
Generally speaking, I don't believe so, if only because calendar systems
Regarding paleoclimate, a point I forgot is that some modellers may
wish to
have years which are very large negative numbers (many more than four
digits)
if they set up the model with the true date for the run. Although for
geological timescales you might say that this isn't necessary and you
Hi Jonathan:
On 8/18/2011 9:41 AM, Jonathan Gregory wrote:
Dear John
http://www.unidata.ucar.edu/software/netcdf-java/CDM/DateTime.html
If there's interest, I can propose as a CF convention. Otherwise it
can remain a CDM extension.
Thank you for doing this. I think it's an attractive
In March I promised to do something with the udunits handling of fuzzy
time units email conversation. I now have a preliminary implementation
of extended date coordinates. docs are here:
http://www.unidata.ucar.edu/software/netcdf-java/CDM/DateTime.html
On 8/8/2011 3:43 PM, Jim Biard wrote:
Hi.
I have a time series of monthly averaged values. I have an
integer-valued time coordinate variable and an associated time_bounds
variable. Is it correct to use the 15th of February and the 16th of
all the other months for my time centers, or should
On 7/26/2011 4:24 AM, Jonathan Gregory wrote:
Dear all
For datasets which are intended for analysis by end-users I think it would be
undesirable to remove the requirement of providing explicit lat and lon
coords even if a grid_mapping is provided. I think it is unrealistic to expect
all
On 8/3/2011 8:19 AM, Jon Blower wrote:
Hi all,
I've been following this thread with great interest. For me it boils down to
this question:
- Is the datum always known by the data provider?
If the answer is yes then I see no reason to omit the datum (and plenty of
reasons to include it).
Hi all:
If I understand the situation, a GCM, whether global or regional, always
has a datum. In the simple case, it's just the radius of a spherical
earth. However, for current global models it's typical that this datum
doesn't affect anything (presumably as long as it's within the typical
Hi Chris:
I think we agreed that this sentence in section 5 should be removed:
The axis attribute is not allowed for auxiliary coordinate variables.
If anyone has a better idea, let me know. otherwise I will submit a
defect change.
John
On 7/20/2011 7:51 AM, Chris Paver wrote:
Dear list,
On 7/20/2011 3:43 AM, Comiskey, Glenn wrote:
Thanks for the reply John.
Just to confirm, the vertical coordinate is described by a number of
other metadata attributes such as axis, standard_name,
_CoordinateAxisType etc. and therefore wouldn't expect any problems with
applications discovering
On 7/14/2011 8:48 AM, Comiskey, Glenn wrote:
Hi,
The use of the variable unit value degrees would appear to discount
a data set from being both COARDS and CF compliant, at least according
to the CF conventions document, i.e. v1.5, section 3.1 states The
COARDS convention prohibits the unit
On 6/30/2011 1:26 PM, Randy Horne wrote:
Paragraph 5.6 Coordinate Reference Systems, Grid
Mappings, and Projections, first sentence:
When the coordinate variables for a horizontal grid are not longitude and
latitude, it is required that the true latitude and longitude coordinates be
supplied
/esmf_releases/public/last/ESMF_refdoc/
Brian
On Fri, Apr 01, 2011 at 05:04:42AM -0600, John Caron wrote:
Hi Martin and all:
I'm wondering what essential methods a calendar library needs to have
to support, e.g., 360-day calendars. The only one that comes to mind is
to calculate the difference between
On 4/15/2011 10:29 AM, Steve Emmerson wrote:
John,
On 04/14/2011 08:09 PM, John Caron wrote:
I'm thinking that one wants a grammar that allows
moleX/mole * mole/moleX = 1
moleX/mole * mole/moleY = moleX/moleY
and so on.
I wonder if this list could accumulate a list of such examples
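One way to model such a grammar is to keep named base units with integer exponents, so moleX and moleY are distinct and only matching names cancel. A sketch (the unit names are from the examples above; the representation is my own):

```python
from collections import Counter

def multiply(u, v):
    """Multiply two units, each a Counter of {base unit name: exponent}."""
    out = Counter(u)
    out.update(v)                 # exponents add under multiplication
    return Counter({k: n for k, n in out.items() if n != 0})  # drop cancelled units

molex_per_mole = Counter({"moleX": 1, "mole": -1})
mole_per_molex = Counter({"mole": 1, "moleX": -1})
mole_per_moley = Counter({"mole": 1, "moleY": -1})

dimensionless = multiply(molex_per_mole, mole_per_molex)  # everything cancels
ratio = multiply(molex_per_mole, mole_per_moley)          # moleX/moleY survives
```

Under this scheme mol mol-1 and m3 m-3 would only reduce to the same thing if the base-unit names were collapsed first, which is exactly the distinction being argued for.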
Hi Jonathan:
On 4/14/2011 2:14 AM, Jonathan Gregory wrote:
Dear John and Steve
I agree with Steve about this:
On 04/13/2011 02:25 PM, John Caron wrote:
the point I'm trying to make is that it would be better to understand
that mol mol-1 (canonical udunit = 1) is not the same as m3 m-3
On 4/13/2011 2:52 PM, Steve Emmerson wrote:
On 04/13/2011 02:25 PM, John Caron wrote:
the point I'm trying to make is that it would be better to understand
that mol mol-1 (canonical udunit = 1) is not the same as m3 m-3
(canonical udunit = 1).
In my opinion, the distinction between mol/mol