Folks,
Is there any way to 'cache' the input data on the build server so it doesn't
have to be downloaded every time?
Or could a smaller data set be shipped as a tarball in the test setup?
Kuo, are any of the tests larger than the others? Can they be made smaller?
I'm amazed that the config file test uses sm
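To make the caching idea concrete, here is a minimal sketch of a download
wrapper the examples (or the CI setup) could use so each .nc file is only
fetched once per build machine; the cache directory and helper name are
placeholders of mine, not existing OCW code:

    import os
    import urllib.request

    # Hypothetical cache location on the build server; point it at the CI workspace.
    CACHE_DIR = os.path.expanduser("~/.ocw_example_cache")

    def fetch_cached(url, filename):
        """Download url into CACHE_DIR once; later runs reuse the local copy."""
        os.makedirs(CACHE_DIR, exist_ok=True)
        local_path = os.path.join(CACHE_DIR, filename)
        if not os.path.exists(local_path):
            # First run: fetch the .nc file and keep it for the next build.
            urllib.request.urlretrieve(url, local_path)
        return local_path

The tarball idea would then just be a matter of pre-seeding CACHE_DIR with a
smaller, unpacked data set before the tests run.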
Concurring with Alex: 3-5 minutes each, with the majority of the time
spent reading the data. I'm working on a six-year-old Mac, so nothing
fancy there.
On Sun, Jul 17, 2016 at 4:06 AM, Ibrahim Jarif wrote:
Hi Alex,
We're trying to run all the examples in the climate/examples [0] directory.
The build gets stuck on the following files.
1. time_series_with_regions.py
2. subregions_portrait_diagram.py
3. multi_model_taylor_diagram.py
4. multi_model_evaluation.py
[0] - https://githu
Hi Lewis,
Do you know which examples specifically? The configuration file examples I
have most recently tested usually take roughly 3-5 minutes for me to run.
The majority of the time is spent downloading the data (as you have
mentioned) and then processing (mainly regridding). Depending on the
ex
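If it helps to confirm that breakdown, a rough way to time the two phases from
inside an example script (the wrapped calls below are placeholders for the
example's own download and regrid steps, not actual OCW API):

    import time
    from contextlib import contextmanager

    @contextmanager
    def timed(label):
        """Print how long the wrapped block takes."""
        start = time.perf_counter()
        yield
        print("%s: %.1fs" % (label, time.perf_counter() - start))

    # Hypothetical usage inside one of the examples:
    # with timed("download"):
    #     datasets = load_remote_datasets()      # placeholder for the data download
    # with timed("regrid"):
    #     regridded = regrid_datasets(datasets)  # placeholder for the regridding step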
Hi Folks,
Ibrahim and I have been working on running the examples as part of the OCW
Smoke Testing effort.
They seem to take a hellishly long time to complete, both for him (in India,
e.g. downloading the .nc data) and for me at JPL on an Ubuntu VM.
Does anyone else have an idea about how long the examples take?