Christopher Barker wrote:
On 11/3/10 7:19 PM, John Caron wrote:
> 1) It seems clear that at each time step, you need to write out the data
> for whatever particles currently exist. I assume that if you wanted to
> break up the data for a very long run, you would partition by time, i.e.
> time steps 1-1000, 1001-2000, etc. would be [...]
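The point quoted above — at each step, write out only the particles that currently exist, and split a long run into files by time-step range — amounts to a ragged per-step record layout. A minimal sketch in plain Python follows; the `partition` helper, chunk size, and particle records are illustrative assumptions, not an agreed file layout (a real store would use netCDF, but the bookkeeping is the same):

```python
# Sketch: per-time-step "ragged" storage for a particle set whose
# membership changes over time, partitioned into chunks of 1000 steps.
# All names here are hypothetical, for illustration only.

def partition(steps, chunk=1000):
    """Group (step, particles) records into files covering
    steps 1-1000, 1001-2000, etc."""
    files = {}
    for step, particles in steps:
        key = (step - 1) // chunk          # 0 -> steps 1..1000, 1 -> 1001..2000
        files.setdefault(key, []).append((step, particles))
    return files

# At each step we record only the particles that currently exist,
# so record lengths vary from step to step (a ragged array).
# Each particle is (id, x, y).
run = [
    (1,    [(101, 0.0, 0.0), (102, 1.0, 0.5)]),   # two particles exist
    (2,    [(101, 0.1, 0.0)]),                    # particle 102 is gone
    (1001, [(103, 2.0, 2.0)]),                    # falls in the next file
]

chunks = partition(run)   # two file chunks: steps 1-1000 and 1001-2000
```

The per-file index is derived from the step number alone, so a reader can locate the file holding any step without scanning the whole run.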
On 04.11.2010 16:07, John Caron wrote:
> On 11/4/2010 5:50 AM, Ben Hetland wrote:
> [...]
>> In a recent case I had, the data were to be transferred to
>> a client computer over the Internet for viewing locally. In that case
>> reducing the content of the file to the absolute minimum set of
>> proper [...]
On 11/4/2010 5:50 AM, Ben Hetland wrote:
Hello List,
Since I'm also dealing a bit with data sets holding "particle
collections" through time, I'd like to contribute some thoughts
regarding this. Our primary use here at SINTEF is for inputs to and
results from oil drift simulations, so we're dealing mostly at the sea
surface and below [...]