I like the idea of a function handle; however, I would also like the
ability to append to a function handle, for example in restarted,
adaptive or mesh-altering simulations. This can easily be done by
creating a new group with metadata whenever a function handle is created
in append mode. There is no need for hashing.

Using internal links to metadata when rewriting functions would also allow
us to use the existing code for reading back functions.
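
As a rough illustration only, an append-mode write could look something
like the following sketch (using h5py directly rather than the proposed
dolfin classes; all group, dataset and attribute names are just
placeholders):

    import h5py

    # Open in append mode; reuse the function's group if it already exists.
    f = h5py.File("example.h5", "a")
    g = f.require_group("/my_function")

    # Each write in append mode gets its own subgroup plus metadata, so
    # restarted, adaptive or mesh-altering runs simply add new entries.
    step = int(g.attrs.get("n_steps", 0))
    sub = g.create_group("step_%d" % step)
    sub.attrs["timestamp"] = float(step)
    sub.create_dataset("vector", data=[0.0, 1.0, 2.0])

    if step == 0:
        # First write: store the mesh data alongside the vector.
        sub.create_dataset("mesh", data=[[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    else:
        # Later writes: an internal (soft) link back to the earlier mesh
        # lets the existing read-back code follow the same paths as before.
        sub["mesh"] = h5py.SoftLink("/my_function/step_0/mesh")

    g.attrs["n_steps"] = step + 1
    f.close()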

-Øyvind
On 15 Oct 2013 13:13, "Chris Richardson" <[email protected]> wrote:

>
> On 15/10/2013 08:17, Øyvind Evju wrote:
>
>> 2013/10/12 Chris Richardson <[email protected]>
>>
>>> It has been suggested to enhance the HDF5File interface.
>>>
>>> I am collecting ideas and comments via the mailing list. Please
>>> reply if you have any opinions about this at all.
>>>
>>> HDF5 files are like filesystems, and can contain groups (like
>>> directories) and datasets (like files).
>>> Each group or dataset can also have attributes, which may be
>>> string, double, or vector<double> for example.
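>>>
>>> As a minimal illustration of that layout (using h5py directly; the
>>> names below are only placeholders):
>>>
>>>     import h5py
>>>
>>>     f = h5py.File("layout.h5", "w")
>>>     # A group is like a directory; a dataset is like a file.
>>>     g = f.create_group("/results")
>>>     d = g.create_dataset("vector_0", data=[1.0, 2.0, 3.0])
>>>     # Attributes can be attached to either groups or datasets.
>>>     g.attrs["time"] = 0.0
>>>     d.attrs["name"] = "velocity"
>>>     f.close()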
>>>
>>> Currently, HDF5File in dolfin allows the following:
>>>
>>> read and write of Vector, Function, MeshFunction, Mesh and
>>> MeshValueCollection objects, via simple write(object, "dataset_name")
>>> or read(object, "dataset_name") methods; see HDF5File.h.
>>>
>>> Complex objects, such as Functions, are saved as HDF5 groups
>>> containing datasets for e.g. the dofmap, the vector, etc.
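>>>
>>> For reference, current usage looks roughly like this (following the
>>> existing interface; the file and dataset names are arbitrary):
>>>
>>>     from dolfin import *
>>>
>>>     mesh = UnitSquareMesh(8, 8)
>>>     u = Function(FunctionSpace(mesh, "CG", 1))
>>>
>>>     hdf = HDF5File("current.h5", "w")
>>>     hdf.write(mesh, "/mesh")
>>>     hdf.write(u, "/u")   # stored as a group with dofmap, vector, ...
>>>
>>>     hdf = HDF5File("current.h5", "r")
>>>     hdf.read(u, "/u")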
>>>
>>> Proposal
>>> --------
>>> Allow time-series of any of the usual objects listed above. This
>>> could also be used to reorganise TimeSeriesHDF5 (in dolfin/adaptivity).
>>>
>>> Implementation
>>> --------------
>>> Rather than being written just once, a Function or Vector may be
>>> written many times, perhaps with timestamp information.
>>>
>>>     u = Function(Q)
>>>     hdf = HDF5File("example.h5", "w")
>>>
>>>     # Suggestions for better object names are welcome...
>>>     hdf_u = HDF5FunctionHandle("/my_function", u)
>>>     hdf_u.write(0.0)
>>>     ...
>>>     # later time
>>>     hdf_u.write(1.0)
>>>
>>> To read back in:
>>>
>>>     u = Function(Q)
>>>     hdf = HDF5File("example.h5", "r")
>>>     hdf_u = HDF5FunctionHandle("/my_function", u)
>>>     # read back at time t=0.0
>>>     t=0.0
>>>     hdf_u.read(t)
>>>     # or just read back the next in sequence
>>>     hdf_u.read()
>>>
>>
>> I'm curious about the HDF5FunctionHandle. I guess you would need to
>> pass the HDF5File object at initialisation?
>>
>>
> Yes, I guess I forgot that...
>
>     hdf_u = HDF5FunctionHandle(hdf, "/my_function", u)
>
> The FunctionHandle would hold a shared_ptr to the Function, so future
> writes are guaranteed to use the same Function (at least), and there is
> no need to hash anything.
> I was thinking of just storing the updated vectors in the same group.
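>
> So, after a few writes, the layout might look something like this (a
> sketch only; the exact dataset and attribute names are still open):
>
>     /my_function            (group for the Function)
>         dofmap, ...         (written once)
>         vector_0            (attribute: timestamp = 0.0)
>         vector_1            (attribute: timestamp = 1.0)
>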
_______________________________________________
fenics mailing list
[email protected]
http://fenicsproject.org/mailman/listinfo/fenics
