Feature request: load a dataset's default parameters from the HDF5 file
Opened this issue · 3 comments
The default parameters of the dataset (e.g. `sources`, `spaces`, ...) could be stored directly in, and loaded from, the HDF5 file. The expected behavior would be to load them from the HDF5 file whenever they are not provided by the user as constructor arguments (either directly or through the YAML file).
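A minimal sketch of that fallback behavior, assuming h5py and a hypothetical `H5Dataset` class with "_"-prefixed datasets holding the stored defaults (the class, parameter names, and naming scheme are illustrative assumptions, not Fuel's actual API):

```python
import h5py

class H5Dataset:
    """Illustrative dataset whose unspecified parameters fall back
    to defaults stored in the HDF5 file itself."""

    def __init__(self, path, sources=None, spaces=None):
        with h5py.File(path, 'r') as f:
            self.sources = self._resolve(f, 'sources', sources)
            self.spaces = self._resolve(f, 'spaces', spaces)

    @staticmethod
    def _resolve(f, name, user_value):
        if user_value is not None:
            return user_value       # explicit constructor argument wins
        key = '_' + name            # assumed "_"-prefixed dataset name
        if key in f:
            return f[key][()]       # fall back to the stored default
        return None                 # provided nowhere
```

An explicitly passed `sources=['x']` would then shadow whatever the file stores, while omitting the argument reads the stored default.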
The most portable solution (i.e. compatible with both h5py and PyTables) would be to create a dataset (or Array, if you are more familiar with PyTables) for each parameter. To keep things clean, I suggest naming the dataset after the parameter, prefixed with "_" (e.g. to store the `sources` parameter, create a dataset `_sources`). I am not sure this solution would work when the parameters are structured objects (e.g. spaces).
To avoid cluttering the HDF5 file with one dataset per parameter, a single `parameters` table could contain all the required parameters. This would work with PyTables only, but should solve both the clutter problem and that of the structured parameters.
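One possible shape for that table, sketched with PyTables; serializing each value to a JSON string so that structured parameters fit in a single string column is an assumption, not part of the proposal:

```python
import json
import tables

def save_parameters_table(path, params):
    """Store all parameters as name/value rows of one `parameters` table."""
    with tables.open_file(path, mode='a') as f:
        table = f.create_table(
            '/', 'parameters',
            {'name': tables.StringCol(64),      # parameter name
             'value': tables.StringCol(1024)})  # JSON-serialized value
        row = table.row
        for name, value in params.items():
            row['name'] = name.encode('utf-8')
            row['value'] = json.dumps(value).encode('utf-8')
            row.append()
        table.flush()

def load_parameters_table(path):
    """Read the `parameters` table back into a parameter dict."""
    with tables.open_file(path, mode='r') as f:
        return {r['name'].decode('utf-8'):
                json.loads(r['value'].decode('utf-8'))
                for r in f.root.parameters.iterrows()}
```

Any JSON-representable structure (nested lists, dicts) round-trips through the single `value` column, which is what makes this variant friendlier to structured parameters.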
For storing spaces, could we use strings and store a YAML representation of the object?
YAML is supposed to be an optional front-end, and we don't have an easy way
to construct it from an object that was not instantiated from YAML.
On Tue, Feb 10, 2015 at 5:00 PM, Pascal Lamblin wrote:

> For storing spaces, could we use strings and store a YAML representation of the object?