convert: HDF5 conversion
This module provides classes and functions to convert file formats supported by silx into HDF5 files. Currently, SPEC files and fabio images are the supported formats.
Read the documentation of silx.io.spech5, silx.io.fioh5 and silx.io.fabioh5 for information on the structure of the output HDF5 files.
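For instance, converting a SPEC file to an HDF5 file can be a single call to convert() (a minimal sketch; the file names below are placeholders):

>>> from silx.io.convert import convert
>>> # "myscans.dat" and "myscans.h5" are hypothetical file names
>>> convert("myscans.dat", "myscans.h5")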
Text strings are written to the HDF5 datasets as variable-length UTF-8.
Warning
The output format for text strings changed in silx version 0.7.0. Prior to that, text was output as fixed-length ASCII.
To be on the safe side when reading back an HDF5 file written with an older version of silx, you can test for the presence of a decode attribute to ensure that you always work with unicode text:
>>> import h5py
>>> h5f = h5py.File("my_scans.h5", "r")
>>> title = h5f["/68.1/title"][()]
>>> if hasattr(title, "decode"):
...     title = title.decode()
Note
This module depends on the h5py library, which is not a mandatory dependency of silx. You may need to install it if you don't already have it.
- write_to_h5(infile, h5file, h5path='/', mode='a', overwrite_data=False, link_type='soft', create_dataset_args=None, min_size=500)
Write the content of an h5py-like object into an HDF5 file.
Warning: External links in infile are ignored.
- Parameters
  - infile – Path of input file, or commonh5.File, commonh5.Group, h5py.File or h5py.Group
  - h5file – Path of output HDF5 file, or HDF5 file handle (h5py.File object)
  - h5path (str) – Target path in the HDF5 file in which scan groups are created. Default is the root ("/")
  - mode (str) – Can be "r+" (read/write, file must exist), "w" (write, existing file is lost), "w-" (write, fail if exists) or "a" (read/write if exists, create otherwise). This parameter is ignored if h5file is a file handle.
  - overwrite_data (bool) – If True, existing groups and datasets can be overwritten; if False, they are skipped. This parameter is only relevant if mode is "r+" or "a".
  - link_type (str) – "soft" (default) or "hard"
  - create_dataset_args (dict) – Dictionary of args you want to pass to h5py.File.create_dataset. This allows you to specify filters and compression parameters. Don't specify name and data. These arguments are only applied to datasets larger than 1MB.
  - min_size (int) – Minimum number of elements in a dataset to apply chunking and compression. Default is 500.
The structure of the SPEC data in an HDF5 file is described in the documentation of silx.io.spech5.
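For example, the content of a SPEC file opened with silx.io.spech5.SpecH5 can be appended under a dedicated group of an existing HDF5 file (a minimal sketch; the file names and the target h5path are placeholders):

>>> from silx.io.spech5 import SpecH5
>>> from silx.io.convert import write_to_h5
>>> # "myscans.dat", "all_data.h5" and "/spec_data" are hypothetical names
>>> sfh5 = SpecH5("myscans.dat")
>>> write_to_h5(sfh5, "all_data.h5", h5path="/spec_data", mode="a")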
- convert(infile, h5file, mode='w-', create_dataset_args=None)
Convert a supported file into an HDF5 file, writing scans into the root group ("/").
This is a convenience shortcut to call:
write_to_h5(h5like, h5file, h5path='/', mode="w-", link_type="soft")
- Parameters
  - infile – Path of input file, or commonh5.File object, or commonh5.Group object
  - h5file – Path of output HDF5 file, or h5py.File object
  - mode – Can be "w" (write, existing file is lost) or "w-" (write, fail if exists). This is ignored if h5file is a file handle.
  - create_dataset_args – Dictionary of args you want to pass to h5py.File.create_dataset. This allows you to specify filters and compression parameters. Don't specify name and data.
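As a sketch of how create_dataset_args can pass compression options through to h5py.File.create_dataset (the file names below are placeholders):

>>> from silx.io.convert import convert
>>> # "myscans.dat" and "compressed_scans.h5" are hypothetical file names
>>> convert("myscans.dat", "compressed_scans.h5", mode="w-",
...         create_dataset_args={"compression": "gzip", "compression_opts": 9})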