Command-line tools

nabu-histogram: extract or compute a histogram of a reconstructed volume

This command only works for HDF5 output.

Ideally, the histogram is computed during the reconstruction, so that the volume does not have to be re-loaded from disk after reconstruction.

  • If the volume histogram is available in the final HDF5 file, then this command extracts this histogram and creates a dedicated file.

  • If not, then the full histogram is computed from the volume, which takes time. You can tune how to compute the histogram (number of bins and amount of memory to use).
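For illustration, the fallback computation amounts to scanning the volume slab by slab and accumulating a histogram, so the whole volume never has to fit in memory. Below is a minimal sketch using h5py and numpy; the dataset path, number of bins and slab size are placeholders, not nabu defaults (tuning the slab size is roughly what the memory option controls).

  import h5py
  import numpy as np

  def volume_histogram(fname, dataset_path, n_bins=2048, slab_size=64):
      # dataset_path, n_bins and slab_size are placeholders for this sketch
      with h5py.File(fname, "r") as f:
          vol = f[dataset_path]
          # First pass: global min/max to fix the bin edges
          vmin, vmax = np.inf, -np.inf
          for i in range(0, vol.shape[0], slab_size):
              slab = vol[i:i + slab_size]
              vmin = min(vmin, slab.min())
              vmax = max(vmax, slab.max())
          edges = np.linspace(vmin, vmax, n_bins + 1)
          hist = np.zeros(n_bins, dtype=np.uint64)
          # Second pass: accumulate the histogram slab by slab
          for i in range(0, vol.shape[0], slab_size):
              h, _ = np.histogram(vol[i:i + slab_size], bins=edges)
              hist += h.astype(np.uint64)
      return hist, edges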

nabu-double-flatfield: compute the “double flat-field” of a dataset

“Double flat-field” is a way to remove ring artefacts during pre-processing of the projections. The principle is to compute the average of all projections, which yields an (artificial) flat image. Then, each projection is divided by this artificial flat.

This CLI tool is used to generate the “artificial flat”, meaning that it simply computes the mean of all projections (or applies more involved processing if needed; please refer to the options). The resulting image can then be referenced in the nabu configuration file.
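As a rough illustration of the principle described above (not the tool's actual implementation), the basic processing boils down to a per-pixel mean followed by a division:

  import numpy as np

  def double_flatfield(projections):
      # projections: 3D stack (n_angles, n_rows, n_cols); in-memory sketch only
      # The "artificial flat" is the per-pixel mean over all projections
      artificial_flat = projections.mean(axis=0)
      # Dividing each projection by it suppresses structures that are
      # identical in every projection, i.e. the source of ring artefacts
      return projections / artificial_flat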

nabu-rotate: apply a rotation on all the images of a dataset

This command only works for HDF5 datasets. It performs a rotation by a given angle on each projection image of the input dataset.
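For reference, the operation is conceptually equivalent to the following sketch; scipy.ndimage.rotate is used here purely for illustration and is not necessarily what nabu relies on internally.

  import numpy as np
  from scipy.ndimage import rotate  # illustration only, an assumption

  def rotate_stack(projections, angle_deg):
      # Rotate each projection by the same angle; reshape=False keeps the
      # original image shape, order=1 means bilinear interpolation
      return np.stack(
          [rotate(img, angle_deg, reshape=False, order=1) for img in projections]
      )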

nabu-zsplit: split a H5-NX file according to z-positions

This command is only relevant for HDF5 datasets.

Some datasets are acquired with the sample stage moving vertically between each scan (“Z-series”). This is different from helical scans where the vertical movement occurs during the scan. In the case of Z-series, the sample stage moves vertically once a scan is completed, resulting in a series of datasets with different “z” values.

This command is used to split such datasets into several files, where each file has a distinct “z” value.

By default, this command creates no additional data (no duplication) and uses HDF5 virtual datasets.
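The following sketch illustrates what “no duplication” means in practice: an HDF5 virtual dataset only stores references to frames of the source file. File and dataset names here are placeholders for the example, not the layout produced by nabu-zsplit.

  import h5py

  def make_virtual_subset(src_file, src_path, frame_indices, dst_file, dst_path):
      # src_file/src_path/dst_file/dst_path are placeholder names
      with h5py.File(src_file, "r") as f:
          shape = f[src_path].shape
          dtype = f[src_path].dtype
      layout = h5py.VirtualLayout(shape=(len(frame_indices),) + shape[1:], dtype=dtype)
      source = h5py.VirtualSource(src_file, src_path, shape=shape)
      for out_idx, src_idx in enumerate(frame_indices):
          # Each frame of the new dataset points back to a frame of the source
          layout[out_idx] = source[src_idx]
      with h5py.File(dst_file, "w") as f:
          f.create_virtual_dataset(dst_path, layout)

Reading the new dataset then transparently fetches the frames from the original file, so no image data is copied.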

nabu-generate-info: generate the volume “.info file” for legacy tools

Some post-processing tools need the .info file that was generated by PyHST. Although all the information in this file can be obtained from the HDF5 reconstruction file, producing this .info file keeps these post-processing tools working without modification.
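As a hedged sketch, writing such a file is just dumping key/value pairs extracted from the HDF5 reconstruction; the keys below are illustrative and may not match the exact PyHST .info layout.

  def write_info_file(fname, volume_shape, vmin, vmax):
      # Illustrative keys only; check an actual PyHST-generated .info file
      nz, ny, nx = volume_shape
      entries = {
          "NUM_X": nx,
          "NUM_Y": ny,
          "NUM_Z": nz,
          "ValMin": vmin,
          "ValMax": vmax,
      }
      with open(fname, "w") as f:
          for key, value in entries.items():
              f.write("%s = %s\n" % (key, value))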

nabu-validator: check that a dataset can be reconstructed

Goal

This application checks a dataset to ensure it can be reconstructed.

By default it will check that phase retrieval and reconstruction can be done. This entails checking for default values and valid links (in the case of virtual HDF5 datasets).
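The “valid links” part can be pictured as follows: for a virtual dataset, every referenced source file and dataset must actually exist. The snippet below is only an illustration of the idea with h5py, not the validator's implementation.

  import os
  import h5py

  def has_broken_links(fname, dataset_path):
      # Simplified check: relative source paths are not resolved here
      with h5py.File(fname, "r") as f:
          dset = f[dataset_path]
          if not dset.is_virtual:
              return False
          for vsource in dset.virtual_sources():
              src_file = vsource.file_name
              if src_file == ".":  # "." means the source is in the same file
                  src_file = fname
              if not os.path.exists(src_file):
                  return True
              with h5py.File(src_file, "r") as sf:
                  if vsource.dset_name not in sf:
                      return True
      return False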

Usage

$: nabu-validator [-h] [--ignore-dark] [--ignore-flat] [--no-phase-retrieval] [--check-nan] [--skip-links-check] [--all-entries] [--extend] path [entries [entries ...]]

Check if provided scan(s) seems valid to be reconstructed.

positional arguments:
  path                  Data to validate (h5 file, edf folder)
  entries               Entries to be validated (in the case of a h5 file)

optional arguments:
  -h, --help            show this help message and exit
  --ignore-dark         Do not check for dark
  --ignore-flat         Do not check for flat
  --no-phase-retrieval  Check scan energy, distance and pixel size
  --check-nan           Check frames if contains any nan.
  --skip-links-check, --no-link-check
                        Check frames dataset if have some broken links.
  --all-entries         Check all entries of the files (for HDF5 only for now)
  --extend              By default it only display items with issues. Extend will display them all

Example

On the bamboo_hercules dataset, some data has not been copied, which results in the following output:

$: nabu-validator bambou_hercules_0001_1_1.nx --extend
💥💣💥
 3 issues found from hdf5 scan(master_file: bamboo_hercules/bambou_hercules_0001/bambou_hercules_0001_1_1.nx, entry: entry0000)
   - projection(s)  : INVALID - At least one dataset seems to have broken link
   - dark(s)        : INVALID - At least one dataset seems to have broken link
   - flat(s)        : INVALID - At least one dataset seems to have broken link
   + distance       :  VALID
   + energy         :  VALID
   + pixel size     :  VALID

Same example but with all related data copied (link):

👌👍👌
 No issue found from hdf5 scan(master_file: bamboo_hercules/bambou_hercules_0001/bambou_hercules_0001_1_1.nx, entry: entry0000).
   + projection(s)  :  VALID
   + dark(s)        :  VALID
   + flat(s)        :  VALID
   + distance       :  VALID
   + energy         :  VALID
   + pixel size     :  VALID