pyFAI.io package

pyFAI.io.image module

Module function to read images.

pyFAI.io.image.read_data(image_path)

Returns a numpy.array image from a file name or a URL.

Parameters:

image_path (str) – Path of the image file

Return type:

numpy.ndarray, regardless of the dimension or the content

Raises:
  • IOError – if the data is not reachable

  • TypeError – if the data is not an image (wrong size, wrong dimension)

pyFAI.io.image.read_image_data(image_path)

Returns a numpy.array image from a file name or a URL.

Parameters:

image_path (str) – Path of the image file

Return type:

numpy.ndarray

Raises:
  • IOError – if the data is not reachable

  • TypeError – if the data is not an image (wrong size, wrong dimension)
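
A minimal usage sketch (the file name "frame.edf" is hypothetical; the call and the exceptions follow the descriptions above):

    from pyFAI.io.image import read_image_data

    try:
        img = read_image_data("frame.edf")   # hypothetical path; a URL works as well
        print(img.shape, img.dtype)
    except (IOError, TypeError) as err:
        print("not a readable image:", err)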

pyFAI.io.integration_config module

Module function to manage configuration files, all serialisable to JSON.

class pyFAI.io.integration_config.ConfigurationReader(config)

Bases: object

__init__(config)
Parameters:

config – configuration dictionary

pop_detector()

Returns the detector stored in the json configuration.

Return type:

pyFAI.detectors.Detector

pop_method(default=None)

Returns a Method built from the method field of the JSON dictionary.

Return type:

pyFAI.method_registry.Method

pop_ponifile()

Returns the geometry subpart of the configuration

pyFAI.io.integration_config.normalize(config, inplace=False, do_raise=False)

Normalize the configuration to the latest format supported internally.

Parameters:
  • config (dict) – The configuration dictionary to read

  • inplace (bool) – If true, the dictionary is edited in place

  • do_raise (bool) – if set, raise a ValueError on failure; otherwise report via logger.error

Raises:

ValueError – if the configuration does not match and do_raise is set
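
A hedged sketch of the intended workflow: load a worker configuration from JSON, upgrade it with normalize (assuming the upgraded dictionary is returned when inplace is left False), then pull out its parts with ConfigurationReader. The file name "worker.json" and its content are placeholders.

    import json

    from pyFAI.io.integration_config import ConfigurationReader, normalize

    with open("worker.json") as f:               # hypothetical worker configuration file
        config = json.load(f)

    config = normalize(config, do_raise=True)    # upgrade to the latest supported layout
    reader = ConfigurationReader(config)
    detector = reader.pop_detector()             # pyFAI.detectors.Detector
    method = reader.pop_method()                 # pyFAI.method_registry.Method
    poni = reader.pop_ponifile()                 # geometry sub-part of the configuration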

pyFAI.io.nexus module

Module for writing HDF5 in the Nexus style

class pyFAI.io.nexus.Nexus(filename, mode=None, creator=None, start_time=None)

Bases: object

Writer class to handle Nexus/HDF5 data

Manages:

  • entry

    • pyFAI-subentry

      • detector

TODO: make it thread-safe !!!

__init__(filename, mode=None, creator=None, start_time=None)

Constructor

Parameters:
  • filename – name of the hdf5 file containing the nexus

  • mode – can be ‘r’, ‘a’, ‘w’, ‘+’ ….

  • creator – set as attr of the NXroot

  • start_time – set as attr of the NXroot

close(end_time=None)

Close the file and update all entries.

deep_copy(name, obj, where='/', toplevel=None, excluded=None, overwrite=False)

perform a deep copy: create a “name” entry in self containing a copy of the object

Parameters:
  • where – path to the toplevel object (i.e. root)

  • toplevel – directly the top-level Group

  • excluded – list of keys to be excluded

  • overwrite – replace content if already existing

find_detector(all=False)

Tries to find a detector within a NeXus file; takes the first compatible detector.

Parameters:

all – return all detectors found as a list

flush()
classmethod get_attr(dset, name, default=None)

Return the attribute of the dataset

Handles the ascii -> unicode issue in python3 #275

Parameters:
  • dset – a HDF5 dataset (or a group)

  • name – name of the attribute

  • default – default value to be returned

Returns:

attribute value decoded in python3 or default

get_class(grp, class_type='NXcollection')

return all sub-groups of the given type within a group

Parameters:
  • grp – HDF5 group

  • class_type – name of the NeXus class

get_data(grp, class_type='NXdata')

Return all datasets of the NeXus class NXdata. WRONG, do not use…

Parameters:
  • grp – HDF5 group

  • class_type – name of the NeXus class

get_dataset(grp, attr=None, value=None)

Return the list of datasets in the group whose given attribute matches the given value.

Parameters:
  • grp – HDF5 group

  • attr – name of an attribute

  • value – requested value for the attribute

Returns:

list of datasets

get_default_NXdata()

Return the default plot configured in the nexus structure.

Returns:

the group with the default plot or None if not found

get_entries()

Retrieves all entries, sorted latest first.

Returns:

list of HDF5 groups

get_entry(name)

Retrieves an entry from its name

Parameters:

name – name of the entry to retrieve

Returns:

HDF5 group of NXclass == NXentry
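
A small read-back sketch using the accessors above; "result.h5" is a hypothetical file previously written by pyFAI:

    from pyFAI.io.nexus import Nexus

    nxs = Nexus("result.h5", mode="r")           # hypothetical file written by pyFAI
    entries = nxs.get_entries()                  # HDF5 groups, latest first
    if entries:
        nxdata_groups = nxs.get_class(entries[0], class_type="NXdata")
    nxs.close()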

new_class(grp, name, class_type='NXcollection')

Create a new sub-group of the given NeXus class.

Parameters:
  • grp – parent group

  • name – name of the sub-group

  • class_type – NeXus class name

Returns:

subgroup created

new_detector(name='detector', entry='entry', subentry='pyFAI')

Create a new entry/pyFAI/Detector

Parameters:
  • name – name of the detector

  • entry – name of the entry

  • subentry – all pyFAI description of detectors should be in a pyFAI sub-entry

new_entry(entry='entry', program_name='pyFAI', title=None, force_time=None, force_name=False)

Create a new entry

Parameters:
  • entry – name of the entry

  • program_name – value of the field as string

  • title – description of experiment as str

  • force_time – enforce the start_time (as string!)

  • force_name – force the entry name as such, without numerical suffix.

Returns:

the corresponding HDF5 group

new_instrument(entry='entry', instrument_name='id00')

Create an instrument in an entry, or create both the entry and the instrument if needed.
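
A minimal writing sketch combining the methods above; the file name, entry title and group names are illustrative only:

    import numpy
    from pyFAI.io.nexus import Nexus

    nxs = Nexus("result.h5", mode="w", creator="my_script")
    entry = nxs.new_entry(entry="entry", title="integration demo")
    data_grp = nxs.new_class(entry, "integrated", class_type="NXdata")
    data_grp["intensity"] = numpy.zeros(1000)    # placeholder dataset
    nxs.close()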

pyFAI.io.nexus.from_isotime(text, use_tz=False)
Parameters:

text – string representing the time in ISO format

pyFAI.io.nexus.get_isotime(forceTime=None)
Parameters:

forceTime (float) – enforce a given time (current by default)

Returns:

the current time as an ISO8601 string

Return type:

string

pyFAI.io.nexus.is_hdf5(filename)

Check if a file is actually a HDF5 file

Parameters:

filename – path of the file to check; the file should exist
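
Both helpers in one short, hedged sketch ("result.h5" is again a hypothetical file):

    from pyFAI.io.nexus import get_isotime, is_hdf5

    timestamp = get_isotime()                    # current time as an ISO8601 string
    print(is_hdf5("result.h5"))                  # True only if the (hypothetical) file is HDF5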

pyFAI.io.nexus.load_nexus(filename)

Tries to read back data from a NeXus file written by pyFAI.

Parameters:

filename – the name of the nexus file

Returns:

parsed result
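
A one-line sketch; the file name is hypothetical and the type of the parsed result depends on what was stored:

    from pyFAI.io.nexus import load_nexus

    result = load_nexus("integrated.h5")         # parsed result, e.g. an integration result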

pyFAI.io.nexus.save_NXcansas(filename, result, title='something descriptive yet short', run='run-number', entry='entry', instrument='beamline', source_name='ESRF', source_type='synchotron', source_probe='x-ray', sample='sample', extra=None)

Save integrated data into an HDF5 file following the NeXus canSAS application definition: https://manual.nexusformat.org/classes/applications/NXcanSAS.html

Parameters:
  • filename – name of the file to be written

  • result – instance of Integrate1dResult

  • title – title of the experiment

  • entry – name of the entry

  • instrument – name/brand of the instrument

  • source_name – name/brand of the particle source

  • source_type – kind of source as a string

  • source_probe – Any of these values: ‘neutron’ | ‘x-ray’ | ‘electron’

  • sample – sample name

  • extra – extra metadata as a dict

pyFAI.io.nexus.save_NXmonpd(filename, result, title='monopd', entry='entry', instrument='beamline', source_name='ESRF', source_type='synchotron', source_probe='x-ray', sample='sample', extra=None)

Save integrated data into an HDF5 file following the NeXus powder diffraction application definition: https://manual.nexusformat.org/classes/applications/NXmonopd.html

Parameters:
  • filename – name of the file to be written

  • result – instance of Integrate1dResult

  • title – title of the experiment

  • entry – name of the entry

  • instrument – name/brand of the instrument

  • source_name – name/brand of the particle source

  • source_type – kind of source as a string

  • source_probe – Any of these values: ‘neutron’ | ‘x-ray’ | ‘electron’

  • sample – sample name

  • extra – extra metadata as a dict
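
A hedged sketch covering both writers; ai (an AzimuthalIntegrator) and img (a detector frame) are assumed to exist already, and the file names are illustrative:

    from pyFAI.io.nexus import save_NXcansas, save_NXmonpd

    result = ai.integrate1d(img, 1000, unit="q_nm^-1")   # ai and img assumed to exist
    save_NXcansas("saxs.h5", result, title="water capillary", sample="water")
    save_NXmonpd("powder.h5", result, title="LaB6 powder", sample="LaB6")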

pyFAI.io.ponifile module

Module function to manage poni files.

class pyFAI.io.ponifile.PoniFile(data=None)

Bases: object

API_VERSION = 2.1
__init__(data=None)
as_dict()
property detector
Return type:

Union[None, object]

property dist
Return type:

Union[None,float]

make_headers(type_='list')

Generate a header for files, as a list, dict, or str

property poni1
Return type:

Union[None,float]

property poni2
Return type:

Union[None,float]

read_from_dict(config)

Initialize this object using a dictionary.

Note

The dictionary is versioned.

Version:

  • 1: Historical version (i.e. unversioned)

  • 2: store detector and detector_config instead of pixelsize1, pixelsize2 and splinefile

  • 2.1: manage orientation of detector in detector_config

read_from_duck(duck)

Initialize the object using an object providing the same API.

The duck object must provide dist, poni1, poni2, rot1, rot2, rot3, wavelength, and detector.

read_from_file(filename)
read_from_geometryModel(model: GeometryModel, detector=None)

Initialize the object from a GeometryModel.

Parameters:

model – instance of pyFAI.gui.model.GeometryModel.GeometryModel

property rot1
Return type:

Union[None,float]

property rot2
Return type:

Union[None,float]

property rot3
Return type:

Union[None,float]

property wavelength
Return type:

Union[None,float]

write(fd)

Write this object to an open stream.

Parameters:

fd – file descriptor (opened file)

Returns:

nothing
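
A short sketch of the typical round-trip using the methods documented above; "detector.poni" and "copy.poni" are hypothetical files:

    from pyFAI.io.ponifile import PoniFile

    poni = PoniFile()
    poni.read_from_file("detector.poni")         # hypothetical PONI file
    print(poni.dist, poni.wavelength, poni.detector)
    with open("copy.poni", "w") as fd:
        poni.write(fd)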

pyFAI.io.sparse_frame module

Module for writing sparse frames in HDF5 in the Nexus style

pyFAI.io.sparse_frame.save_sparse(filename, frames, beamline='beamline', ai=None, source=None, extra={}, start_time=None)

Write the list of frames into a HDF5 file

Parameters:
  • filename – name of the file

  • frames – list of sparse frames (as built by sparsify)

  • beamline – name of the beamline as text

  • ai – Instance of geometry or azimuthal integrator

  • source – list of input files

  • extra – dict with extra metadata

  • start_time – float with the time of start of the processing

Returns:

None

Module contents

Module for “high-performance” writing in either 1D with ASCII, 2D with FabIO, or even nD (with n varying from 2 to 4) using HDF5

Stand-alone module which tries to offer an interface to HDF5 via h5py and the ability to write EDF or other formats using FabIO.

Can be imported without h5py, but is then limited to FabIO & ASCII formats.

TODO:

  • Add monitor to HDF5

class pyFAI.io.AsciiWriter(filename=None, prefix='fai_', extension='.dat')

Bases: Writer

Ascii file writer (.xy or .dat)

__init__(filename=None, prefix='fai_', extension='.dat')
init(fai_cfg=None, lima_cfg=None)

Creates the directory that will host the output file(s)

write(data, index=0)

To be implemented

class pyFAI.io.DefaultAiWriter(filename, engine=None)

Bases: Writer

__init__(filename, engine=None)

Constructor of the historical writer of azimuthalIntegrator.

Parameters:
  • filename – name of the output file

  • engine – integrator; should provide a make_headers method.

close()
flush()

To be implemented

init(fai_cfg=None, lima_cfg=None)

Creates the directory that will host the output file(s)

Parameters:
  • fai_cfg – configuration for worker

  • lima_cfg – configuration for acquisition

make_headers(hdr='#', has_mask=None, has_dark=None, has_flat=None, polarization_factor=None, normalization_factor=None, metadata=None)
Parameters:
  • hdr (str) – string used as comment in the header

  • has_dark (bool) – save the darks filenames (default: no)

  • has_flat (bool) – save the flat filenames (default: no)

  • polarization_factor (float) – the polarization factor

Returns:

the header

Return type:

str

save1D(filename, dim1, I, error=None, dim1_unit='2th_deg', has_mask=None, has_dark=False, has_flat=False, polarization_factor=None, normalization_factor=None, metadata=None)

This method saves the result of a 1D integration as an ASCII file.

Parameters:
  • filename (str) – the filename used to save the 1D integration

  • dim1 (numpy.ndarray) – the x coordinates of the integrated curve

  • I (numpy.ndarray) – The integrated intensity

  • error (numpy.ndarray or None) – the error bar for each intensity

  • dim1_unit (pyFAI.units.Unit) – the unit of the dim1 array

  • has_mask – a mask was used

  • has_dark – a dark-current was applied

  • has_flat – flat-field was applied

  • polarization_factor (float, None) – the polarization factor

  • normalization_factor (float, None) – the monitor value

  • metadata – JSON serializable dictionary containing the metadata

save2D(filename, I, dim1, dim2, error=None, dim1_unit='2th_deg', dim2_unit='chi_deg', has_mask=None, has_dark=False, has_flat=False, polarization_factor=None, normalization_factor=None, metadata=None, format_='edf')

This method saves the result of a 2D integration.

Parameters:
  • filename (str) – the filename used to save the 2D histogram

  • dim1 (numpy.ndarray) – the 1st coordinates of the histogram

  • dim2 – the 2nd coordinates of the histogram

  • I (numpy.ndarray) – The integrated intensity

  • error (numpy.ndarray or None) – the error bar for each intensity

  • dim1_unit (pyFAI.units.Unit) – the unit of the dim1 array

  • dim2_unit (pyFAI.units.Unit) – the unit of the dim2 array

  • has_mask – a mask was used

  • has_dark – a dark-current was applied

  • has_flat – flat-field was applied

  • polarization_factor (float, None) – the polarization factor

  • normalization_factor (float, None) – the monitor value

  • metadata – JSON serializable dictionary containing the metadata

  • format – file-format to be used (FabIO format)

set_filename(filename)

Define the filename which will be used

write(data)

Minimalistic method to limit the overhead.

Parameters:

data (Integrate1dResult or Integrate2dResult) – array with intensities or tuple (2th, I) or (I, 2th, chi)
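
A minimal sketch of the historical writer; ai (an AzimuthalIntegrator) and result (an Integrate1dResult) are assumed to exist already:

    from pyFAI.io import DefaultAiWriter

    writer = DefaultAiWriter("integrated.dat", engine=ai)   # ai: AzimuthalIntegrator, assumed
    writer.write(result)                         # Integrate1dResult or Integrate2dResult, assumed
    writer.close()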

class pyFAI.io.FabioWriter(filename=None, extension=None, directory='', prefix=None, index_format='_%04d', start_index=0, fabio_class=None)

Bases: Writer

Image file writer based on FabIO

__init__(filename=None, extension=None, directory='', prefix=None, index_format='_%04d', start_index=0, fabio_class=None)

Constructor of the class

Parameters:
  • filename

  • extension

  • prefix – basename of the file

  • index_format – “_%04d” gives “_0001” for example

  • start_index – often 0 or 1

  • fabio_class – type of file to write

close()
init(fai_cfg=None, lima_cfg=None, directory='pyFAI')

Creates the directory that will host the output file(s)

write(data, index=None, header=None)
Parameters:
  • data – 2d array to save

  • index – index of the file

  • header

class pyFAI.io.HDF5Writer(filename, hpath=None, entry_template=None, fast_scan_width=None, append_frames=False, mode='error')

Bases: Writer

Class allowing to write HDF5 Files.

CONFIG = 'configuration'
DATASET_NAME = 'data'
MODE_APPEND = 'append'
MODE_DELETE = 'delete'
MODE_ERROR = 'error'
MODE_OVERWRITE = 'overwrite'
__init__(filename, hpath=None, entry_template=None, fast_scan_width=None, append_frames=False, mode='error')

Constructor of an HDF5 writer:

Parameters:
  • filename (str) – name of the file

  • hpath (str) – Name of the entry group that will contain the NXprocess.

  • entry_template (str) – Formattable template to create a new entry (if hpath is not specified)

  • fast_scan_width (int) – set it to define the width of the fast scan

close()
flush(radial=None, azimuthal=None)

Update some data like axis units and so on.

Parameters:
  • radial – position in radial direction

  • azimuthal – position in azimuthal direction

init(fai_cfg=None, lima_cfg=None)

Initializes the HDF5 file for writing.

Parameters:

fai_cfg – the configuration of the worker as a dictionary

set_hdf5_input_dataset(dataset)

record the input dataset with an external link

write(data, index=None)

Minimalistic method to limit the overhead.

Parameters:

data – array with intensities or tuple (2th, I) or (I, 2th, chi)
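
A hedged sketch of a frame-by-frame loop; worker_config (a worker configuration dictionary) and results (a sequence of integration results) are assumptions, as is the exact content expected by init:

    from pyFAI.io import HDF5Writer

    writer = HDF5Writer("stack.h5", hpath="entry_0000", mode="overwrite")
    writer.init(fai_cfg=worker_config)           # worker configuration dict, assumed
    for i, res in enumerate(results):            # sequence of integration results, assumed
        writer.write(res, index=i)
    writer.close()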

class pyFAI.io.Writer(filename=None, extension=None)

Bases: object

Abstract class for writers.

CONFIG_ITEMS = ['filename', 'dirname', 'extension', 'subdir', 'hpath']
__init__(filename=None, extension=None)

Constructor of the class

flush(*arg, **kwarg)

To be implemented

init(fai_cfg=None, lima_cfg=None)

Creates the directory that will host the output file(s)

Parameters:
  • fai_cfg – configuration for worker

  • lima_cfg – configuration for acquisition

setJsonConfig(json_config=None)

Sets the JSON configuration

write(data)

To be implemented

pyFAI.io.save_integrate_result(filename, result, title='title', sample='sample', instrument='beamline')

Dispatcher for saving in different formats
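
A hedged sketch; result is assumed to be an existing integration result, and the output format is presumably selected from the file extension:

    from pyFAI.io import save_integrate_result

    save_integrate_result("integrated.h5", result, title="scan 42",
                          sample="capillary", instrument="ID00")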