Compare revisions

Showing with 1497 additions and 311 deletions
``fsl.wrappers.randomise``
==========================
.. automodule:: fsl.wrappers.randomise
:members:
:undoc-members:
:show-inheritance:
......@@ -4,15 +4,29 @@
.. toctree::
:hidden:
fsl.wrappers.avwutils
fsl.wrappers.bedpostx
fsl.wrappers.bet
fsl.wrappers.bianca
fsl.wrappers.cluster_commands
fsl.wrappers.dtifit
fsl.wrappers.eddy
fsl.wrappers.epi_reg
fsl.wrappers.fast
fsl.wrappers.feat
fsl.wrappers.first
fsl.wrappers.flirt
fsl.wrappers.fnirt
fsl.wrappers.fsl_anat
fsl.wrappers.fsl_sub
fsl.wrappers.fslmaths
fsl.wrappers.fslstats
fsl.wrappers.fugue
fsl.wrappers.melodic
fsl.wrappers.misc
fsl.wrappers.oxford_asl
fsl.wrappers.randomise
fsl.wrappers.tbss
fsl.wrappers.wrapperutils
.. automodule:: fsl.wrappers
......
``fsl.wrappers.tbss``
=====================
.. automodule:: fsl.wrappers.tbss
:members:
:undoc-members:
:show-inheritance:
doc/images/fnirt_coefficient_field.png

97 KiB

File added
doc/images/fnirt_warp_field.png

81.9 KiB

File added
doc/images/nonlinear_registration_process.png

66.3 KiB

File added
......@@ -4,12 +4,57 @@
The ``fslpy`` package is a collection of utilities and data abstractions used
within `FSL <https://fsl.fmrib.ox.ac.uk/fsl/fslwiki>`_ and by
|fsleyes_apidoc|_.
The top-level Python package for ``fslpy`` is called :mod:`fsl`. It is
broadly split into the following sub-packages:
+----------------------+-----------------------------------------------------+
| :mod:`fsl.data` | contains data abstractions and I/O routines for a |
| | range of FSL and neuroimaging file types. Most I/O |
| | routines use `nibabel <https://nipy.org/nibabel/>`_ |
| | extensively. |
+----------------------+-----------------------------------------------------+
| :mod:`fsl.utils` | contains a range of miscellaneous utilities, |
| | including :mod:`fsl.utils.path`, |
| | :mod:`fsl.utils.run`, and :mod:`fsl.utils.bids` |
+----------------------+-----------------------------------------------------+
| :mod:`fsl.scripts` | contains a range of scripts which are installed as |
| | FSL commands. |
+----------------------+-----------------------------------------------------+
| :mod:`fsl.transform` | contains functions and classes for working with |
| | FSL-style linear and non-linear transformations. |
+----------------------+-----------------------------------------------------+
| :mod:`fsl.version` | simply contains the ``fslpy`` version number. |
+----------------------+-----------------------------------------------------+
| :mod:`fsl.wrappers` | contains Python functions which can be used to |
| | invoke FSL commands. |
+----------------------+-----------------------------------------------------+
The :mod:`fsl` package provides the top-level Python package namespace for
``fslpy``, and for other FSL Python libraries. It is a `native namespace
package <https://packaging.python.org/guides/packaging-namespace-packages/>`_,
which means that there is no ``fsl/__init__.py`` file.
Other libraries can use the ``fsl`` package namespace simply by also omitting a
``fsl/__init__.py`` file, and by ensuring that there are no naming conflicts
with any sub-packages of ``fslpy`` or any other projects which use the ``fsl``
package namespace.
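As a standalone sketch of how native (PEP 420) namespace packages behave, the following builds two throwaway "distributions" that both contribute to a single namespace. The names (``demo_ns``, ``dist_a``, ``dist_b``) are made up for illustration; the mechanics are the same as for the ``fsl`` namespace described above.

```python
import os
import sys
import tempfile

# Build two separate source trees which both contribute a sub-package
# to the same namespace package.
root = tempfile.mkdtemp()
for dist, sub in [('dist_a', 'data'), ('dist_b', 'extras')]:
    pkg = os.path.join(root, dist, 'demo_ns', sub)
    os.makedirs(pkg)
    # Crucially, there is no demo_ns/__init__.py in either distribution -
    # only the sub-packages have __init__.py files.
    with open(os.path.join(pkg, '__init__.py'), 'wt') as f:
        f.write(f'name = {sub!r}\n')
    sys.path.insert(0, os.path.join(root, dist))

# Both sub-packages resolve, even though they live in different trees
import demo_ns.data
import demo_ns.extras
print(demo_ns.data.name, demo_ns.extras.name)   # prints "data extras"
```

If either distribution shipped a ``demo_ns/__init__.py``, that tree would "own" the package and the other tree's sub-packages would become unimportable, which is why the omission matters.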
.. toctree::
:hidden:
self
fsl
fsl.data
fsl.scripts
fsl.transform
fsl.utils
fsl.wrappers
fsl.version
contributing
changelog
deprecation
dill
h5py
nibabel
nibabel.cifti2
nibabel.fileslice
nibabel.freesurfer
numpy
numpy.linalg
scipy
scipy.ndimage
scipy.ndimage.interpolation
six
#!/usr/bin/env python
#
# __init__.py - The fslpy library.
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
#
"""The :mod:`fsl` package is a library which contains convenience classes
and functions for use by FSL python tools. It is broadly split into the
following sub-packages:
.. autosummary::
fsl.data
fsl.utils
fsl.scripts
fsl.version
fsl.wrappers
.. note:: The ``fsl`` namespace is a ``pkgutil``-style *namespace package* -
it can be used across different projects - see
https://packaging.python.org/guides/packaging-namespace-packages/
for details.
"""
__path__ = __import__('pkgutil').extend_path(__path__, __name__)
......@@ -9,4 +9,5 @@ models, constants, and other data-like things used throughout ``fslpy``.
"""
from .utils import (guessType, # noqa
makeWriteable)
#!/usr/bin/env python
#
# bitmap.py - The Bitmap class
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
#
"""This module contains the :class:`Bitmap` class, for loading bitmap image
files. Pillow is required to use the ``Bitmap`` class.
"""
import os.path as op
import pathlib
import logging
import numpy as np
import fsl.data.image as fslimage
log = logging.getLogger(__name__)
BITMAP_EXTENSIONS = ['.bmp', '.png', '.jpg', '.jpeg',
'.tif', '.tiff', '.gif', '.rgba',
'.jp2', '.jpg2', '.jp2k']
"""File extensions we understand. """
BITMAP_DESCRIPTIONS = [
'Bitmap',
'Portable Network Graphics',
'JPEG',
'JPEG',
'TIFF',
'TIFF',
'Graphics Interchange Format',
'Raw RGBA',
'JPEG 2000',
'JPEG 2000',
'JPEG 2000']
"""A description for each :attr:`BITMAP_EXTENSION`. """
class Bitmap(object):
"""The ``Bitmap`` class can be used to load a bitmap image. The
:meth:`asImage` method will convert the bitmap into an :class:`.Image`
instance.
"""
def __init__(self, bmp):
"""Create a ``Bitmap``.
:arg bmp: File name of an image, or a ``numpy`` array containing image
data.
"""
if isinstance(bmp, (pathlib.Path, str)):
try:
# Allow big/truncated images
import PIL.Image as Image
import PIL.ImageFile as ImageFile
Image .MAX_IMAGE_PIXELS = None
ImageFile.LOAD_TRUNCATED_IMAGES = True
except ImportError:
raise RuntimeError('Install Pillow to use the Bitmap class')
src = str(bmp)
img = Image.open(src)
# If this is a palette/LUT
# image, convert it into a
# regular rgb(a) image.
if img.mode == 'P':
img = img.convert()
data = np.array(img)
elif isinstance(bmp, np.ndarray):
src = 'array'
data = np.copy(bmp)
else:
raise ValueError('unknown bitmap: {}'.format(bmp))
# Make the array (w, h, c). Single channel
# (e.g. greyscale) images are returned as
# 2D arrays, whereas multi-channel images
# are returned as 3D. In either case, the
# first two dimensions are (height, width),
# but we want them the other way around.
data = np.atleast_3d(data)
data = np.fliplr(data.transpose((1, 0, 2)))
data = np.array(data, dtype=np.uint8, order='C')
w, h = data.shape[:2]
self.__data = data
self.__dataSource = src
self.__name = op.basename(src)
def __hash__(self):
"""Returns a number which uniquely idenfities this ``Bitmap`` instance
(the result of ``id(self)``).
"""
return id(self)
def __str__(self):
"""Return a string representation of this ``Bitmap`` instance."""
return '{}({}, {})'.format(self.__class__.__name__,
self.dataSource,
self.shape)
def __repr__(self):
"""See the :meth:`__str__` method. """
return self.__str__()
@property
def name(self):
"""Returns the name of this ``Bitmap``, typically the base name of the
file.
"""
return self.__name
@property
def dataSource(self):
"""Returns the bitmap data source - typically the file name. """
return self.__dataSource
@property
def data(self):
"""Convenience method which returns the bitmap data as a ``(w, h, c)``
array, where ``c`` is either 3 or 4.
"""
return self.__data
@property
def shape(self):
"""Returns the bitmap shape - ``(width, height, nchannels)``. """
return self.__data.shape
def asImage(self):
"""Convert this ``Bitmap`` into an :class:`.Image` instance. """
width, height, nchannels = self.shape
if nchannels == 1:
dtype = np.uint8
elif nchannels == 3:
dtype = np.dtype([('R', 'uint8'),
('G', 'uint8'),
('B', 'uint8')])
elif nchannels == 4:
dtype = np.dtype([('R', 'uint8'),
('G', 'uint8'),
('B', 'uint8'),
('A', 'uint8')])
else:
raise ValueError('Cannot convert bitmap with {} '
'channels into nifti image'.format(nchannels))
if nchannels == 1:
data = self.data.reshape((width, height))
else:
data = np.zeros((width, height), dtype=dtype)
for ci, ch in enumerate(dtype.names):
data[ch] = self.data[..., ci]
data = np.asarray(data, order='F')
return fslimage.Image(data,
name=self.name,
dataSource=self.dataSource)
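The channel-splitting step in ``asImage`` can be sketched in isolation. This is a self-contained illustration (it does not use the ``Bitmap`` class itself) of packing a hypothetical 2x2 RGB array into a numpy structured dtype, as is done for RGB(A) NIFTI images:

```python
import numpy as np

# Structured dtype analogous to the 3-channel case in asImage
rgb24 = np.dtype([('R', 'uint8'), ('G', 'uint8'), ('B', 'uint8')])

# A hypothetical 2x2 bitmap, pure red
bitmap = np.zeros((2, 2, 3), dtype=np.uint8)
bitmap[..., 0] = 255

# Copy each channel into its corresponding structured field
data = np.zeros((2, 2), dtype=rgb24)
for ci, ch in enumerate(rgb24.names):
    data[ch] = bitmap[..., ci]

print(data['R'][0, 0], data['G'][0, 0])   # -> 255 0
```

The result is a 2D array whose element type carries the channels, which is how an RGB volume is represented without adding a fourth dimension.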
......@@ -30,6 +30,7 @@ specification:
NIFTI_XFORM_ALIGNED_ANAT
NIFTI_XFORM_TALAIRACH
NIFTI_XFORM_MNI_152
NIFTI_XFORM_TEMPLATE_OTHER
"""
......@@ -81,7 +82,14 @@ NIFTI_XFORM_MNI_152 = 4
"""MNI 152 normalized coordinates."""
NIFTI_XFORM_TEMPLATE_OTHER = 5
"""Coordinates aligned to some template that is not MNI152 or Talairach.
See https://www.nitrc.org/forum/message.php?msg_id=26394 for details.
"""
NIFTI_XFORM_ANALYZE = 6
"""Code which indicates that this is an ANALYZE image, not a NIFTI image. """
......@@ -98,6 +106,36 @@ NIFTI_UNITS_PPM = 40
NIFTI_UNITS_RADS = 48
# NIFTI datatype codes
NIFTI_DT_NONE = 0
NIFTI_DT_UNKNOWN = 0
NIFTI_DT_BINARY = 1
NIFTI_DT_UNSIGNED_CHAR = 2
NIFTI_DT_SIGNED_SHORT = 4
NIFTI_DT_SIGNED_INT = 8
NIFTI_DT_FLOAT = 16
NIFTI_DT_COMPLEX = 32
NIFTI_DT_DOUBLE = 64
NIFTI_DT_RGB = 128
NIFTI_DT_ALL = 255
NIFTI_DT_UINT8 = 2
NIFTI_DT_INT16 = 4
NIFTI_DT_INT32 = 8
NIFTI_DT_FLOAT32 = 16
NIFTI_DT_COMPLEX64 = 32
NIFTI_DT_FLOAT64 = 64
NIFTI_DT_RGB24 = 128
NIFTI_DT_INT8 = 256
NIFTI_DT_UINT16 = 512
NIFTI_DT_UINT32 = 768
NIFTI_DT_INT64 = 1024
NIFTI_DT_UINT64 = 1280
NIFTI_DT_FLOAT128 = 1536
NIFTI_DT_COMPLEX128 = 1792
NIFTI_DT_COMPLEX256 = 2048
NIFTI_DT_RGBA32 = 2304
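For illustration, a few of the datatype codes above map directly onto numpy dtypes. This lookup table is not part of the module; it is a sketch of how the codes correspond to concrete element types (the mappings follow the NIFTI-1 standard):

```python
import numpy as np

# Partial, illustrative mapping from NIFTI datatype codes to numpy dtypes
NIFTI_DT_TO_NUMPY = {
    2    : np.uint8,     # NIFTI_DT_UINT8
    4    : np.int16,     # NIFTI_DT_INT16
    8    : np.int32,     # NIFTI_DT_INT32
    16   : np.float32,   # NIFTI_DT_FLOAT32
    64   : np.float64,   # NIFTI_DT_FLOAT64
    256  : np.int8,      # NIFTI_DT_INT8
    512  : np.uint16,    # NIFTI_DT_UINT16
    768  : np.uint32,    # NIFTI_DT_UINT32
}

print(np.dtype(NIFTI_DT_TO_NUMPY[16]))   # -> float32
```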
# NIFTI file intent codes
NIFTI_INTENT_NONE = 0
NIFTI_INTENT_CORREL = 2
......
......@@ -29,22 +29,56 @@ import os
import os.path as op
import subprocess as sp
import re
import sys
import glob
import json
import shlex
import shutil
import logging
import binascii
import numpy as np
import nibabel as nib
import fsl.utils.tempdir as tempdir
import fsl.utils.memoize as memoize
import fsl.data.image as fslimage
import fsl.utils.tempdir as tempdir
import fsl.utils.memoize as memoize
import fsl.utils.platform as fslplatform
import fsl.data.image as fslimage
log = logging.getLogger(__name__)
MIN_DCM2NIIX_VERSION = (1, 0, 2017, 12, 15)
"""Minimum version of dcm2niix that is required for this module to work. """
"""Minimum version of ``dcm2niix`` that is required for this module to work.
"""
CRC_DCM2NIIX_VERSION = (1, 0, 2019, 9, 2)
"""For versions of ``dcm2niix`` orf this version or newer, the ``-n`` flag,
used to convert a single DICOM series, requires that a CRC checksum
identifying the series be passed (see the :func:`seriesCRC`
function). Versions prior to this require the series number to be passed.
"""
def dcm2niix() -> str:
"""Tries to find an absolute path to the ``dcm2niix`` command. Returns
``'dcm2niix'`` (unqualified) if a specific executable cannot be found.
"""
fsldir = fslplatform.platform.fsldir
candidates = [
shutil.which('dcm2niix')
]
if fsldir is not None:
candidates.insert(0, op.join(fsldir, 'bin', 'dcm2niix'))
for c in candidates:
if c is not None and op.exists(c):
return c
return 'dcm2niix'
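The lookup strategy used by ``dcm2niix()`` above (prefer an explicit candidate directory, fall back to ``PATH``, then to the unqualified name) can be sketched in a generalised, standalone form. ``findExecutable`` and its parameters are hypothetical names, not part of the module:

```python
import os.path as op
import shutil

def findExecutable(name, extraDirs=()):
    """Look for ``name`` in each of ``extraDirs``, then on $PATH,
    returning the bare name if no concrete executable is found.
    """
    candidates = [op.join(d, name) for d in extraDirs]
    candidates.append(shutil.which(name))
    for c in candidates:
        if c is not None and op.exists(c):
            return c
    return name

# A command that does not exist falls through to the bare name
print(findExecutable('no_such_cmd_12345'))   # -> no_such_cmd_12345
```

Returning the unqualified name as a last resort means the eventual ``subprocess`` call still has a chance of succeeding via the caller's environment.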
class DicomImage(fslimage.Image):
......@@ -80,12 +114,19 @@ class DicomImage(fslimage.Image):
@memoize.memoize
def installedVersion():
"""Return a tuple describing the version of ``dcm2niix`` that is installed,
or ``None`` if ``dcm2niix`` cannot be found, or its version cannot be parsed.
The returned tuple contains the following fields, all integers:
- Major version number
- Minor version number
- Year
- Month
- Day
"""
cmd = f'{dcm2niix()} -h'
versionPattern = re.compile(r'v'
r'(?P<major>[0-9]+)\.'
r'(?P<minor>[0-9]+)\.'
......@@ -102,80 +143,124 @@ def enabled():
            match = re.match(versionPattern, word)

            if match is not None:
                return (int(match.group('major')),
                        int(match.group('minor')),
                        int(match.group('year')),
                        int(match.group('month')),
                        int(match.group('day')))

    except Exception as e:
        log.debug(f'Error parsing dcm2niix version string: {e}')

    return None
def compareVersions(v1, v2):
    """Compares two ``dcm2niix`` versions ``v1`` and ``v2``. The versions are
    assumed to be in the format returned by :func:`installedVersion`.

    :returns: -  1 if ``v1`` is newer than ``v2``
              - -1 if ``v1`` is older than ``v2``
              -  0 if ``v1`` is the same as ``v2``.
    """
    for iv1, iv2 in zip(v1, v2):
        if   iv1 > iv2: return  1
        elif iv1 < iv2: return -1
    return 0


def enabled():
    """Returns ``True`` if ``dcm2niix`` is present, and recent enough,
    ``False`` otherwise.
    """
    installed = installedVersion()
    required  = MIN_DCM2NIIX_VERSION
    return ((installed is not None) and
            (compareVersions(installed, required) >= 0))
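The element-wise tuple comparison behind the version check can be exercised in isolation. This is a standalone sketch replicating the logic, not an import of the module:

```python
# Compare two version tuples element-wise: first differing field decides.
def compareVersions(v1, v2):
    for a, b in zip(v1, v2):
        if   a > b: return  1
        elif a < b: return -1
    return 0

MIN_DCM2NIIX_VERSION = (1, 0, 2017, 12, 15)

print(compareVersions((1, 0, 2019, 9, 2), MIN_DCM2NIIX_VERSION))    # -> 1
print(compareVersions(MIN_DCM2NIIX_VERSION, MIN_DCM2NIIX_VERSION))  # -> 0
```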
def scanDir(dcmdir):
"""Uses ``dcm2niix`` to scans the given DICOM directory, and returns a
list of dictionaries, one for each data series that was identified.
Each dictionary is populated with some basic metadata about the series.
"""Uses the ``dcm2niix -b o`` option to generate a BIDS sidecar JSON
file for each series in the given DICOM directory. Reads them all in,
and returns them as a sequence of dicts.
Some additional metadata is added to each dictionary:
- ``DicomDir``: The absolute path to ``dcmdir``
:arg dcmdir: Directory containing DICOM files.
:arg dcmdir: Directory containing DICOM series
:returns: A list of dictionaries, each containing metadata about
one DICOM data series.
:returns: A list of dicts, each containing the BIDS sidecar JSON
metadata for one DICOM series.
"""
if not enabled():
raise RuntimeError('dcm2niix is not available or is too old')
dcmdir = op.abspath(dcmdir)
cmd = 'dcm2niix -b o -ba n -f %s -o . {}'.format(dcmdir)
snumPattern = re.compile('^[0-9]+')
dcmdir = op.abspath(dcmdir)
cmd = f'{dcm2niix()} -b o -ba n -f %s -o . "{dcmdir}"'
series = []
with tempdir.tempdir() as td:
with open(os.devnull, 'wb') as devnull:
sp.call(cmd.split(), stdout=devnull, stderr=devnull)
sp.call(shlex.split(cmd), stdout=devnull, stderr=devnull)
files = glob.glob(op.join(td, '*.json'))
if len(files) == 0:
return []
# sort numerically by series number if possible
try:
def sortkey(f):
match = re.match(snumPattern, f)
snum = int(match.group(0))
return snum
files = sorted(files, key=sortkey)
except Exception:
files = sorted(files)
series = []
for fn in files:
with open(fn, 'rt') as f:
meta = json.load(f)
meta = json.load(f)
meta['DicomDir'] = dcmdir
# SeriesDescription is not
# guaranteed to be present
if 'SeriesDescription' not in meta:
meta['SeriesDescription'] = meta['SeriesNumber']
series.append(meta)
return series
# sort by series number
def key(s):
return s.get('SeriesNumber', sys.maxsize)
series = list(sorted(series, key=key))
return series
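The sort key pattern used above is worth a standalone sketch: by defaulting missing ``SeriesNumber`` values to ``sys.maxsize``, series lacking a number sort after everything else instead of raising ``KeyError``:

```python
import sys

# Series metadata dicts, one without a SeriesNumber
series = [{'SeriesNumber': 3}, {}, {'SeriesNumber': 1}]

def key(s):
    # Missing series numbers sort last
    return s.get('SeriesNumber', sys.maxsize)

series = sorted(series, key=key)
print([s.get('SeriesNumber') for s in series])   # -> [1, 3, None]
```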
def seriesCRC(series):
"""Calculate a checksum string of the given DICOM series.
The returned string is of the form::
SeriesCRC[.echonumber]
Where ``SeriesCRC`` is an unsigned integer which is the CRC32
checksum of the ``SeriesInstanceUID``, and ``echonumber`` is
the ``EchoNumber`` of the series - this is only present for
multi-echo data, where the series is from the second or subsequent
echoes.
:arg series: Dict containing BIDS metadata about a DICOM series,
as returned by :func:`scanDir`.
:returns: String containing a CRC32 checksum for the series.
"""
uid = series.get('SeriesInstanceUID', None)
echo = series.get('EchoNumber', None)
if uid is None:
return None
crc32 = str(binascii.crc32(uid.encode()))
if echo is not None and echo > 1:
crc32 = f'{crc32}.{echo}'
return crc32
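The checksum format described above can be demonstrated standalone with ``binascii.crc32``, using a made-up ``SeriesInstanceUID`` (the value here is illustrative, not from real data):

```python
import binascii

# Hypothetical BIDS metadata for the second echo of a multi-echo series
series = {'SeriesInstanceUID' : '1.2.840.113619.2.312.1234',
          'EchoNumber'        : 2}

uid   = series['SeriesInstanceUID']
crc32 = str(binascii.crc32(uid.encode()))

# The echo number is appended for second and subsequent echoes
echo = series.get('EchoNumber')
if echo is not None and echo > 1:
    crc32 = f'{crc32}.{echo}'

print(crc32)   # an unsigned-integer string with '.2' appended
```

``binascii.crc32`` returns an unsigned 32-bit integer on Python 3, so the first component is always a non-negative decimal string.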
def loadSeries(series):
......@@ -192,20 +277,39 @@ def loadSeries(series):
if not enabled():
raise RuntimeError('dcm2niix is not available or is too old')
    dcmdir  = series['DicomDir']
    snum    = series['SeriesNumber']
    desc    = series['SeriesDescription']
    version = installedVersion()

    # Newer versions of dcm2niix
    # require a CRC to identify
    # series
    if compareVersions(version, CRC_DCM2NIIX_VERSION) >= 0:
        ident = seriesCRC(series)

    # Older versions require
    # the series number
    else:
        ident = snum

    cmd = f'{dcm2niix()} -b n -f %s -z n -o . -n "{ident}" "{dcmdir}"'

    with tempdir.tempdir() as td:

        with open(os.devnull, 'wb') as devnull:
            sp.call(shlex.split(cmd), stdout=devnull, stderr=devnull)

        files  = glob.glob(op.join(td, f'{snum}*.nii'))
        images = [nib.load(f, mmap=False) for f in files]

        # Copy images so nibabel no longer
        # refs the files (as they will be
        # deleted), and force-load the image
        # data into memory (to avoid any
        # disk accesses due to e.g. memmap)
        images = [nib.Nifti1Image(np.asanyarray(i.dataobj), None, i.header)
                  for i in images]

    return [DicomImage(i, series, dcmdir, name=desc) for i in images]
......@@ -3,11 +3,19 @@
# dtifit.py - The DTIFitTensor class, and some related utility functions.
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
# Author: Michiel Cottaar <michiel.cottaar@ndcn.ox.ac.uk>
#
"""This module provides the :class:`.DTIFitTensor` class, which encapsulates
the diffusion tensor data generated by the FSL ``dtifit`` tool.
Conversion routines are also provided between three representations of the
diffusion tensor:

 * a ``(..., 3, 3)`` array containing the full diffusion tensor
 * a ``(..., 6)`` array containing the unique components
   (Dxx, Dxy, Dxz, Dyy, Dyz, Dzz)
 * a tuple containing the eigenvectors and eigenvalues
   (V1, V2, V3, L1, L2, L3)
Finally, the following utility functions are also defined:
.. autosummary::
:nosignatures:
......@@ -33,6 +41,123 @@ from . import image as fslimage
log = logging.getLogger(__name__)
def eigendecompositionToTensor(V1, V2, V3, L1, L2, L3):
"""
Converts the eigenvalues/eigenvectors into a 3x3 diffusion tensor
:param V1: (..., 3) shaped array with the first eigenvector
:param V2: (..., 3) shaped array with the second eigenvector
:param V3: (..., 3) shaped array with the third eigenvector
:param L1: (..., ) shaped array with the first eigenvalue
:param L2: (..., ) shaped array with the second eigenvalue
:param L3: (..., ) shaped array with the third eigenvalue
:return: (..., 3, 3) array with the diffusion tensor
"""
    check_shape = L1.shape
    for eigen_value in (L2, L3):
        if eigen_value.shape != check_shape:
            raise ValueError("Not all eigenvalues have the same shape")

    for eigen_vector in (V1, V2, V3):
        if eigen_vector.shape != check_shape + (3, ):
            raise ValueError("Not all eigenvectors have the same shape "
                             "as the eigenvalues")
return (
L1[..., None, None] * V1[..., None, :] * V1[..., :, None] +
L2[..., None, None] * V2[..., None, :] * V2[..., :, None] +
L3[..., None, None] * V3[..., None, :] * V3[..., :, None]
)
def tensorToEigendecomposition(matrices):
"""
Decomposes the 3x3 diffusion tensor into eigenvalues and eigenvectors
:param matrices: (..., 3, 3) array-like with diffusion tensor
:return: Tuple containing the eigenvectors and eigenvalues (V1, V2, V3, L1, L2, L3)
"""
matrices = np.asanyarray(matrices)
if matrices.shape[-2:] != (3, 3):
raise ValueError("Expected 3x3 diffusion tensors")
shape = matrices.shape[:-2]
nvoxels = np.prod(shape)
# Calculate the eigenvectors and
# values on all of those matrices
flat_matrices = matrices.reshape((-1, 3, 3))
vals, vecs = npla.eigh(flat_matrices)
vecShape = shape + (3, )
l1 = vals[:, 2] .reshape(shape)
l2 = vals[:, 1] .reshape(shape)
l3 = vals[:, 0] .reshape(shape)
v1 = vecs[:, :, 2].reshape(vecShape)
v2 = vecs[:, :, 1].reshape(vecShape)
v3 = vecs[:, :, 0].reshape(vecShape)
return v1, v2, v3, l1, l2, l3
def tensorToComponents(matrices):
"""
Extracts the 6 unique components from a 3x3 diffusion tensor
:param matrices: (..., 3, 3) array-like with diffusion tensors
:return: (..., 6) array with the unique components sorted like Dxx, Dxy, Dxz, Dyy, Dyz, Dzz
"""
matrices = np.asanyarray(matrices)
if matrices.shape[-2:] != (3, 3):
raise ValueError("Expected 3x3 diffusion tensors")
return np.stack([
matrices[..., 0, 0],
matrices[..., 0, 1],
matrices[..., 0, 2],
matrices[..., 1, 1],
matrices[..., 1, 2],
matrices[..., 2, 2],
], -1)
def componentsToTensor(components):
"""
Creates 3x3 diffusion tensors from the 6 unique components
:param components: (..., 6) array-like with Dxx, Dxy, Dxz, Dyy, Dyz, Dzz
:return: (..., 3, 3) array with the diffusion tensors
"""
components = np.asanyarray(components)
if components.shape[-1] != 6:
raise ValueError("Expected 6 unique components of diffusion tensor")
first = np.stack([components[..., index] for index in (0, 1, 2)], -1)
second = np.stack([components[..., index] for index in (1, 3, 4)], -1)
third = np.stack([components[..., index] for index in (2, 4, 5)], -1)
return np.stack([first, second, third], -1)
def eigendecompositionToComponents(V1, V2, V3, L1, L2, L3):
"""
Converts the eigenvalues/eigenvectors into the 6 unique components of the diffusion tensor
:param V1: (..., 3) shaped array with the first eigenvector
:param V2: (..., 3) shaped array with the second eigenvector
:param V3: (..., 3) shaped array with the third eigenvector
:param L1: (..., ) shaped array with the first eigenvalue
:param L2: (..., ) shaped array with the second eigenvalue
:param L3: (..., ) shaped array with the third eigenvalue
:return: (..., 6) array with the unique components sorted like Dxx, Dxy, Dxz, Dyy, Dyz, Dzz
"""
return tensorToComponents(eigendecompositionToTensor(V1, V2, V3, L1, L2, L3))
def componentsToEigendecomposition(components):
"""
Decomposes diffusion tensor defined by its 6 unique components
:param components: (..., 6) array-like with Dxx, Dxy, Dxz, Dyy, Dyz, Dzz
:return: Tuple containing the eigenvectors and eigenvalues (V1, V2, V3, L1, L2, L3)
"""
return tensorToEigendecomposition(componentsToTensor(components))
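The three representations round-trip cleanly. The following is a self-contained sketch (re-implementing the conversions inline rather than importing the functions above) for a single diagonal tensor:

```python
import numpy as np

# Unique components of a diagonal tensor: Dxx Dxy Dxz Dyy Dyz Dzz
comps = np.array([3e-3, 0.0, 0.0, 1e-3, 0.0, 5e-4])

# componentsToTensor: rebuild the symmetric 3x3 matrix
D = np.array([[comps[0], comps[1], comps[2]],
              [comps[1], comps[3], comps[4]],
              [comps[2], comps[4], comps[5]]])

# tensorToEigendecomposition: eigh returns eigenvalues in ascending
# order, so L1 (the largest eigenvalue) comes last
vals, vecs = np.linalg.eigh(D)
l1, l2, l3 = vals[2], vals[1], vals[0]

# eigendecompositionToTensor: D == sum_i Li * Vi Vi^T
D2 = sum(vals[i] * np.outer(vecs[:, i], vecs[:, i]) for i in range(3))

print(np.allclose(D, D2), l1)   # -> True 0.003
```

Note the reliance on ``numpy.linalg.eigh`` returning eigenvalues in ascending order (guaranteed for ``eigh``, unlike ``eig``), which is why the indices are reversed when naming L1, L2, L3.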
def getDTIFitDataPrefix(path):
"""Returns the prefix (a.k,a, base name) used for the ``dtifit`` file
names in the given directory, or ``None`` if the ``dtifit`` files could
......@@ -117,53 +242,7 @@ def decomposeTensorMatrix(data):
:returns: A tuple containing the principal eigenvectors and
eigenvalues of the tensor matrix.
"""
return componentsToEigendecomposition(data)
class DTIFitTensor(fslimage.Nifti):
......