Commit c00bfdaf authored by William Clarke

Merge branch 'master' into 'master'

Master changes for 1.1.5 into production repo

See merge request fsl/fsl_mrs!23
parents 6ffd9c76 0e22a051
This document contains the FSL-MRS release history in reverse chronological order.
1.1.5 (Wednesday 11th August 2021)
----------------------------------
- Updated example MRSI data to conform to NIfTI-MRS standard.
- Quantification will not fail if volume fractions do not sum exactly to 1.0 (to within 1E-3).
- Fixed bug in fsl_mrsi that looked for TE in the wrong header structure.
- New mrs_tools command 'conjugate' to help fix NIfTI-MRS data with the wrong phase/frequency convention.
- basis_tools remove now uses fewer HLSVD components, to stop odd broad-resonance behaviour.
- fsl_mrs_proc align can now align across all higher-dimension FIDs. Pass 'all' as the dimension tag.
- New command "fsl_mrs_proc model": HLSVD modelling of peaks in a defined region, with a settable number of components.
- Updates to basis set simulator. Non-uniform slice select gradients are now handled.
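The relaxed volume-fraction check described for quantification can be sketched as follows (a minimal sketch; `normalise_fractions` is a hypothetical name, not the FSL-MRS implementation):

```python
import numpy as np

def normalise_fractions(fractions, tol=1e-3):
    """Rescale tissue fractions to sum to exactly 1.0, provided they
    already sum to within `tol` of 1.0; otherwise raise."""
    fractions = np.asarray(fractions, dtype=float)
    total = fractions.sum()
    if abs(total - 1.0) > tol:
        raise ValueError(f'Fractions sum to {total:.4f}, outside tolerance {tol}.')
    return fractions / total

# GM/WM/CSF fractions summing to 0.9995 pass the 1E-3 tolerance and are rescaled.
fracs = normalise_fractions([0.599, 0.3505, 0.05])
```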
1.1.4 (Tuesday 3rd August 2021)
-------------------------------
- Fixed bug in calculation of molality concentration. Tissue mole fractions had been swapped for tissue volume fractions. Molar concentrations unaffected.
......
%% Cell type:markdown id: tags:
# Example of MRSI fitting on the command line
This notebook demos the process of fitting an MRSI scan using the command line scripts included in FSL-MRS.
### Contents:
- [1. Reconstruction, Processing and Data Conversion](#1.-Reconstruction,-processing-and-data-conversion)
- [2. Segmentation](#2.-Tissue-segmentation)
- [3. Fitting](#3.-Fitting)
- [4. Visualisation of fit](#4.-Visualisation-of-fit)
Will Clarke
June 2020
University of Oxford
%% Cell type:markdown id: tags:
## 1. Reconstruction, processing and data conversion
MRSI reconstruction (from k-space data) can be highly specialised, depending on the sequence. For example, reconstruction of non-Cartesian trajectories (e.g. spirals or concentric rings) requires regridding or use of the NUFFT. Due to this specialised nature, FSL-MRS does not contain tools for the reconstruction step.
Similarly, post-processing of MRSI data is commonly done as part of the reconstruction process. Although many of the processing tools in FSL-MRS can be run on MRSI data, they were not created with MRSI in mind.
This example therefore assumes that you are able to provide reconstructed and processed data ready for fitting. Data can be converted to NIfTI from the Siemens DICOM format using spec2nii. The authors of FSL-MRS and spec2nii are happy to add supported formats if example data and an interpretation are provided.
%% Cell type:markdown id: tags:
## 2. Tissue segmentation
Tissue segmentation is required for meaningful scaling of water-referenced metabolite concentrations. mrsi_segment will produce three files, corresponding to the GM, WM and CSF FAST tissue segmentations registered to the MRSI volume.
Run tissue segmentation on the packaged T1 data and mask using the MRSI data file. Here we provide a (partial) .anat file produced by [fsl_anat](https://fsl.fmrib.ox.ac.uk/fsl/fslwiki/fsl_anat) to speed up execution.
This step requires an installation of FSL compatible with fslpy.
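A quick preflight for the FSL requirement above might look like this (an illustrative check only; `fsl_available` is a hypothetical helper, not part of FSL-MRS):

```python
import os
import shutil

def fsl_available(env=os.environ):
    """Return True if an FSL installation looks usable: FSLDIR is set
    and the FAST segmentation tool (used via fslpy) is on PATH."""
    return 'FSLDIR' in env and shutil.which('fast') is not None
```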
%% Cell type:code id: tags:
``` python
#%%capture
%sx mkdir -p MRSI
%sx mrsi_segment -o MRSI -a example_data/example_mrsi/T1.anat example_data/example_mrsi/mrsi.nii.gz
```
%% Cell type:markdown id: tags:
## 3. Fitting
The below is a call to the main MRSI wrapper script. Note the use of the `--add_MM` flag, as this metabolite basis does not contain a macromolecule basis. Also note that the MRSI dataset provided is accompanied by a JSON sidecar file containing useful information such as the echo time, the central frequency, and the dwell time.
The script will by default run in parallel on the available CPU cores. Depending on hardware this should take a few minutes.
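The per-voxel work the wrapper distributes can be sketched generically (illustrative only, not FSL-MRS internals — `fit_masked` and `fit_one` are hypothetical names):

```python
import numpy as np

def fit_masked(data, mask, fit_one):
    """Fit only voxels inside the mask, writing one scalar result per voxel.
    In the real wrapper each call would be a full spectral fit and the loop
    would be distributed over CPU cores."""
    out = np.zeros(mask.shape)
    for idx in np.ndindex(*mask.shape):
        if mask[idx]:
            out[idx] = fit_one(data[idx])
    return out

rng = np.random.default_rng(0)
data = rng.standard_normal((2, 2, 2, 16))      # x, y, z, FID points
mask = np.zeros((2, 2, 2), dtype=bool)
mask[0, 0, 0] = True                           # fit a single voxel
amp = fit_masked(data, mask, fit_one=lambda fid: float(np.abs(fid).max()))
```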
%% Cell type:code id: tags:
``` python
%sx fsl_mrsi --data example_data/example_mrsi/mrsi.nii.gz \
             --basis example_data/example_mrsi/3T_slaser_32vespa_1250.BASIS \
             --output MRSI/example_mrsi_fit \
             --mask example_data/example_mrsi/mask.nii.gz \
             --h2o example_data/example_mrsi/wref.nii.gz \
             --tissue_frac MRSI/mrsi_seg_wm.nii.gz MRSI/mrsi_seg_gm.nii.gz MRSI/mrsi_seg_csf.nii.gz \
             --add_MM \
             --baseline_order 2 \
             --combine PCho GPC --combine Cr PCr --combine NAA NAAG --combine Glu Gln --combine Glc Tau \
             --ignore Gly Gly_1 HG \
             --overwrite
```
%% Cell type:markdown id: tags:
## 4. Visualisation of fit
Now take a look at the outputs. A PNG of the fit to the average of all voxels is provided for a quick sanity check. The folders contain the following:
- concs : concentration for each metabolite or combined metabolites (subfolders contain different types of referencing)
- fit : the model prediction FID, the residual FID, and the baseline (also in the time domain).
- qc : QC parameters split per metabolite
- uncertainties : the std of the fit per metabolite
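The concs and uncertainties outputs can be combined for a percent-SD style QC screen. A rough numpy illustration (the arrays stand in for maps you would load from the NIfTI files, e.g. with nibabel; the 20 % threshold is an arbitrary example):

```python
import numpy as np

conc = np.array([[10.0, 8.0], [0.0, 12.0]])   # stand-in for a concentration map
std = np.array([[0.5, 0.8], [0.0, 6.0]])      # stand-in for the matching std map

# Flag voxels where the fit uncertainty exceeds 20 % of the concentration;
# voxels with zero concentration are flagged unconditionally.
with np.errstate(divide='ignore', invalid='ignore'):
    pct_sd = np.where(conc > 0, 100 * std / conc, np.inf)
unreliable = pct_sd > 20
```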
%% Cell type:code id: tags:
``` python
%sx ls -l MRSI/example_mrsi_fit
```
%% Cell type:markdown id: tags:
Now run the command below in a terminal to load the data in FSLeyes. You will need to install the NIfTI-MRS plugin for FSLeyes; installation instructions are available [online](https://open.win.ox.ac.uk/pages/wclarke/fsleyes-plugin-mrs/install.html).
Then follow the instructions [here](https://open.win.ox.ac.uk/pages/wclarke/fsleyes-plugin-mrs/mrsi_results.html) to set it up in such a way that you can explore each voxel's individual fit.
To get started, after installing the plugin, run the following command:
```
fsleyes -smrs example_data/example_mrsi/T1.anat/T1.nii.gz example_data/example_mrsi/mrsi.nii.gz
```
%% Cell type:code id: tags:
``` python
%sx fsleyes -smrs example_data/example_mrsi/T1.anat/T1.nii.gz example_data/example_mrsi/mrsi.nii.gz
```
......
This source diff could not be displayed because it is stored in LFS. You can view the blob instead.
......@@ -428,6 +428,21 @@ class NIFTI_MRS(Image):
else:
raise TypeError('dim should be int or a string matching one of the dim tags.')
def iterate_over_spatial(self):
"""Iterate over spatial voxels, yielding a data array the shape of the FID plus any higher dimensions, and the index to that data.
:yield: Complex FID data with any higher dimensions. Index to data.
:rtype: tuple
"""
data = self.data
def calc_slice_idx(idx):
slice_obj = list(idx[:3]) + [slice(None), ] * (self.data.ndim - 3)
return tuple(slice_obj)
for idx in np.ndindex(data.shape[:3]):
yield self.data[idx], calc_slice_idx(idx)
def generate_mrs(self, dim=None, basis_file=None, basis=None, ref_data=None):
"""Generator for MRS or MRSI objects from the data, optionally returning a whole dimension as a list.
......
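The spatial generator added above pairs each voxel's FID with a slice tuple that addresses that voxel across all higher dimensions; the same pattern can be sketched with a plain numpy array:

```python
import numpy as np

data = np.zeros((2, 2, 1, 8, 3), dtype=complex)  # x, y, z, FID points, extra dim

def iterate_over_spatial(data):
    # Yield each voxel's FID (plus higher dims) together with an index
    # tuple that can write results back into a same-shaped array.
    for idx in np.ndindex(data.shape[:3]):
        slice_idx = idx + (slice(None),) * (data.ndim - 3)
        yield data[idx], slice_idx

fid, slice_idx = next(iterate_over_spatial(data))
```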
Subproject commit 22c8af790cd10e7c87b21bfbebded97d6c534348
Subproject commit 58b4ac88902abf935f096a225354c9d4eb5f6bbc
......@@ -198,6 +198,7 @@ def convert(args):
def add(args):
from fsl_mrs.utils.mrs_io import read_basis
from fsl_mrs.utils import basis_tools
import json
import numpy as np
......@@ -217,13 +218,23 @@ def add(args):
bw = 1 / json_dict['basis_dwell']
width = json_dict['basis_width']
basis_tools.add_basis(
fid, name, cf, bw, args.target,
# Check that target exists
target = Path(args.target)
if not target.is_dir():
raise NotADirectoryError('Target must be a directory of FSL-MRS basis (json) files')
# Load target
target_basis = read_basis(target)
target_basis = basis_tools.add_basis(
fid, name, cf, bw, target_basis,
scale=args.scale,
width=width,
conj=args.conj,
pad=args.pad,
sim_info=args.info)
pad=args.pad)
# Write to json without overwriting existing files
target_basis.save(target, info_str=args.info)
def shift(args):
......
......@@ -85,6 +85,7 @@ def main():
help='List of files to align')
align_group.add_argument('--dim', type=str, default='DIM_DYN',
help='NIFTI-MRS dimension tag to align across.'
'Or "all" to align over all spectra in higher dimensions.'
'Default = DIM_DYN')
align_group.add_argument('--ppm', type=float, nargs=2,
metavar='<lower-limit upper-limit>',
......@@ -143,6 +144,22 @@ def main():
hlsvdparser.set_defaults(func=remove)
add_common_args(hlsvdparser)
# model subcommand - model peaks using HLSVD
modelparser = sp.add_parser('model', add_help=False,
help='Model peaks with HLSVD.')
model_group = modelparser.add_argument_group('HLSVD modelling arguments')
model_group.add_argument('--file', type=str, required=True,
help='Data file(s)')
model_group.add_argument('--ppm', type=float, nargs=2,
metavar='<lower-limit upper-limit>',
default=[4.5, 4.8],
help='ppm limits of removal window')
model_group.add_argument('--components', type=int,
default=5,
help='Number of components to model.')
modelparser.set_defaults(func=model)
add_common_args(modelparser)
# tshift subcommand - shift/resample in timedomain
tshiftparser = sp.add_parser('tshift', add_help=False,
help='shift/resample in timedomain.')
......@@ -473,7 +490,9 @@ def average(dataobj, args):
def align(dataobj, args):
if args['dim'] not in dataobj.data.dim_tags:
if args['dim'].lower() == 'all':
pass
elif args['dim'] not in dataobj.data.dim_tags:
raise InappropriateDataError(f'Data ({dataobj.datafilename}) has no {args["dim"]} dimension.'
f' Dimensions are {dataobj.data.dim_tags}.')
......@@ -521,6 +540,16 @@ def remove(dataobj, args):
return datacontainer(corrected, dataobj.datafilename)
def model(dataobj, args):
modelled = preproc.hlsvd_model_peaks(dataobj.data,
limits=args['ppm'],
components=args['components'],
report=args['generateReports'],
report_all=args['allreports'])
return datacontainer(modelled, dataobj.datafilename)
def tshift(dataobj, args):
shifted = preproc.tshift(dataobj.data,
tshiftStart=args['tshiftStart'],
......
......@@ -234,8 +234,8 @@ def main():
# Echo time
if args.TE is not None:
echotime = args.TE * 1E-3
elif 'TE' in mrsi.header:
echotime = mrsi.header['TE']
elif 'EchoTime' in mrsi_data.hdr_ext:
echotime = mrsi_data.hdr_ext['EchoTime']
else:
echotime = None
# Repetition time
......
......@@ -90,7 +90,7 @@ def main():
# Reorder tool
reorderparser = sp.add_parser(
'reorder',
help='Reorider higher dimensions of NIfTI-MRS.')
help='Reorder higher dimensions of NIfTI-MRS.')
reorderparser.add_argument('--file', type=Path, required=True,
help='File to reorder')
reorderparser.add_argument('--dim_order', type=str, nargs='+', required=True,
......@@ -104,6 +104,19 @@ def main():
help='Override output file names.')
reorderparser.set_defaults(func=reorder)
# conjugate tool
conjparser = sp.add_parser(
'conjugate',
help='Conjugate data to correct phase/frequency convention in a NIfTI-MRS file.')
conjparser.add_argument('--file', type=Path, required=True,
help='File to conjugate')
conjparser.add_argument('--output',
required=True, type=Path, default=Path('.'),
help='output folder (defaults to current directory)')
conjparser.add_argument('--filename', type=str,
help='Override output file names.')
conjparser.set_defaults(func=conj)
# Parse command-line arguments
args = p.parse_args()
......@@ -297,5 +310,28 @@ def reorder(args):
reordered.save(file_out)
def conj(args):
"""Conjugate the data in a nifti-mrs file
:param args: Argparse interpreted arguments
:type args: Namespace
"""
from fsl_mrs.utils import nifti_mrs_tools as nmrs_tools
from fsl_mrs.utils import mrs_io
# 1. Load the file
infile = mrs_io.read_FID(str(args.file))
name = args.file.with_suffix('').with_suffix('').name
# 2. conjugate the file
outfile = nmrs_tools.conjugate(infile)
# 3. Save the output file
if args.filename:
file_out = args.output / args.filename
else:
file_out = args.output / name
outfile.save(file_out)
if __name__ == '__main__':
main()
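The conjugate tool added above relies on the fact that conjugating a FID reflects its spectrum about zero frequency, which is what fixes data stored with the opposite phase/frequency convention. A minimal numpy demonstration of that reflection (illustrative parameters, not FSL-MRS code):

```python
import numpy as np

# A complex FID oscillating at +f Hz; after conjugation its spectral peak
# appears at -f Hz, i.e. the frequency axis is mirrored.
n, dwell, f = 64, 1e-3, 100.0
t = np.arange(n) * dwell
fid = np.exp(2j * np.pi * f * t)

freq = np.fft.fftfreq(n, dwell)
peak = freq[np.argmax(np.abs(np.fft.fft(fid)))]
peak_conj = freq[np.argmax(np.abs(np.fft.fft(np.conj(fid))))]
```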
......@@ -91,6 +91,18 @@ def test_nifti_mrs_generator():
break
def test_nifti_mrs_spatial_generator():
obj = NIFTI_MRS(data['unprocessed'])
for gen_data, slice_idx in obj.iterate_over_spatial():
assert gen_data.shape == (4096, 32, 64)
assert slice_idx == (0, 0, 0,
slice(None, None, None),
slice(None, None, None),
slice(None, None, None))
break
def test_nifti_mrs_gen_mrs():
obj = NIFTI_MRS(data['unprocessed'])
......
......@@ -157,3 +157,19 @@ def test_reorder(tmp_path):
'--file', str(test_data_split)])
assert (tmp_path / 'reordered_file.nii.gz').exists()
# Test conjugate option
def test_conjugate(tmp_path):
subprocess.check_call(['mrs_tools', 'conjugate',
'--output', str(tmp_path),
'--filename', 'conj_file',
'--file', str(svs)])
assert (tmp_path / 'conj_file.nii.gz').exists()
subprocess.check_call(['mrs_tools', 'conjugate',
'--output', str(tmp_path),
'--file', str(svs)])
assert (tmp_path / 'metab.nii.gz').exists()
......@@ -195,11 +195,42 @@ def mrsi_data_diff(tmp_path):
return testfile, nmrs
@pytest.fixture
def svs_data_uncomb_reps(tmp_path):
coils = 2
noiseconv = 0.1 * np.eye(coils)
coilamps = np.random.randn(coils)
coilphs = np.random.random(coils) * 2 * np.pi
FID, hdr = syntheticFID(noisecovariance=noiseconv,
coilamps=coilamps,
coilphase=coilphs,
points=512)
FID2, hdr = syntheticFID(noisecovariance=noiseconv,
coilamps=coilamps,
coilphase=coilphs,
points=512)
FID = np.stack((np.asarray(FID).T, np.asarray(FID2).T), axis=2)
FID = FID.reshape((1, 1, 1) + FID.shape)
nmrs = gen_new_nifti_mrs(FID,
hdr['dwelltime'],
hdr['centralFrequency'],
dim_tags=['DIM_COIL', 'DIM_DYN', None])
testname = 'svsdata_uncomb_reps.nii'
testfile = op.join(tmp_path, testname)
nmrs.save(testfile)
return testfile, nmrs
def splitdata(svs, mrsi):
return svs[0], mrsi[0], svs[1], mrsi[1]
def test_filecreation(svs_data, mrsi_data, svs_data_uncomb, mrsi_data_uncomb):
def test_filecreation(svs_data, mrsi_data, svs_data_uncomb, mrsi_data_uncomb, svs_data_uncomb_reps):
svsfile, mrsifile, svsdata, mrsidata = splitdata(svs_data, mrsi_data)
data = read_FID(svsfile)
......@@ -221,6 +252,11 @@ def test_filecreation(svs_data, mrsi_data, svs_data_uncomb, mrsi_data_uncomb):
assert data.shape == (2, 2, 2, 512, 4)
assert np.allclose(data.data, mrsidata.data)
# Uncombined and reps
data = read_FID(svs_data_uncomb_reps[0])
assert data.shape == (1, 1, 1, 512, 2, 2)
assert np.allclose(data.data, svs_data_uncomb_reps[1].data)
def test_coilcombine(svs_data_uncomb, mrsi_data_uncomb, tmp_path):
svsfile, mrsifile, svsdata, mrsidata = splitdata(svs_data_uncomb,
......@@ -313,7 +349,6 @@ def test_align(svs_data, mrsi_data, tmp_path):
assert np.allclose(data.data, directRun.data)
# Run coil combination on both sets of data using the command line
subprocess.check_call(['fsl_mrs_proc',
'align',
'--dim', 'DIM_DYN',
......@@ -331,6 +366,27 @@ def test_align(svs_data, mrsi_data, tmp_path):
assert np.allclose(data.data, directRun.data, atol=1E-1, rtol=1E-1)
def test_align_all(svs_data_uncomb_reps, tmp_path):
svsfile, svsdata = svs_data_uncomb_reps[0], svs_data_uncomb_reps[1]
# Run align on both sets of data using the command line
subprocess.check_call(['fsl_mrs_proc',
'align',
'--dim', 'all',
'--file', svsfile,
'--ppm', '-10', '10',
'--output', tmp_path,
'--filename', 'tmp'])
# Load result for comparison
data = read_FID(op.join(tmp_path, 'tmp.nii.gz'))
# Run directly
directRun = preproc.align(svsdata, 'all', ppmlim=(-10, 10))
assert np.allclose(data.data, directRun.data)
def test_ecc(svs_data, mrsi_data, tmp_path):
svsfile, mrsifile, svsdata, mrsidata = splitdata(svs_data, mrsi_data)
......@@ -403,6 +459,44 @@ def test_remove(svs_data, mrsi_data, tmp_path):
assert np.allclose(data.data, directRun.data)
def test_model(svs_data, mrsi_data, tmp_path):
svsfile, mrsifile, svsdata, mrsidata = splitdata(svs_data, mrsi_data)
# Run model on both sets of data using the command line
subprocess.check_call(['fsl_mrs_proc',
'model',
'--file', svsfile,
'--ppm', '-10', '10',
'--components', '5',
'--output', tmp_path,
'--filename', 'tmp'])
# Load result for comparison
data = read_FID(op.join(tmp_path, 'tmp.nii.gz'))
# Run directly
directRun = preproc.hlsvd_model_peaks(svsdata, (-10, 10), components=5)
assert np.allclose(data.data, directRun.data)
# Run model on both sets of data using the command line
subprocess.check_call(['fsl_mrs_proc',
'model',
'--file', mrsifile,
'--ppm', '-10', '10',
'--components', '5',
'--output', tmp_path,
'--filename', 'tmp'])
# Load result for comparison
data = read_FID(op.join(tmp_path, 'tmp.nii.gz'))
# Run directly
directRun = preproc.hlsvd_model_peaks(mrsidata, (-10, 10), components=5)
assert np.allclose(data.data, directRun.data)
def test_align_diff(svs_data_diff, mrsi_data_diff, tmp_path):
svsfile, svsdata = svs_data_diff[0], svs_data_diff[1]
......
......@@ -6,7 +6,6 @@ Test basis tools.
Copyright Will Clarke, University of Oxford, 2021
'''
from pathlib import Path
from shutil import copytree
import pytest
import numpy as np
......@@ -34,9 +33,8 @@ def test_convert_lcmodel(tmp_path):
assert np.isclose(basis.cf, new_basis.cf)
def test_add_basis(tmp_path):
out_loc = tmp_path / 'test_basis'
copytree(fsl_basis_path, out_loc)
def test_add_basis():
basis = mrs_io.read_basis(fsl_basis_path)
mac_in = fsl_io.readJSON(extra_basis)
fid = np.asarray(mac_in['basis_re']) + 1j * np.asarray(mac_in['basis_im'])
......@@ -44,31 +42,27 @@ def test_add_basis(tmp_path):
bw = 1 / mac_in['basis_dwell']
with pytest.raises(basis_tools.IncompatibleBasisError) as exc_info:
basis_tools.add_basis(fid, 'mac2', cf, bw, out_loc)
basis_tools.add_basis(fid, 'mac2', cf, bw, basis)
assert exc_info.type is basis_tools.IncompatibleBasisError
assert exc_info.value.args[0] == "The new basis FID covers too little time, try padding."
basis_tools.add_basis(fid, 'mac1', cf, bw, out_loc, pad=True)
new_basis = mrs_io.read_basis(out_loc)
new_basis = basis_tools.add_basis(fid, 'mac1', cf, bw, basis, pad=True)
index = new_basis.names.index('mac1')
assert 'mac1' in new_basis.names
fid_pad = np.pad(fid, (0, fid.size))
basis_tools.add_basis(fid_pad, 'mac2', cf, bw, out_loc)
new_basis = mrs_io.read_basis(out_loc)
new_basis = basis_tools.add_basis(fid_pad, 'mac2', cf, bw, basis)
index = new_basis.names.index('mac2')
assert 'mac2' in new_basis.names
assert np.allclose(new_basis._raw_fids[:, index], fid_pad[0::2])
basis_tools.add_basis(fid_pad, 'mac3', cf, bw, out_loc, scale=True, width=10)
new_basis = mrs_io.read_basis(out_loc)
new_basis = basis_tools.add_basis(fid_pad, 'mac3', cf, bw, basis, scale=True, width=10)
index = new_basis.names.index('mac3')
assert 'mac3' in new_basis.names
assert new_basis.basis_fwhm[index] == 10
basis_tools.add_basis(fid_pad, 'mac4', cf, bw, out_loc, width=10, conj=True)
new_basis = mrs_io.read_basis(out_loc)
new_basis = basis_tools.add_basis(fid_pad, 'mac4', cf, bw, basis, width=10, conj=True)
index = new_basis.names.index('mac4')
assert 'mac4' in new_basis.names
assert np.allclose(new_basis._raw_fids[:, index], fid_pad[0::2].conj())
......
......@@ -75,6 +75,14 @@ def test_align():
== 'fsl_mrs.utils.preproc.nifti_mrs_proc.align, dim=DIM_DYN, '\
'target=None, ppmlim=(1.0, 4.0), niter=1, apodize=5.'
# Align across all spectra
aligned3 = nproc.align(with_coils, 'all', ppmlim=(1.0, 4.0), niter=1, apodize=5)
assert aligned3.hdr_ext['ProcessingApplied'][0]['Method'] == 'Frequency and phase correction'
assert aligned3.hdr_ext['ProcessingApplied'][0]['Details']\
== 'fsl_mrs.utils.preproc.nifti_mrs_proc.align, dim=all, '\
'target=None, ppmlim=(1.0, 4.0), niter=1, apodize=5.'
def test_aligndiff():
# For want of data this is a bizarre way of using this function.
......@@ -110,6 +118,18 @@ def test_remove():
== 'fsl_mrs.utils.preproc.nifti_mrs_proc.remove_peaks, limits=(4, 5.3), limit_units=ppm+shift.'
def test_hlsvd_model():
nmrs_obj = read_FID(wrefc)
nmrs_obj = nproc.average(nmrs_obj, 'DIM_DYN')
modeled = nproc.hlsvd_model_peaks(nmrs_obj, (4, 5.30), components=3)
assert modeled.hdr_ext['ProcessingApplied'][1]['Method'] == 'HLSVD modeling'
assert modeled.hdr_ext['ProcessingApplied'][1]['Details']\
== 'fsl_mrs.utils.preproc.nifti_mrs_proc.hlsvd_model_peaks,'\
' limits=(4, 5.3), limit_units=ppm+shift, components=3.'
def test_tshift():
nmrs_obj = read_FID(wrefc)
nmrs_obj = nproc.average(nmrs_obj, 'DIM_DYN')
......
"""Test the miscellaneous functions for NIFTI-MRS tools

Author: Will Clarke <william.clarke@ndcn.ox.ac.uk>
Copyright (C) 2021 University of Oxford
"""
from pathlib import Path

import numpy as np