Commit e81c542c authored by Paul McCarthy

Merge branch 'generify_matlab_vs_python' into 'master'

Generify matlab vs python

See merge request !5
parents a16c8d98 eb2dab26
......@@ -36,6 +36,7 @@ A page should open in your web browser - to access the practicals, navigate
into one of the `getting_started`, `advanced_programming`, or `applications`
directories, and click on the `.ipynb` file you are interested in.
Some practical sub-directories will also contain a `README.md` file with additional information specific to the practical. Make sure you read this as well **before running the notebooks**.
Have fun!
......
# Matlab vs. Python
This directory contains 5 demonstrations, each of which uses python and Matlab to solve the same problem:
1. `bloch` - a simple Bloch simulator that numerically integrates the Bloch equation for magnetisation dynamics
2. `fitting` - an example of using the Scipy package for fitting a biophysical model to noisy data
3. `migp` - an example of MIGP dimension reduction prior to group ICA on fMRI data
4. `partial_fourier` - a projection-onto-convex-sets Partial Fourier image reconstruction of k-space data that is partially sampled
5. `rbf` - simple example of fitting/interpolating noisy simulated data using radial basis functions
Each demonstration contains both `Matlab` and `python` code examples so you can compare the implementations.
## Required software
### FSL
The python code is contained in jupyter notebooks which can be run in the `fslpython` environment (requires `FSL`) using:
```
fslpython -m notebook
```
### Matlab
A `matlab` installation is required to run the Matlab examples.
\ No newline at end of file
......@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Imports"
"# Imports"
]
},
{
......@@ -216,7 +216,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
"version": "3.7.6"
}
},
"nbformat": 4,
......
%% Cell type:markdown id: tags:
Imports
# Imports
%% Cell type:code id: tags:
``` python
import numpy as np
from scipy.integrate import ode, solve_ivp
import matplotlib.pyplot as plt
```
%% Cell type:markdown id: tags:
# Define the Bloch equation
$$\frac{d\vec{M}}{dt} = \vec{M}\times \vec{B} - \frac{M_x\hat{x} + M_y\hat{y}}{T_2} - \frac{(M_z - M_0)\hat{z}}{T_1}$$
In this section:
- define a function
- numpy functions like np.cross
%% Cell type:code id: tags:
``` python
# define bloch equation
def bloch(t, M, T1, T2):
# get effective B-field at time t
B = B_eff(t)
# cross product of M and B, add T1 and T2 relaxation terms
return np.array([M[1]*B[2] - M[2]*B[1] - M[0]/T2,
M[2]*B[0] - M[0]*B[2] - M[1]/T2,
M[0]*B[1] - M[1]*B[0] - (M[2]-1)/T1])
# alternatively
#return np.cross(M,B) - np.array([M[0]/T2, M[1]/T2, (M[2]-1)/T1])
```
%% Cell type:markdown id: tags:
# Define the pulse sequence
We work in the rotating frame, so we only need the amplitude envelope of the RF pulse.
Typically, B1 excitation fields point in the x- and/or y-directions, and gradient fields point in the z-direction.
In this simple example, a sinc excitation pulse is applied for 1 ms along the x-axis, and then a gradient is turned on for 1.5 ms after that.
In this section:
- constants such as np.pi
- functions like np.sinc
%% Cell type:code id: tags:
``` python
# define effective B-field
def B_eff(t):
# Do nothing for 0.25 ms
if t < 0.25:
return np.array([0, 0, 0])
# Sinc RF along x-axis and slice-select gradient on for 1.00 ms
elif t < 1.25:
return np.array([np.pi*np.sinc((t-0.75)*4), 0, np.pi])
# Do nothing for 0.25 ms
elif t < 1.50:
return np.array([0, 0, 0])
# Slice refocusing gradient on for 1.50 ms
# Half the area of the slice-select gradient lobe
elif t < 3.00:
return np.array([0, 0, -(1/3)*np.pi])
# Pulse sequence finished
else:
return np.array([0, 0, 0])
```
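As a quick arithmetic check of the "half the area" comment in the code above (this check is an addition, not part of the original demo): the slice-select lobe has area $\pi \times 1.0\,\mathrm{ms} = \pi\,\mathrm{ms}$, whereas the refocusing lobe has area $\tfrac{1}{3}\pi \times 1.5\,\mathrm{ms} = \tfrac{\pi}{2}\,\mathrm{ms}$, i.e. half the area.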
%% Cell type:markdown id: tags:
# Plot the pulse sequence
In this section:
- unpacking return values
- unwanted return values
- list comprehension
- basic plotting
%% Cell type:code id: tags:
``` python
# Create 2 vertical subplots
_, ax = plt.subplots(2, 1, figsize=(12,12))
# Get pulse sequence B-fields from 0 - 5 ms
pulseq = [B_eff(t) for t in np.linspace(0,5,1000)]
pulseq = np.array(pulseq)
# Plot RF
ax[0].plot(pulseq[:,0])
ax[0].set_ylabel('B1')
# Plot gradient
ax[1].plot(pulseq[:,2])
ax[1].set_ylabel('Gradient')
```
%% Cell type:markdown id: tags:
# Integrate ODE
By default this uses a Runge-Kutta variant called the "Dormand-Prince method" (`RK45`). Other ODE integration methods are available via the `method` argument of `solve_ivp`.
In this section:
- ode solvers
- lambdas (anonymous functions)
%% Cell type:code id: tags:
``` python
# Set the initial conditions
# equilibrium magnetization (M) = (0, 0, 1)
M = [0, 0, 1]
# Set time interval for integration
t = [0, 5]
# Set max step size
dt = 0.005
# Set T1 and T2
T1, T2 = 1500, 50
# Integrate ODE
# In Scipy 1.2.0, the first argument to solve_ivp must be a function that takes exactly 2 arguments
sol = solve_ivp(lambda t, M : bloch(t, M, T1, T2), t, M, max_step=dt)
# Grab output
t = sol.t
M = sol.y
```
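As an illustrative aside (not part of the original demo), the same integration can be repeated with a different solver by passing the `method` argument to `solve_ivp`:
``` python
# Illustrative sketch (an addition, not the original demo code):
# repeat the integration with the stiff-capable LSODA solver.
sol_lsoda = solve_ivp(lambda t, M: bloch(t, M, T1, T2),
                      [0, 5], [0, 0, 1],
                      method='LSODA', max_step=dt)
print(sol_lsoda.y.shape)  # (3, number of accepted time points)

# In newer SciPy versions (>= 1.4), the lambda can be replaced by passing
# the extra arguments directly:
# sol = solve_ivp(bloch, [0, 5], [0, 0, 1], args=(T1, T2), max_step=dt)
```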
%% Cell type:markdown id: tags:
# Plot Results
In this section:
- more plotting
%% Cell type:code id: tags:
``` python
# Create single axis
_, ax = plt.subplots(figsize=(12,12))
# Plot x, y and z components of Magnetisation
ax.plot(t, M[0,:], label='Mx')
ax.plot(t, M[1,:], label='My')
ax.plot(t, M[2,:], label='Mz')
# Add legend and grid
ax.legend()
ax.grid()
```
......
......@@ -6,6 +6,9 @@
"source": [
"# MIGP\n",
"\n",
"> **Before you run this notebook**: \n",
"> 1. Make sure you have the necessary software dependencies (see the [readme](README.md))\n",
"\n",
"For group ICA, `melodic` uses multi-session temporal concatenation. This will perform a single 2D ICA run on the concatenated data matrix (obtained by stacking all 2D data matrices of every single data set on top of each other).\n",
"\n",
"![temporal concatenation](concat_diag.png)\n",
......@@ -24,7 +27,7 @@
"\n",
"## This notebook\n",
"\n",
"This notebook will download an open fMRI dataset (~50MB) for use in the MIGP demo, regresses confounds from the data, performs spatial smoothing with 10mm FWHM, and then runs group `melodic` with `MIGP`.\n",
"This notebook will download an open fMRI dataset (~50MB) for use in the MIGP demo, regress confounds from the data, perform spatial smoothing with 10mm FWHM, and then run group `melodic` with `MIGP`.\n",
"\n",
"* [Fetch the data](#download-the-data)\n",
"* [Clean the data](#clean-the-data)\n",
......@@ -62,9 +65,7 @@
"\n",
"It comprises 10 preprocessed resting-state fMRI selected from 72 patients diagnosed with schizophrenia and 74 healthy controls (6mm isotropic, TR=2s, 150 volumes).\n",
"\n",
"Create a directory in the users home directory to store the downloaded data:\n",
"\n",
"> **NOTE:** [`expanduser`](https://docs.python.org/3.7/library/os.path.html#os.path.expanduser) will expand the `~` to the be users home directory:"
"Create a directory in the current directory to store the downloaded data:"
]
},
{
......@@ -73,7 +74,7 @@
"metadata": {},
"outputs": [],
"source": [
"data_dir = op.expanduser('~/nilearn_data')\n",
"data_dir = './nilearn_data'\n",
"\n",
"if not op.exists(data_dir):\n",
" os.makedirs(data_dir)"
......@@ -85,7 +86,9 @@
"source": [
"Download the data (if not already downloaded):\n",
"\n",
"> **Note:** We use a method from [`nilearn`](https://nilearn.github.io/index.html) called [`fetch_cobre`](https://nilearn.github.io/modules/generated/nilearn.datasets.fetch_cobre.html) to download the fMRI data"
"> **Note:** \n",
"> 1. We use a method from [`nilearn`](https://nilearn.github.io/index.html) called [`fetch_cobre`](https://nilearn.github.io/modules/generated/nilearn.datasets.fetch_cobre.html) to download the fMRI data.\n",
"> 2. `fetch_cobre` has been deprecated in recent versions of `nilearn`. It will soon be replaced here in there near future. In the interim, the deprecation warning has been suppressed with `filterwarnings`."
]
},
{
......@@ -94,6 +97,11 @@
"metadata": {},
"outputs": [],
"source": [
"# suppress deprecation warning\n",
"import warnings\n",
"warnings.filterwarnings(\"ignore\")\n",
"\n",
"# download dataset\n",
"d = datasets.fetch_cobre(data_dir=data_dir) "
]
},
......
%% Cell type:markdown id: tags:
# MIGP
> **Before you run this notebook**:
> 1. Make sure you have the necessary software dependencies (see the [readme](README.md))
For group ICA, `melodic` uses multi-session temporal concatenation. This will perform a single 2D ICA run on the concatenated data matrix (obtained by stacking all 2D data matrices of every single data set on top of each other).
![temporal concatenation](concat_diag.png)
This results in **high-dimensional** datasets!
Furthermore, with ICA we are typically only interested in a comparatively low-dimensional decomposition, so that we can capture spatially extended networks.
Therefore the first step is to reduce the dimensionality of the data. This can be achieved in a number of ways, but `melodic`, by default, uses `MIGP`.
> MIGP is an incremental approach that aims to provide a very close approximation to full temporal concatenation followed by PCA, but without the large memory requirements *(Smith et al., 2014)*.
Essentially, MIGP stacks the datasets incrementally in the temporal dimension, and whenever the temporal dimension exceeds a specified size, a PCA-based temporal reduction is performed.
> MIGP does not increase at all in memory requirement with increasing numbers of subjects, no large matrices are ever formed, and the computation time scales linearly with the number of subjects. It is easily parallelisable, simply by applying the approach in parallel to subsets of subjects, and then combining across these with the same “concatenate and reduce” approach described above *(Smith et al., 2014)*.
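To make the "concatenate and reduce" idea concrete, below is a minimal numpy sketch of the incremental stacking and PCA-based reduction (illustrative only: it is not the `melodic`/MIGP implementation, and the function name and array sizes are made up):
``` python
import numpy as np

def migp_sketch(datasets, n_keep=300):
    # Toy MIGP: incrementally stack (timepoints x voxels) matrices and,
    # whenever the temporal dimension grows beyond n_keep, reduce it again
    # with a PCA on the (small) temporal covariance matrix W @ W.T.
    # Assumes each dataset is already demeaned. Illustrative only.
    W = None
    for X in datasets:
        W = X if W is None else np.vstack([W, X])
        if W.shape[0] > n_keep:
            vals, vecs = np.linalg.eigh(W @ W.T)   # ascending eigenvalues
            W = vecs[:, -n_keep:].T @ W            # keep top n_keep components
    return W

# usage with random stand-in data for 5 "subjects"
rng = np.random.default_rng(0)
subjects = [rng.standard_normal((150, 5000)) for _ in range(5)]
print(migp_sketch(subjects, n_keep=300).shape)     # (300, 5000)
```
The memory cost is governed by the size of the retained matrix, not by the number of subjects, which is the point of the approach.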
## This notebook
This notebook will download an open fMRI dataset (~50MB) for use in the MIGP demo, regresses confounds from the data, performs spatial smoothing with 10mm FWHM, and then runs group `melodic` with `MIGP`.
This notebook will download an open fMRI dataset (~50MB) for use in the MIGP demo, regress confounds from the data, perform spatial smoothing with 10mm FWHM, and then run group `melodic` with `MIGP`.
* [Fetch the data](#download-the-data)
* [Clean the data](#clean-the-data)
* [Run `melodic`](#run-melodic)
* [Plot group ICs](#plot-group-ics)
Firstly we will import the necessary packages for this notebook:
%% Cell type:code id: tags:
``` python
from nilearn import datasets
from nilearn import image
from nilearn import plotting
import nibabel as nb
import numpy as np
import os.path as op
import os
import glob
import matplotlib.pyplot as plt
```
%% Cell type:markdown id: tags:
<a class="anchor" id="download-the-data"></a>
## Fetch the data
This data is a derivative from the [COBRE](http://fcon_1000.projects.nitrc.org/indi/retro/cobre.html) sample found in the International Neuroimaging Data-sharing Initiative, originally released under Creative Commons - Attribution Non-Commercial.
It comprises 10 preprocessed resting-state fMRI datasets selected from 72 patients diagnosed with schizophrenia and 74 healthy controls (6mm isotropic, TR=2s, 150 volumes).
Create a directory in the users home directory to store the downloaded data:
> **NOTE:** [`expanduser`](https://docs.python.org/3.7/library/os.path.html#os.path.expanduser) will expand the `~` to the be users home directory:
Create a directory in the current directory to store the downloaded data:
%% Cell type:code id: tags:
``` python
data_dir = op.expanduser('~/nilearn_data')
data_dir = './nilearn_data'
if not op.exists(data_dir):
os.makedirs(data_dir)
```
%% Cell type:markdown id: tags:
Download the data (if not already downloaded):
> **Note:** We use a method from [`nilearn`](https://nilearn.github.io/index.html) called [`fetch_cobre`](https://nilearn.github.io/modules/generated/nilearn.datasets.fetch_cobre.html) to download the fMRI data
> **Note:**
> 1. We use a method from [`nilearn`](https://nilearn.github.io/index.html) called [`fetch_cobre`](https://nilearn.github.io/modules/generated/nilearn.datasets.fetch_cobre.html) to download the fMRI data.
> 2. `fetch_cobre` has been deprecated in recent versions of `nilearn`. It will be replaced here in the near future. In the interim, the deprecation warning has been suppressed with `filterwarnings`.
%% Cell type:code id: tags:
``` python
# suppress deprecation warning
import warnings
warnings.filterwarnings("ignore")
# download dataset
d = datasets.fetch_cobre(data_dir=data_dir)
```
%% Cell type:markdown id: tags:
<a class="anchor" id="clean-the-data"></a>
## Clean the data
Regress confounds from the data and spatially smooth it with a Gaussian filter of 10mm FWHM.
> **Note:**
> 1. We use [`clean_img`](https://nilearn.github.io/modules/generated/nilearn.image.clean_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to regress confounds from the data
> 2. We use [`smooth_img`](https://nilearn.github.io/modules/generated/nilearn.image.smooth_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to spatially smooth the data
> 3. [`zip`](https://docs.python.org/3.3/library/functions.html#zip) takes iterables and aggregates them in a tuple. Here it is used to iterate through four lists simultaneously
> 4. We use list comprehension to loop through all the filenames and append suffixes
%% Cell type:code id: tags:
``` python
# Create a list of filenames for cleaned and smoothed data
clean = [f.replace('.nii.gz', '_clean.nii.gz') for f in d.func]
smooth = [f.replace('.nii.gz', '_clean_smooth.nii.gz') for f in d.func]
# loop through each subject, regress confounds and smooth
for img, cleaned, smoothed, conf in zip(d.func, clean, smooth, d.confounds):
print(f'{img}: regress confounds: ', end='')
image.clean_img(img, confounds=conf).to_filename(cleaned)
print(f'smooth.')
image.smooth_img(img, 10).to_filename(smoothed)
```
%% Cell type:markdown id: tags:
To run ```melodic``` we will need a brain mask in MNI152 space at the same resolution as the fMRI.
> **Note:**
> 1. We use [`load_mni152_brain_mask`](https://nilearn.github.io/modules/generated/nilearn.datasets.load_mni152_brain_mask.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to load the MNI152 mask
> 2. We use [`resample_to_img`](https://nilearn.github.io/modules/generated/nilearn.image.resample_to_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to resample the mask to the resolution of the fMRI
> 3. We use [`math_img`](https://nilearn.github.io/modules/generated/nilearn.image.math_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to binarize the resample mask
> 4. The mask is plotted using [`plot_anat`](https://nilearn.github.io/modules/generated/nilearn.plotting.plot_anat.html) from the [`nilearn`](https://nilearn.github.io/index.html) package
%% Cell type:code id: tags:
``` python
# load a single fMRI dataset (func0)
func0 = nb.load(d.func[0].replace('.nii.gz', '_clean_smooth.nii.gz'))
# load MNI152 brainmask, resample to func0 resolution, binarize, and save to nifti
mask = datasets.load_mni152_brain_mask()
mask = image.resample_to_img(mask, func0)
mask = image.math_img('img > 0.5', img=mask)
mask.to_filename(op.join(data_dir, 'brain_mask.nii.gz'))
# plot brainmask to make sure it looks OK
disp = plotting.plot_anat(image.index_img(func0, 0))
disp.add_contours(mask, threshold=0.5)
```
%% Cell type:markdown id: tags:
<a class="anchor" id="run-melodic"></a>
### Run ```melodic```
Generate a command line string and run group ```melodic``` on the smoothed fMRI with a dimension of 10 components:
> **Note**:
> 1. Here we use python [f-strings](https://www.python.org/dev/peps/pep-0498/), formally known as literal string interpolation, which allow for easy formatting
> 2. [`op.join`](https://docs.python.org/3.7/library/os.path.html#os.path.join) will join path strings using the platform-specific directory separator
> 3. [`','.join(smooth)`](https://docs.python.org/3/library/stdtypes.html#str.join) will create a comma-separated string of all the items in the list `smooth`
%% Cell type:code id: tags:
``` python
# generate melodic command line string
melodic_cmd = f"melodic -i {','.join(smooth)} --mask={op.join(data_dir, 'brain_mask.nii.gz')} -d 10 -v -o cobre.gica "
print(melodic_cmd)
```
%% Cell type:markdown id: tags:
> **Note:**
> 1. Here we use the `!` operator to execute the command in the shell
> 2. The `{}` will expand the contained python variable in the shell
%% Cell type:code id: tags:
``` python
# run melodic
! {melodic_cmd}
```
%% Cell type:markdown id: tags:
<a class="anchor" id="plot-group-ics"></a>
### Plot group ICs
Now we can load and plot the group ICs generated by ```melodic```.
This function will be used to plot ICs:
> **NOTE:**
> 1. Here we use [`plot_stat_map`](https://nilearn.github.io/modules/generated/nilearn.plotting.plot_stat_map.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to plot the orthographic images
> 2. [`subplots`](https://matplotlib.org/api/_as_gen/matplotlib.pyplot.subplots.html) from `matplotlib.pyplot` creates a figure and multiple subplots
> 3. [`find_xyz_cut_coords`](https://nilearn.github.io/modules/generated/nilearn.plotting.find_xyz_cut_coords.html) from the [`nilearn`](https://nilearn.github.io/index.html) package will find the image coordinates of the center of the largest activation connected component
> 4. [`zip`](https://docs.python.org/3.3/library/functions.html#zip) takes iterables and aggregates them in a tuple. Here it is used to iterate through two lists simultaneously
> 5. [`iter_img`](https://nilearn.github.io/modules/generated/nilearn.image.iter_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package creates an iterator from an image that steps through each volume/time-point of the image
%% Cell type:code id: tags:
``` python
def map_plot(d):
N = d.shape[-1]
fig, ax = plt.subplots(int(np.ceil((N/2))),2, figsize=(12, N))
for img, ax0 in zip(image.iter_img(d), ax.ravel()):
coord = plotting.find_xyz_cut_coords(img, activation_threshold=3.5)
plotting.plot_stat_map(img, cut_coords=coord, vmax=10, axes=ax0)
return fig
```
%% Cell type:markdown id: tags:
Hopefully you can see some familiar looking RSN spatial patterns:
%% Cell type:code id: tags:
``` python
# Load ICs
ics = nb.load('cobre.gica/melodic_IC.nii.gz')
# plot
fig = map_plot(ics)
```
......
# MIGP Demo
This demo is comprised of three notebooks:
1. `MIGP.ipynb` will introduce MIGP, download an open fMRI dataset for use in the MIGP demo, regress confounds from the data, perform spatial smoothing, and then run group melodic ICA.
2. `matlab_MIGP.ipynb` will perform Matlab MIGP dimension reduction, run group ICA, and then plot the group ICs.
3. `python_MIGP.ipynb` will perform python MIGP dimension reduction, run group ICA, and then plot the group ICs.
The `MIGP.ipynb` notebook will need to be run first to ensure that the necessary data is downloaded and setup.
## Running the notebooks
These notebooks can be run in the `fslpython` environment using:
```
fslpython -m notebook
```
## Dependencies
These notebooks require both `FSL` and `Matlab` to be installed.
### FSL
In addition to FSL you will need to install the python package `nilearn` into the `fslpython` environment with the following command:
> **Note**: `sudo` is required if FSL is installed in an admin/root protected location.
```
sudo $FSLDIR/fslpython/bin/conda install -n fslpython -c conda-forge nilearn
```
### Matlab
In addition to Matlab, you will need to know the location of the Matlab executable/binary. Common Matlab paths are shown below, but they may vary on your system:
```
# MacOS
/Applications/MATLAB_{ver}.app/bin/matlab
# Linux
/usr/local/matlab/{ver}/bin/matlab
```
\ No newline at end of file
......@@ -3,69 +3,71 @@
% not yet finally tweaked/evaluated, or published - please do not pass on.
% adapted by Sean Fitz, 2020
%% User Options (equivalent to melodic defaults)
INlist=dir('~/nilearn_data/cobre/fmri_*_smooth.nii.gz'); % list of input 4D NIFTI standard space datasets
INmask='~/nilearn_data/brain_mask.nii.gz'; % 3D NIFTI mask image
GO='matMIGP'; % output filename
dPCAint=299; % internal number of components - typically 2-4 times number of timepoints in each run (if you have enough RAM for that)
dPCAout=299; % number of eigenvectors to output - should be less than dPCAint and more than the final ICA dimensionality
sep_vn=false; % switch on separate variance nomalisation for each input dataset
%% Run MIGP
Nsub=length(INlist); [~,r]=sort(rand(Nsub,1)); % will process subjects in random order
mask=read_avw(INmask);
masksize=size(mask);
mask=reshape(mask,prod(masksize),1);
demean = @(x) x - repmat(mean(x,1),[size(x,1), 1, 1]);
for i = 1:Nsub
% read data
filename=[INlist(r(i)).folder, '/', INlist(r(i)).name];
fprintf('Reading data file %s\n', filename);
grot=double(read_avw(filename));
grot=reshape(grot,prod(masksize),size(grot,4));
% demean
fprintf('\tRemoving mean image\n');
grot=demean(grot(mask~=0,:)');
% var-norm separately for each subject
if sep_vn==true
fprintf('\tNormalising by voxel-wise variance\r');
[uu,ss,vv]=svds(grot,30);
vv(abs(vv)<2.3*std(vv(:)))=0;
stddevs=max(std(grot-uu*ss*vv'),0.001);
grot=grot./repmat(stddevs,size(grot,1),1);
end
if (i==1)
W=demean(grot); clear grot;
else
% concat
W=[W; demean(grot)]; clear grot;
% reduce W to dPCAint eigenvectors
if size(W,1)-10 > dPCAint
fprintf('\tReducing data matrix to a %i dimensional subspace\n', dPCAint);
[uu,dd]=eigs(W*W',dPCAint);
W=uu'*W;
clear uu;
function demo_MIGP_NIFTI(data_dir)
%% User Options (equivalent to melodic defaults)
INlist=dir([data_dir, '/cobre/fmri_*_smooth.nii.gz']); % list of input 4D NIFTI standard space datasets
INmask=[data_dir, '/brain_mask.nii.gz']; % 3D NIFTI mask image
GO='matMIGP'; % output filename
dPCAint=299; % internal number of components - typically 2-4 times number of timepoints in each run (if you have enough RAM for that)
dPCAout=299; % number of eigenvectors to output - should be less than dPCAint and more than the final ICA dimensionality
sep_vn=false; % switch on separate variance normalisation for each input dataset
%% Run MIGP
Nsub=length(INlist); [~,r]=sort(rand(Nsub,1)); % will process subjects in random order
mask=read_avw(INmask);
masksize=size(mask);
mask=reshape(mask,prod(masksize),1);
demean = @(x) x - repmat(mean(x,1),[size(x,1), 1, 1]);
for i = 1:Nsub
% read data
filename=[INlist(r(i)).folder, '/', INlist(r(i)).name];
fprintf('Reading data file %s\n', filename);
grot=double(read_avw(filename));
grot=reshape(grot,prod(masksize),size(grot,4));
% demean
fprintf('\tRemoving mean image\n');
grot=demean(grot(mask~=0,:)');
% var-norm separately for each subject
if sep_vn==true
fprintf('\tNormalising by voxel-wise variance\r');
[uu,ss,vv]=svds(grot,30);
vv(abs(vv)<2.3*std(vv(:)))=0;
stddevs=max(std(grot-uu*ss*vv'),0.001);
grot=grot./repmat(stddevs,size(grot,1),1);
end
if (i==1)
W=demean(grot); clear grot;
else
% concat
W=[W; demean(grot)]; clear grot;
% reduce W to dPCAint eigenvectors
if size(W,1)-10 > dPCAint
fprintf('\tReducing data matrix to a %i dimensional subspace\n', dPCAint);
[uu,dd]=eigs(W*W',dPCAint);
W=uu'*W;
clear uu;
end
end
end
end
%% Reshape and save
%% Reshape and save
grot=zeros(prod(masksize),dPCAout);
grot(mask~=0,:)=W(1:dPCAout,:)'; grot=reshape(grot,[masksize ,dPCAout]);
fprintf('Save to %s.nii.gz\n', GO)
save_avw(grot,GO,'f',[1 1 1 1]);
system(sprintf('fslcpgeom %s %s -d',filename,GO));
grot=zeros(prod(masksize),dPCAout);
grot(mask~=0,:)=W(1:dPCAout,:)'; grot=reshape(grot,[masksize ,dPCAout]);
fprintf('Save to %s.nii.gz\n', GO)
save_avw(grot,GO,'f',[1 1 1 1]);
system(sprintf('fslcpgeom %s %s -d',filename,GO));
......@@ -6,8 +6,13 @@
"source": [
"## Matlab MIGP\n",
"\n",
"This notebook will load the dimension reduced data from Matlab MIGP, run group ICA, and then plot the group ICs.\n",
"> **Before you run this notebook**: \n",
"> 1. Make sure you have the necessary software dependencies (see the [readme](README.md))\n",
"> 2. You must run [MIGP.ipynb](MIGP.ipynb) to ensure that the necessary data is downloaded and setup\n",
"\n",
"This notebook will perform *Matlab* MIGP dimension reduction, run group ICA, and then plot the group ICs.\n",
"\n",
"* [Run python `MIGP`](#run-matlab-migp)\n",
"* [Run `melodic`](#run-matlab-melodic)\n",
"* [Plot group ICs](#plot-matlab-group-ics)\n",
"\n",
......@@ -32,9 +37,51 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It will be necessary to know the location where the data was stored so that we can load the brainmask:\n",
"It will be necessary to know the location where the data was stored in [MIGP.ipynb](MIGP.ipynb)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"data_dir = './nilearn_data'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a class=\"anchor\" id=\"run-matlab-migp\"></a>\n",
"### Run Matlab `MIGP`\n",
"\n",
"The code for Matlab MIGP is in the `demo_MIGP_NIFTI.m` m-file in the current directory.\n",
"\n",
"We are going to run Matlab MIGP from the command line using the `-batch` option. You will need to know the path to the Matlab executable (see the [readme](README.md))\n",
"\n",
"> **IMPORTANT:** Make sure you change the `MATLAB_EXE` string to be appropriate for your system.\n",
"\n",
"> **Note**: \n",
"> 1. Here we use python [f-strings](https://www.python.org/dev/peps/pep-0498/), formally known as literal string interpolation, which allow for easy formatting\n",
"> 2. Here we use the `!` operator to execute the command in the shell\n",
"> 3. The `{}` will expand the contained python variable in the shell"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Generate Matlab MIGP command line string\n",
"\n",
"# Change the MATLAB_EXE to point to the Matlab executable on your system \n",
"MATLAB_EXE = '/Applications/MATLAB_R2019b.app/bin/matlab'\n",
"\n",
"CMD = f'{MATLAB_EXE} -batch \\\"demo_MIGP_NIFTI(\\'{data_dir}\\')\\\"'\n",
"\n",
"> **Note**: [`expanduser`](https://docs.python.org/3.7/library/os.path.html#os.path.expanduser) will expand the `~` to the be users home directory"
"! echo {CMD}"
]
},
{
......@@ -43,7 +90,8 @@
"metadata": {},
"outputs": [],
"source": [
"data_dir = op.expanduser('~/nilearn_data')"
"# Run Matlab command line string\n",
"! {CMD}"
]
},
{
......
%% Cell type:markdown id: tags:
## Matlab MIGP
This notebook will load the dimension reduced data from Matlab MIGP, run group ICA, and then plot the group ICs.
> **Before you run this notebook**:
> 1. Make sure you have the necessary software dependencies (see the [readme](README.md))
> 2. You must run [MIGP.ipynb](MIGP.ipynb) to ensure that the necessary data is downloaded and setup
This notebook will perform *Matlab* MIGP dimension reduction, run group ICA, and then plot the group ICs.
* [Run Matlab `MIGP`](#run-matlab-migp)
* [Run `melodic`](#run-matlab-melodic)
* [Plot group ICs](#plot-matlab-group-ics)
Firstly we will import the necessary packages for this notebook:
%% Cell type:code id: tags:
``` python
from nilearn import plotting
from nilearn import image
import nibabel as nb
import matplotlib.pyplot as plt
import numpy as np
import os.path as op
```
%% Cell type:markdown id: tags:
It will be necessary to know the location where the data was stored so that we can load the brainmask:
It will be necessary to know the location where the data was stored in [MIGP.ipynb](MIGP.ipynb).
%% Cell type:code id: tags:
``` python
data_dir = './nilearn_data'
```
%% Cell type:markdown id: tags:
<a class="anchor" id="run-matlab-migp"></a>
### Run Matlab `MIGP`
The code for Matlab MIGP is in the `demo_MIGP_NIFTI.m` m-file in the current directory.
We are going to run Matlab MIGP from the command line using the `-batch` option. You will need to know the path to the Matlab executable (see the [readme](README.md))
> **Note**: [`expanduser`](https://docs.python.org/3.7/library/os.path.html#os.path.expanduser) will expand the `~` to the be users home directory
> **IMPORTANT:** Make sure you change the `MATLAB_EXE` string to be appropriate for your system.
> **Note**:
> 1. Here we use python [f-strings](https://www.python.org/dev/peps/pep-0498/), formally known as literal string interpolation, which allow for easy formatting
> 2. Here we use the `!` operator to execute the command in the shell
> 3. The `{}` will expand the contained python variable in the shell
%% Cell type:code id: tags:
``` python
# Generate Matlab MIGP command line string
# Change the MATLAB_EXE to point to the Matlab executable on your system
MATLAB_EXE = '/Applications/MATLAB_R2019b.app/bin/matlab'
CMD = f'{MATLAB_EXE} -batch \"demo_MIGP_NIFTI(\'{data_dir}\')\"'
! echo {CMD}
```
%% Cell type:code id: tags:
``` python
data_dir = op.expanduser('~/nilearn_data')
# Run Matlab command line string
! {CMD}
```
%% Cell type:markdown id: tags:
<a class="anchor" id="run-matlab-melodic"></a>
### Run ```melodic```
Generate a command line string and run group ```melodic``` on the Matlab MIGP dimension reduced data with a dimension of 10 components. We disable MIGP because it was already run separately in Matlab.
> **Note**:
> 1. Here we use python [f-strings](https://www.python.org/dev/peps/pep-0498/), formally known as literal string interpolation, which allow for easy formatting
> 2. [`op.join`](https://docs.python.org/3.7/library/os.path.html#os.path.join) will join path strings using the platform-specific directory separator
%% Cell type:code id: tags:
``` python
# generate melodic command line string
melodic_cmd = f"melodic -i matMIGP.nii.gz --mask={op.join(data_dir, 'brain_mask.nii.gz')} -d 10 -v --nobet --disableMigp -o matmigp.gica"
print(melodic_cmd)
```
%% Cell type:markdown id: tags:
> **Note:**
> 1. Here we use the `!` operator to execute the command in the shell
> 2. The `{}` will expand the contained python variable in the shell
%% Cell type:code id: tags:
``` python
# run melodic
! {melodic_cmd}
```
%% Cell type:markdown id: tags:
<a class="anchor" id="plot-matlab-group-ics"></a>
### Plot group ICs
Now we can load and plot the group ICs generated by ```melodic```.
This function will be used to plot ICs:
> **NOTE:**
> 1. Here we use [`plot_stat_map`](https://nilearn.github.io/modules/generated/nilearn.plotting.plot_stat_map.html) from the [`nilearn`](https://nilearn.github.io/index.html) package to plot the orthographic images
> 2. [`subplots`](https://matplotlib.org/api/_as_gen/matplotlib.pyplot.subplots.html) from `matplotlib.pyplot` creates a figure and multiple subplots
> 3. [`find_xyz_cut_coords`](https://nilearn.github.io/modules/generated/nilearn.plotting.find_xyz_cut_coords.html) from the [`nilearn`](https://nilearn.github.io/index.html) package will find the image coordinates of the center of the largest activation connected component
> 4. [`zip`](https://docs.python.org/3.3/library/functions.html#zip) takes iterables and aggregates them in a tuple. Here it is used to iterate through two lists simultaneously
> 5. [`iter_img`](https://nilearn.github.io/modules/generated/nilearn.image.iter_img.html) from the [`nilearn`](https://nilearn.github.io/index.html) package creates an iterator from an image that steps through each volume/time-point of the image
%% Cell type:code id: tags:
``` python
def map_plot(d):
N = d.shape[-1]
fig, ax = plt.subplots(int(np.ceil((N/2))),2, figsize=(12, N))
for img, ax0 in zip(image.iter_img(d), ax.ravel()):
coord = plotting.find_xyz_cut_coords(img, activation_threshold=3.5)
plotting.plot_stat_map(img, cut_coords=coord, vmax=10, axes=ax0)
return fig
```
%% Cell type:markdown id: tags:
Hopefully you can see some familiar looking RSN spatial patterns:
%% Cell type:code id: tags:
``` python
ics = nb.load('matmigp.gica/melodic_IC.nii.gz')
fig = map_plot(ics)
```
......
......@@ -6,6 +6,11 @@
"source": [
"## Python MIGP\n",
"\n",
"> **Before you run this notebook**: \n",
"> 1. Make sure you have the necessary software dependencies (see the [readme](README.md))\n",
"> 2. You must run [MIGP.ipynb](MIGP.ipynb) to ensure that the necessary data is downloaded and setup\n",
"\n",
"\n",
"This notebook will perform *python* MIGP dimension reduction, run group ICA, and then plot the group ICs.\n",
"\n",
"* [Run python `MIGP`](#run-python-migp)\n",
......@@ -36,9 +41,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"It will be necessary to know the location where the data was stored so that we can load the brainmask. \n",
"\n",
"> **Note**: [`expanduser`](https://docs.python.org/3.7/library/os.path.html#os.path.expanduser) will expand the `~` to the be users home directory"
"It will be necessary to know the location where the data was stored in [MIGP.ipynb](MIGP.ipynb)."
]
},
{
......@@ -47,7 +50,7 @@
"metadata": {},
"outputs": [],
"source": [
"data_dir = op.expanduser('~/nilearn_data')"
"data_dir = './nilearn_data'"
]
},
{
......@@ -252,13 +255,6 @@
"ics = nb.load('pymigp.gica/melodic_IC.nii.gz')\n",
"fig = map_plot(ics)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],