Commit 1345f65a authored by Christoph Arthofer

Initial commit
# Default ignored files
/shelf/
/workspace.xml
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml
# Editor-based HTTP Client requests
/httpRequests/
<component name="InspectionProjectProfileManager">
<settings>
<option name="USE_PROJECT_PROFILE" value="false" />
<version value="1.0" />
</settings>
</component>
\ No newline at end of file
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectRootManager" version="2" project-jdk-name="Python 3.7 (chris_env)" project-jdk-type="Python SDK" />
</project>
\ No newline at end of file
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/py_submit.iml" filepath="$PROJECT_DIR$/.idea/py_submit.iml" />
</modules>
</component>
</project>
\ No newline at end of file
<?xml version="1.0" encoding="UTF-8"?>
<module type="PYTHON_MODULE" version="4">
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$" />
<orderEntry type="jdk" jdkName="Python 3.7 (chris_env)" jdkType="Python SDK" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
</module>
\ No newline at end of file
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>
\ No newline at end of file
# Multimodal template construction
## Setup - Create a new Python 3.7 Conda environment or clone an existing one
Make sure the latest versions of the following packages are installed. Installation can be performed with:
```bash
conda create -n tempy_env python=3.7 pandas numpy
conda activate tempy_env
conda install -c conda-forge nibabel
conda install -c conda-forge fslpy
pip install file-tree file-tree-fsl
```
The MMORF Singularity image has to be downloaded and its path exported:
```bash
export MMORFDIR=/path/to/mmorf.sif
```
Job submission to a computing cluster is currently based on FSL's `fsl_sub` command-line tool,
which assumes that the cluster is managed by SGE.
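Before submitting any jobs it can help to confirm that the environment is complete. The snippet below is a minimal sketch, not part of the pipeline; it assumes the package names from the setup above, that `singularity` and `fsl_sub` are on the `PATH`, and that `MMORFDIR` has been exported as shown.
```python
# Minimal pre-flight check (sketch): MMORF image, cluster tools and Python dependencies.
import os
import shutil
from pathlib import Path

mmorf = os.environ.get("MMORFDIR", "")
assert mmorf and Path(mmorf).is_file(), "MMORFDIR does not point to an existing mmorf.sif"
assert shutil.which("singularity") is not None, "singularity not found on PATH"
assert shutil.which("fsl_sub") is not None, "fsl_sub not found on PATH (SGE-managed cluster assumed)"

# These imports should succeed inside the Conda environment created above
import pandas, numpy, nibabel, fsl, file_tree  # noqa

print("Environment looks OK")
```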
## Running the template construction pipeline
Clone this repository and run the pipeline:
```bash
git clone https://git.fmrib.ox.ac.uk/cart/ms_templates.git
cd ms_templates
python run_template_construction.py [Options]
```
### Options
#### `-h, --help`
Show help message
#### `-i <dir>, --input <dir>`
Directory containing the defaced subjects/timepoints
#### `-t <path>, --tree <path>`
Path to the FSL FileTree describing the subject-specific directory structure and the selection of modalities.
The input file structure of the data (i.e. of a timepoint or a subject) has to be defined in the FSL FileTree located at `./tree/data.tree`.
To construct a combined T1, T2 and DTI template, the locations and filenames of, e.g., the whole-head images, the brain-extracted images and the brain masks relative to
the timepoint's directory have to be specified. In the template construction code these files are accessed through FileTree keys,
which must not be altered or removed.

- Required keys for T1: `T1_head`, `T1_brain`, `T1_brain_mask`
- Required keys for T1+T2: `T1_head`, `T1_brain`, `T1_brain_mask`, `T2_head`, `T2_brain`
- Required keys for T1+T2+DTI: `T1_head`, `T1_brain`, `T1_brain_mask`, `T2_head`, `T2_brain`, `DTI_scalar`, `DTI_tensor`

For example, to construct a combined T1, T2 and DTI template, `./tree/data.tree` would look like:
```bash
{sub_id}
    anat
        T1
            T1.nii.gz (T1_head)
            T1_brain.nii.gz (T1_brain)
            T1_brain_mask.nii.gz (T1_brain_mask)
        T2
            T2.nii.gz (T2_head)
            T2_brain.nii.gz (T2_brain)
        DTI
            dti_S0.nii.gz (DTI_scalar)
            dti_tensor.nii.gz (DTI_tensor)
```
For a T1-only template this would look like:
```bash
{sub_id}
    anat
        T1
            T1.nii.gz (T1_head)
            T1_brain.nii.gz (T1_brain)
            T1_brain_mask.nii.gz (T1_brain_mask)
```
`{sub_id}` is a placeholder for the timepoint directories and will automatically be replaced with:
- the IDs in the CSV file provided with `--subids <path>`, or
- the names of the sub-directories of `--input <dir>` if the `--subids <path>` flag is not provided.
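To illustrate how these keys resolve to actual file paths, here is a short sketch using the `file-tree` Python package installed during setup; the input directory and subject ID are placeholders, and the printed paths correspond to the example tree above.
```python
# Sketch: resolving FileTree keys to file paths (placeholder directory and subject ID).
from file_tree import FileTree

tree = FileTree.read("tree/data.tree", top_level="/path/to/input")
subject_tree = tree.update(sub_id="subj01")  # fills the {sub_id} placeholder
for key in ("T1_head", "T1_brain", "T1_brain_mask"):
    # e.g. T1_head -> /path/to/input/subj01/anat/T1/T1.nii.gz with the example tree above
    print(key, "->", subject_tree.get(key))
```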
#### `-o <dir>, --output <dir>`
Output directory
#### `-s <path>, --subids <path>`
Path to a .csv file containing one subject ID per row; the subject IDs have to identify the sub-directories of the
`--input <dir>` argument (optional).
If not provided, all sub-directories of `--input <dir>` will be used as subject IDs.
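If you want to generate such a file from an existing input directory, the following sketch (with a placeholder path) writes one ID per row using the sub-directory names:
```python
# Sketch: build subids.csv from the sub-directories of the input directory (placeholder path).
from pathlib import Path
import pandas as pd

input_dir = Path("/full/path/to/subject/containing/timepoint/directories/")
sub_ids = sorted(p.name for p in input_dir.iterdir() if p.is_dir())
pd.Series(sub_ids).to_csv("subids.csv", index=False, header=False)
```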
#### `-aff [True,False], --affine [True,False]`
Whether to run affine template construction (`True`/`False`)
#### `-nln [True,False], --nonlinear [True,False]`
Whether to run nonlinear template construction (`True`/`False`)
#### `-c <string>, --cpuq <string>`
Name of a cluster queue to submit CPU jobs to (required for affine and nonlinear template construction)
#### `-g <string>, --gpuq <string>`
Name of a cluster queue to submit GPU jobs to (required for nonlinear template construction)
#### `-nres <int>, --n_resolutions <int>`
Number of resolution levels (required for nonlinear template construction)
#### `-nit <int>, --n_iterations <int>`
Number of iterations per resolution level (required for nonlinear template construction)
### Example
A typical command for within-subject template construction given multiple timepoints per subject would look like this:
```bash
python run_template_construction.py \
    -i /full/path/to/subject/containing/timepoint/directories/ \
    -t ./tree/MS_template.tree \
    -o /full/path/to/output/directory/ \
    -aff True \
    -nln True \
    -nres 6 \
    -nit 1 \
    --cpuq short.qc \
    --gpuq gpu8.q
```
ext_nii=.nii.gz
ext_mat=.mat
ext_ini=.ini
ext_txt=.txt
{data_dir}
    ->data (data)
{template_dir}
    misc (misc_dir)
        ident{ext_mat} (identity_mat)
    log (log_dir)
    scripts (script_dir)
    clamped_T1 (clamped_dir)
        {sub_id}_T1_clamped{ext_nii} (T1_head_clamped)
    affine_it1 (affine_it1_dir)
        {sub_id} (affine_it1_subject_dir)
            T1_to_{ref_id}{ext_mat} (T1_to_ref_mat)
            T1_to_{ref_id}{ext_nii} (T1_to_ref_img)
            T2_to_T1{ext_mat} (T2_to_T1_mat)
            DTI_to_T2{ext_mat} (DTI_to_T2_mat)
            T1_to_unbiased{ext_mat} (T1_to_unbiased_mat)
            T1_to_unbiased{ext_nii} (T1_to_unbiased_img)
            T2_to_unbiased{ext_mat} (T2_to_unbiased_mat)
            DTI_to_unbiased{ext_mat} (DTI_to_unbiased_mat)
        T1_unbiased_affine_template{ext_nii} (T1_unbiased_affine_template)
        T1_unbiased_affine_template_to_MNI{ext_nii} (T1_unbiased_affine_template_to_MNI_img)
        T1_unbiased_affine_template_to_MNI{ext_mat} (T1_unbiased_affine_template_to_MNI_mat)
        unbiasing_matrix_inverse{ext_mat} (T1_unbiasing_affine_matrix)
    affine_it2 (affine_it2_dir)
        {sub_id} (affine_it2_subject_dir)
            T1_to_MNI152{ext_mat} (T1_to_MNI_mat)
            T1_brain_to_MNI152{ext_nii} (T1_brain_to_MNI_img)
            T1_head_to_MNI152{ext_nii} (T1_head_to_MNI_img)
            T1_brain_mask_to_MNI152{ext_nii} (T1_brain_mask_to_MNI_img)
            T2_FLAIR_to_MNI152{ext_mat} (T2_to_MNI_mat)
            T2_FLAIR_brain_to_MNI152{ext_nii} (T2_brain_to_MNI_img)
            T2_FLAIR_head_to_MNI152{ext_nii} (T2_head_to_MNI_img)
            DTI_to_MNI152{ext_mat} (DTI_to_MNI_mat)
            DTI_to_MNI152{ext_nii} (DTI_to_MNI_img)
            DTI_tensor_to_MNI152{ext_nii} (DTI_tensor_to_MNI)
        affine_template_T1_brain{ext_nii} (T1_brain_affine_template)
        affine_template_T1_brain_mask{ext_nii} (T1_brain_mask_affine_template)
        affine_template_T1_brain_mask_weighted{ext_nii} (T1_brain_mask_weighted_affine_template)
        affine_template_T1_head{ext_nii} (T1_head_affine_template)
        affine_template_T2_head{ext_nii} (T2_head_affine_template)
        affine_template_DTI{ext_nii} (DTI_affine_template)
        affine_template_DTI_tensor{ext_nii} (DTI_tensor_affine_template)
    nln_step_{step_id} (nln_step_dir)
        mmorf_params_step{step_id}{ext_ini} (mmorf_params)
        iteration_{it_id} (nln_step_iteration_dir)
            {sub_id}_to_template_warp{ext_nii} (mmorf_warp)
            {sub_id}_to_template_warp_resampled{ext_nii} (mmorf_warp_resampled)
            {sub_id}_to_template_warp_resampled_unbiased{ext_nii} (mmorf_warp_resampled_unbiased)
            {sub_id}_to_template_warp_resampled_unbiased_full_T1_brain{ext_nii} (mmorf_warp_resampled_unbiased_full_T1brain)
            {sub_id}_to_template_warp_resampled_unbiased_full_T1_head{ext_nii} (mmorf_warp_resampled_unbiased_full_T1head)
            {sub_id}_to_template_warp_resampled_unbiased_full_T2{ext_nii} (mmorf_warp_resampled_unbiased_full_T2)
            {sub_id}_to_template_warp_resampled_unbiased_full_DTI{ext_nii} (mmorf_warp_resampled_unbiased_full_DTI)
            {sub_id}_to_template_jac{ext_nii} (mmorf_jac)
            {sub_id}_to_template_bias (mmorf_bias)
            average_warp{ext_nii} (avg_warp)
            average_invwarp{ext_nii} (inv_avg_warp)
            {sub_id}_to_template_T1_brain{ext_nii} (warped_T1brain)
            {sub_id}_to_template_T1_head{ext_nii} (warped_T1head)
            {sub_id}_to_template_T2_head{ext_nii} (warped_T2head)
            {sub_id}_to_template_DTI_scalar{ext_nii} (warped_DTIscalar)
            {sub_id}_to_template_DTI_tensor{ext_nii} (warped_DTItensor)
            {sub_id}_to_template_T1_brain_mask{ext_nii} (warped_T1brain_mask)
            nln_template_T1_brain_{step_id}_{it_id}{ext_nii} (T1_brain_nln_template)
            nln_template_T1_brain_mask_{step_id}_{it_id}{ext_nii} (T1_brain_mask_nln_template)
            nln_template_T1_brain_mask_weighted_{step_id}_{it_id}{ext_nii} (T1_brain_mask_weighted_nln_template)
            nln_template_T1_brain_SD_{step_id}_{it_id}{ext_nii} (T1_brain_SD_nln_template)
            nln_template_T1_head_{step_id}_{it_id}{ext_nii} (T1_head_nln_template)
            nln_template_T2_head_{step_id}_{it_id}{ext_nii} (T2_head_nln_template)
            nln_template_DTI_{step_id}_{it_id}{ext_nii} (DTI_nln_template)
            nln_template_DTI_tensor_{step_id}_{it_id}{ext_nii} (DTI_tensor_nln_template)
            nln_RMS_delta_intensity_{step_id}_{it_id}{ext_txt} (delta_intensity_output)
            nln_RMS_delta_avgwarp_{step_id}_{it_id}{ext_txt} (delta_avgwarp_output)
            nln_RMS_intensity_sd_{step_id}_{it_id}{ext_txt} (sd_intensity_output)
ext_nii=.nii.gz
ext_mat=.mat
{sub_id}
    anat
        T1_orig{ext_nii} (T1_head)
        T1_brain{ext_nii} (T1_brain)
        T1_brain_mask{ext_nii} (T1_brain_mask)