Compare revisions

Changes are shown as if the source revision was being merged into the target revision.

Showing 1149 additions and 457 deletions
"""
Easy format to define input/output files in a python pipeline.
"""Easy format to define input/output files in a python pipeline.
.. warning::
File-tree is now an independent Python library, and will eventually be
removed from ``fslpy`` - visit the `file-tree` :ref:`API documentation
<https://open.win.ox.ac.uk/pages/fsl/file-tree/>`_ for more details.
The goal is to separate the definition of the input/output filenames from the actual code
by defining a directory tree (i.e., FileTree) in a separate file from the code.
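A minimal, hedged sketch of how such a tree file is then used from Python (the
tree file name, directory, and template short names below are illustrative):

.. code-block:: python

    from fsl.utils.filetree import FileTree

    # Load a tree definition stored in "data.tree", fixing the "subject"
    # variable, then resolve a template defined in that file
    tree = FileTree.read('data.tree', directory='/my/data', subject='01')
    t1w  = tree.get('T1w')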
......@@ -177,7 +181,7 @@ which amongst others refers to
Example pipeline
----------------
A very simple pipeline to run BET on every subject can start with a simply FileTree like
A very simple pipeline to run BET on every subject can start with a FileTree like
::
{subject}
......@@ -200,6 +204,12 @@ Assuming that the input T1w's already exist, we can then simply run BET for ever
# make_dir=True ensures that the output directory containing the "bet_output" actually exists
bet(input=T1w_tree.get('T1w'), output=T1w_tree.get('bet_output', make_dir=True), mask=True)
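To apply this to every subject found on disk, a hedged sketch using
``get_all_trees`` (assuming the tree definition was loaded into ``tree`` via
``FileTree.read``, and ``bet`` is the same wrapper function as above) could be:

.. code-block:: python

    for T1w_tree in tree.get_all_trees('T1w', glob_vars='all'):
        bet(input=T1w_tree.get('T1w'),
            output=T1w_tree.get('bet_output', make_dir=True),
            mask=True)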
Useful tips
-----------
Changing directory structure
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If our input files change later on, because for some subjects we added a second session, we can keep our script
and simply update the FileTree:
::
......@@ -207,8 +217,8 @@ and simply update the FileTree:
{subject}
[ses-{session}]
T1w.nii.gz
T1w_brain.nii.gz (bed_output)
T1w_brain_mask.nii.gz (bed_mask)
T1w_brain.nii.gz (bet_output)
T1w_brain_mask.nii.gz (bet_mask)
Note the square brackets around the session sub-directory. This indicates that this sub-directory is optional and
will only be present if the "session" variable is defined (see `Optional variables`_).
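As a hedged illustration ("my.tree" is a hypothetical file containing the tree
above), the same template resolves with or without the optional variable:

.. code-block:: python

    tree = FileTree.read('my.tree', subject='01')
    tree.get('T1w')                       # 01/T1w.nii.gz
    tree.update(session='A').get('T1w')   # 01/ses-A/T1w.nii.gz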
......@@ -230,17 +240,24 @@ altering this behaviour is again as simple as altering the FileTree to something
::
raw_data
{subject}
[ses-{session}]
{subject} (input_subject_dir)
[ses-{session}] (input_session_dir)
T1w.nii.gz
processed_data
{subject}
[ses-{session}]
{subject} (output_subject_dir)
[ses-{session}] (output_session_dir)
bet
{subject}[_{session}]_T1w_brain.nii.gz (bet_output)
{subject}[_{session}]_T1w_brain_mask.nii.gz (bet_mask)
Note that we also encoded the subject and session ID in the output filename.
We also have to explicitly assign short names to the subject and session directories,
even though we don't reference these directly in the script.
The reason is that each directory and filename template must have a unique short name, and
in this case the default short names ("{subject}" and "[ses-{session}]", respectively) would not have been unique.
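A hedged sketch of how these short names are used ("my.tree" is again a
hypothetical file containing the tree above):

.. code-block:: python

    tree = FileTree.read('my.tree', subject='01', session='A')
    tree.get('input_subject_dir')   # raw_data/01
    tree.get('bet_output')          # processed_data/01/ses-A/bet/01_A_T1w_brain.nii.gz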
Output "basenames"
^^^^^^^^^^^^^^^^^^
Some tools, like FSL's FAST, produce many output files. Rather than entering all
of these files in our FileTree by hand, you can add them all at once by including `Sub-trees`_:
......@@ -248,12 +265,12 @@ of these files in our FileTree by hand you can include them all at once by inclu
::
raw_data
{subject}
[ses-{session}]
{subject} (input_subject_dir)
[ses-{session}] (input_session_dir)
T1w.nii.gz
processed_data
{subject}
[ses-{session}]
{subject} (output_subject_dir)
[ses-{session}] (output_session_dir)
bet
{subject}[_{session}]_T1w_brain.nii.gz (bet_output)
{subject}[_{session}]_T1w_brain_mask.nii.gz (bet_mask)
......@@ -272,6 +289,35 @@ Within the script we can generate the fast output by running
The output files will be available as `T1w_tree.get('segment/<variable name>')`, where `<variable name>` is one
of the short variable names defined in the
`FAST FileTree <https://git.fmrib.ox.ac.uk/fsl/fslpy/blob/master/fsl/utils/filetree/trees/fast.tree>`_.
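For illustration only (the real short names are defined in the linked
``fast.tree`` file; ``pveseg`` below is a hypothetical example):

.. code-block:: python

    pveseg = T1w_tree.get('segment/pveseg')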
Running a pipeline on a subset of participants/sessions/runs
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Suppose you want to run your pipeline on a subset of your data while testing.
You may want to do this if your data has a hierarchy of variables (e.g. participant, session, run), as in the example below.
::
sub-001
ses-01
sub-001_ses-01_run-1.feat
sub-001_ses-01_run-2.feat
ses-02
sub-{participant}_ses-{session}_run-{run}.feat (feat_dir)
...
sub-002
sub-003
...
You can update the FileTree with one or more variables before calling `get_all_trees` as follows:
.. code-block:: python
for participant in ("001", "002"):
for t in tree.update(participant=participant, run="1").get_all_trees("feat_dir", glob_vars="all"):
my_pipeline(t)
This code will iterate over all sessions that have a run="1" for participants "001" and "002".
"""
__author__ = 'Michiel Cottaar <Michiel.Cottaar@ndcn.ox.ac.uk>'
......@@ -279,3 +325,13 @@ __author__ = 'Michiel Cottaar <Michiel.Cottaar@ndcn.ox.ac.uk>'
from .filetree import FileTree, register_tree, MissingVariable
from .parse import tree_directories, list_all_trees
from .query import FileTreeQuery
import fsl.utils.deprecated as deprecated
deprecated.warn('fsl.utils.filetree',
stacklevel=2,
vin='3.6.0',
rin='4.0.0',
msg='The filetree package is now released as a separate '
'Python library ("file-tree" on PyPi), and will be '
'removed in a future version of fslpy.')
......@@ -131,7 +131,7 @@ class FileTreeQuery(object):
# An ND array for this short
# name. Each element is a
# Match object, or nan.
matcharray = np.zeros(tvarlens, dtype=np.object)
matcharray = np.zeros(tvarlens, dtype=object)
matcharray[:] = np.nan
# indices into the match array
......
......@@ -2,19 +2,22 @@ ext=.nii.gz
dataset_description.json
participants.tsv
README
CHANGES
README (readme)
CHANGES (changes)
LICENSE (license)
genetic_info.json
sub-{participant}
[ses-{session}]
sub-{participant}_sessions.tsv (sessions_tsv)
anat (anat_dir)
sub-{participant}[_ses-{session}][_acq-{acq}][_rec-{rec}][_run-{run_index}]_{modality}{ext} (anat_image)
sub-{participant}[_ses-{session}][_acq-{acq}][_ce-{ce}][_rec-{rec}][_run-{run_index}]_{modality}{ext} (anat_image)
sub-{participant}[_ses-{session}][_acq-{acq}][_ce-{ce}][_rec-{rec}][_run-{run_index}][_mod-{modality}]_defacemask{ext} (anat_deface)
func (func_dir)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_rec-{rec}][_run-{run_index}]_bold{ext} (task_image)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_rec-{rec}][_run-{run_index}]_sbref{ext} (task_sbref)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_rec-{rec}][_run-{run_index}]_events.tsv (task_events)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_rec-{rec}][_run-{run_index}][_recording-{recording}]_physio{ext} (task_physio)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_rec-{rec}][_run-{run_index}][_recording-{recording}]_stim{ext} (task_stim)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_ce-{ce}][_dir-{dir}][_rec-{rec}][_run-{run_index}][_echo-{echo}]_bold{ext} (task_image)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_ce-{ce}][_dir-{dir}][_rec-{rec}][_run-{run_index}][_echo-{echo}]_sbref{ext} (task_sbref)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_ce-{ce}][_dir-{dir}][_rec-{rec}][_run-{run_index}][_echo-{echo}]_events.tsv (task_events)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_ce-{ce}][_dir-{dir}][_rec-{rec}][_run-{run_index}][_echo-{echo}][_recording-{recording}]_physio.tsv.gz (task_physio)
sub-{participant}[_ses-{session}]_task-{task}[_acq-{acq}][_ce-{ce}][_dir-{dir}][_rec-{rec}][_run-{run_index}][_echo-{echo}][_recording-{recording}]_stim.tsv.gz (task_stim)
dwi (dwi_dir)
sub-{participant}[_ses-{session}][_acq-{acq}][_run-{run_index}]_dwi{ext} (dwi_image)
sub-{participant}[_ses-{session}][_acq-{acq}][_run-{run_index}]_dwi.bval (bval)
......@@ -28,3 +31,15 @@ sub-{participant}
sub-{participant}[_ses-{session}][_acq-{acq}][_run-{run_index}]_phase2{ext} (fmap_phase2)
sub-{participant}[_ses-{session}][_acq-{acq}][_run-{run_index}]_fieldmap{ext} (fmap)
sub-{participant}[_ses-{session}][_acq-{acq}]_dir-{dir}[_run-{run_index}]_epi{ext} (fmap_epi)
meg (meg_dir)
sub-{participant}[_ses-{session}]_task-{task}[_run-{run}][_proc-{proc}]_meg.{meg_ext} (meg)
eeg (eeg_dir)
sub-{participant}[_ses-{session}]_task-{task}[_run-{run}][_proc-{proc}]_eeg.{eeg_ext} (eeg)
ieeg (ieeg_dir)
sub-{participant}[_ses-{session}]_task-{task}[_run-{run}][_proc-{proc}]_ieeg.{ieeg_ext} (ieeg)
beh (behavioral_dir)
sub-{participant}[_ses-{session}]_task-{task}_events.tsv (behavioural_events)
sub-{participant}[_ses-{session}]_task-{task}_beh.tsv (behavioural)
sub-{participant}[_ses-{session}]_task-{task}_physio.tsv.gz (behavioural_physio)
sub-{participant}[_ses-{session}]_task-{task}_stim.tsv.gz (behavioral_stim)
......@@ -12,3 +12,10 @@ basename = dti
{basename}_L3.nii.gz (L3)
{basename}_kurt.nii.gz (kurt)
{basename}_kurt1.nii.gz (kurt1)
{basename}_kurt2.nii.gz (kurt2)
{basename}_kurt3.nii.gz (kurt3)
{basename}_sse.nii.gz (sse)
{basename}_cnicope.nii.gz (cnicope)
{basename}_tensor.nii.gz (tensor)
struct = T1
basename = fsl_anat
{basename} (basename)
{basename}.anat (fsl_anat_dir)
lesionmaskinv.nii.gz
lesionmask.nii.gz
log.txt
MNI152_{struct}_2mm_brain_mask_dil1.nii.gz
MNI_to_{struct}_nonlin_field.nii.gz
{struct}2std_skullcon.mat
{struct}_biascorr_bet_skull.nii.gz (biascorr_bet_skull)
{struct}_biascorr_brain_mask.nii.gz (biascorr_brain_mask)
{struct}_biascorr_brain.nii.gz (biascorr_brain)
{struct}_biascorr.nii.gz (biascorr)
{struct}_fast_bias_idxmask.nii.gz
{struct}_fast_bias_init.nii.gz
{struct}_fast_bias.nii.gz
{struct}_fast_bias_vol2.nii.gz
{struct}_fast_bias_vol32.nii.gz
{struct}_fast_restore.nii.gz
{struct}_fast_seg.nii.gz (fast_seg)
{struct}_fast_pve_0.nii.gz
{struct}_fast_pve_1.nii.gz
{struct}_fast_pve_2.nii.gz
{struct}_fast_pveseg.nii.gz (fast_pveseg)
{struct}_fast_totbias.nii.gz
{struct}_fullfov.nii.gz
{struct}_initfast2_brain_mask2.nii.gz
{struct}_initfast2_brain_mask.nii.gz
{struct}_initfast2_brain.nii.gz
{struct}_initfast2_maskedrestore.nii.gz
{struct}_initfast2_restore.nii.gz
{struct}.nii.gz
{struct}_nonroi2roi.mat
{struct}_orig2roi.mat
{struct}_orig2std.mat
{struct}_orig.nii.gz
{struct}_roi2nonroi.mat
{struct}_roi2orig.mat
{struct}_roi.log
{struct}_std2orig.mat
{struct}_to_MNI_lin.mat (MNI_lin_mat)
{struct}_to_MNI_lin.nii.gz (MNI_lin_nii)
{struct}_to_MNI_nonlin_coeff.nii.gz
{struct}_to_MNI_nonlin_field.nii.gz
{struct}_to_MNI_nonlin_jac.nii.gz
{struct}_to_MNI_nonlin.nii.gz (MNI_nonlin_nii)
{struct}_to_MNI_nonlin.txt (MNI_nonlin_txt)
......@@ -7,9 +7,15 @@
"""This module submits jobs to a computing cluster using FSL's fsl_sub command
line tool. It is assumed that the computing cluster is managed by SGE.
.. note:: All of the functionality in this module is deprecated and will be
removed in a future version of fslpy. Equivalent functionality is
available in the `fsl_sub <https://git.fmrib.ox.ac.uk/fsl/fsl_sub>`_
project, and the :mod:`fsl.utils.run` module and
:mod:`.wrappers.fsl_sub` wrapper function.
Example usage, building a short pipeline::
from fsl.utils.fslsub import submit, wait
from fsl.utils.fslsub import submit
# submits bet to veryshort queue unless <mask_filename> already exists
bet_job = submit('bet <input_filename> -m',
......@@ -26,42 +32,79 @@ Example usage, building a short pipeline::
wait_for=(bet_job, other_job),
queue='cuda.q')
# waits for the cuda job to finish
wait(cuda_job)
.. autosummary::
:nosignatures:
submit
info
output
wait
func_to_cmd
hold
"""
from six import BytesIO
from io import BytesIO
import os.path as op
import glob
import time
import pickle
import dill
import sys
import tempfile
import logging
import importlib
from dataclasses import dataclass, asdict
from typing import Optional, Collection, Union, Tuple
from typing import Optional, Collection, Union, Tuple, Dict
import argparse
import warnings
import os
import fsl.utils.deprecated as deprecated
import fsl.utils.run as run
log = logging.getLogger(__name__)
@dataclass
class SubmitParams(object):
"""
Represents the fsl_sub parameters
class SubmitParams:
"""Represents the fsl_sub parameters
The ``SubmitParams`` class is deprecated - you should use
:mod:`fsl.wrappers.fsl_sub` instead, or use the ``fsl_sub`` Python
library, which is installed as part of FSL.
Any command line script can be submitted with these parameters by calling the `SubmitParams` object:
.. code-block:: python
submit = SubmitParams(minutes=1, logdir='log', wait_for=['108023', '108019'])
submit('echo finished')
This will run "echo finished" with a maximum runtime of 1 minute after the jobs with IDs 108023 and 108019 are finished.
It is the equivalent of
.. code-block:: bash
fsl_sub -T 1 -l log -j 108023,108019 "echo finished"
For python scripts that submit themselves to the cluster, it might be useful to give the user some control
over at least some of the submission parameters. This can be done using:
.. code-block:: python
import argparse
parser = argparse.ArgumentParser("my script doing awesome stuff")
parser.add_argument("input_file")
parser.add_argument("output_file")
SubmitParams.add_to_parser(parser, include=('wait_for', 'logdir'))
args = parser.parse_args()
submitter = SubmitParams.from_args(args).update(minutes=10)
from fsl import wrappers
wrappers.bet(input_file, output_file, fslsub=submitter)
This submits a BET job using the -j and -l flags set by the user and a maximum time of 10 minutes.
"""
minutes: Optional[float] = None
queue: Optional[str] = None
......@@ -91,8 +134,11 @@ class SubmitParams(object):
}
def __post_init__(self):
"""
If not set explicitly by the user, don't alter the environment in which the script will be submitted
"""
if self.env is None:
self.env = {}
self.env = dict(os.environ)
def as_flags(self, ):
"""
......@@ -119,6 +165,8 @@ class SubmitParams(object):
def __str__(self):
return 'SubmitParams({})'.format(" ".join(self.as_flags()))
@deprecated.deprecated('3.7.0', '4.0.0',
'Use fsl_sub or fsl.wrappers.fsl_sub instead')
def __call__(self, *command, **kwargs):
"""
Submits the command to the cluster.
......@@ -205,6 +253,9 @@ class SubmitParams(object):
@classmethod
def from_args(cls, args):
"""
Create a SubmitParams from the command line arguments
"""
as_dict = {value: getattr(args, '_sub_' + value, None) for value in cls.cmd_line_flags.values()}
if args._sub_wait_for is not None:
as_dict['wait_for'] = args._sub_wait_for.split(',')
......@@ -214,10 +265,16 @@ class SubmitParams(object):
return cls(verbose=args._sub_verbose, flags=args._sub_flags, **as_dict)
@deprecated.deprecated('3.7.0', '4.0.0',
'Use fsl_sub or fsl.wrappers.fsl_sub instead')
def submit(*command, **kwargs):
"""
Submits a given command to the cluster
The ``submit`` function is deprecated - you should use
:mod:`fsl.wrappers.fsl_sub` instead, or use the ``fsl_sub`` Python
library, which is available in FSL 6.0.5 and newer.
You can pass the command and arguments as a single string, or as a regular or unpacked sequence.
:arg command: string or regular/unpacked sequence of strings with the job command
......@@ -252,31 +309,71 @@ def submit(*command, **kwargs):
return SubmitParams(**kwargs)(*command)
def info(job_id):
@deprecated.deprecated('3.7.0', '4.0.0', 'Use fsl_sub.report instead')
def info(job_ids) -> Dict[str, Optional[Dict[str, str]]]:
"""Gets information on a given job id
Uses `qstat -j <job_id>`
The ``info`` function is deprecated - you should use the
``fsl_sub.report`` function from the ``fsl_sub`` Python library, which
is available in FSL 6.0.5 and newer.
:arg job_id: string with job id
:return: dictionary with information on the submitted job (empty
if job does not exist)
Uses `qstat -j <job_ids>`
:arg job_ids: string with job id or (nested) sequence with jobs
:return: dictionary of jobid -> another dictionary with job information
(or None if job does not exist)
"""
if not hasattr(info, '_ncall'):
info._ncall = 0
info._ncall += 1
if info._ncall == 3:
warnings.warn("Please do not call `fslsub.info` repeatably, because it slows down the cluster. You can avoid this message by simply passing all the job IDs you are interested in to a single `fslsub.info` call.")
from fsl.utils.run import run
job_ids_string = _flatten_job_ids(job_ids)
try:
result = run(['qstat', '-j', job_id], exitcode=True)[0]
result = run(['qstat', '-j', job_ids_string], exitcode=True)[0]
except FileNotFoundError:
log.debug("qstat not found; assuming not on cluster")
return {}
if 'Following jobs do not exist:' in result:
return {}
res = {}
for line in result.splitlines()[1:]:
kv = line.split(':', 1)
if len(kv) == 2:
res[kv[0].strip()] = kv[1].strip()
return _parse_qstat(job_ids_string, result)
def _parse_qstat(job_ids_string, qstat_stdout):
"""
Parses the qstat output into a dictionary of dictionaries
:param job_ids_string: input job ids
:param qstat_stdout: qstat output
:return: dictionary of jobid -> another dictionary with job information
(or None if job does not exist)
"""
res = {job_id: None for job_id in job_ids_string.split(',')}
current_job_id = None
for line in qstat_stdout.splitlines()[1:]:
line = line.strip()
if len(line) == 0:
continue
if line == '=' * len(line):
current_job_id = None
elif ':' in line:
current_key, value = [part.strip() for part in line.split(':', 1)]
if current_key == 'job_number':
current_job_id = value
if current_job_id not in job_ids_string:
raise ValueError(f"Unexpected job ID in qstat output:\n{line}")
res[current_job_id] = {}
else:
if current_job_id is None:
raise ValueError(f"Found job information before job ID in qstat output:\n{line}")
res[current_job_id][current_key] = value
else:
res[current_job_id][current_key] += '\n' + line
return res
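# Hedged illustration of the structure returned by info()/_parse_qstat
# (the job IDs and fields below are made up):
#
#     {'108023': {'job_number': '108023', 'owner': 'someuser', ...},
#      '999999': None}   # job not present in the qstat output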
@deprecated.deprecated('3.13.0', '4.0.0',
'Use fsl.utils.run.job_output instead')
def output(job_id, logdir='.', command=None, name=None):
"""Returns the output of the given job.
......@@ -287,46 +384,7 @@ def output(job_id, logdir='.', command=None, name=None):
:arg name: Job name if it was specified. Not currently used.
:returns: A tuple containing the standard output and standard error.
"""
stdout = list(glob.glob(op.join(logdir, '*.o{}'.format(job_id))))
stderr = list(glob.glob(op.join(logdir, '*.e{}'.format(job_id))))
if len(stdout) != 1 or len(stderr) != 1:
raise ValueError('No/too many error/output files for job {}: stdout: '
'{}, stderr: {}'.format(job_id, stdout, stderr))
stdout = stdout[0]
stderr = stderr[0]
if op.exists(stdout):
with open(stdout, 'rt') as f:
stdout = f.read()
else:
stdout = None
if op.exists(stderr):
with open(stderr, 'rt') as f:
stderr = f.read()
else:
stderr = None
return stdout, stderr
def wait(job_ids):
"""Wait for one or more jobs to finish
:arg job_ids: string or tuple of strings with jobs that should finish
before continuing
"""
start_time = time.time()
for job_id in _flatten_job_ids(job_ids).split(','):
log.debug('Waiting for job {}'.format(job_id))
while len(info(job_id)) > 0:
wait_time = min(max(1, (time.time() - start_time) / 3.), 20)
time.sleep(wait_time)
log.debug('Job {} finished, continuing to next'.format(job_id))
log.debug('All jobs have finished')
return run.job_output(job_id, logdir, command, name)
def _flatten_job_ids(job_ids):
......@@ -351,36 +409,29 @@ def _flatten_job_ids(job_ids):
return ','.join(sorted(unpack(job_ids)))
_external_job = """#!{}
# This is a temporary file designed to run the python function {},
# so that it can be submitted to the cluster
import pickle
from six import BytesIO
from importlib import import_module
pickle_bytes = BytesIO({})
name_type, name, func_name, args, kwargs = pickle.load(pickle_bytes)
if name_type == 'module':
# retrieves a function defined in an external module
func = getattr(import_module(name), func_name)
elif name_type == 'script':
# retrieves a function defined in the __main__ script
local_execute = {{'__name__': '__not_main__', '__file__': name}}
exec(open(name, 'r').read(), local_execute)
func = local_execute[func_name]
else:
raise ValueError('Unknown name_type: %r' % name_type)
res = func(*args, **kwargs)
if res is not None:
with open(__file__ + '_out.pickle', 'w') as f:
pickle.dump(f, res)
"""
@deprecated.deprecated('3.13.0', '4.0.0', 'Use fsl.utils.run.hold instead')
def hold(job_ids, hold_filename=None):
"""
Waits until all jobs have finished
Internally this works by submitting a new job that will only run after all jobs in `job_ids` have finished,
and which then creates a file named `hold_filename`.
This function only returns once `hold_filename` has been created
def func_to_cmd(func, args, kwargs, tmp_dir=None, clean=False):
:param job_ids: possibly nested sequence of job ids. The job ids themselves should be strings.
:param hold_filename: filename to use as a hold file.
The containing directory should exist, but the file itself should not.
Defaults to a ./.<random characters>.hold in the current directory.
:return: only returns when all the jobs have finished
"""
run.hold(job_ids, hold_filename)
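# Hedged usage sketch (the job IDs are illustrative):
#
#     hold(['108023', '108019'])                    # blocks until both jobs finish
#     hold('108023', hold_filename='./.bet.hold')   # use an explicit hold file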
@deprecated.deprecated('3.13.0', '4.0.0',
'Use fsl.utils.run.func_to_cmd or '
'fsl.utils.run.runfunc instead')
def func_to_cmd(func, args=None, kwargs=None, tmp_dir=None, clean="never", verbose=False):
"""Defines the command needed to run the function from the command line
WARNING: if submitting a function defined in the __main__ script,
......@@ -391,25 +442,13 @@ def func_to_cmd(func, args, kwargs, tmp_dir=None, clean=False):
:arg args: positional arguments
:arg kwargs: keyword arguments
:arg tmp_dir: directory where to store the temporary file
:arg clean: if True removes the submitted script after running it
:arg clean: Whether the script should be removed after running. There are three options:
- "never" (default): Script is kept
- "on_success": only remove if script successfully finished (i.e., no error is raised)
- "always": always remove the script, even if it raises an error
:arg verbose: If set to True, the script will print its own filename before running
:return: string which will run the function
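A hedged usage sketch (``add`` is a made-up function; ``submit`` is the
deprecated function defined in this module)::

    def add(a, b):
        return a + b

    cmd   = func_to_cmd(add, args=(1, 2), clean='on_success')
    jobid = submit(cmd, logdir='log', minutes=5)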
"""
pickle_bytes = BytesIO()
if func.__module__ == '__main__':
pickle.dump(('script', importlib.import_module('__main__').__file__, func.__name__,
args, kwargs), pickle_bytes)
else:
pickle.dump(('module', func.__module__, func.__name__,
args, kwargs), pickle_bytes)
python_cmd = _external_job.format(sys.executable,
func.__name__,
pickle_bytes.getvalue())
_, filename = tempfile.mkstemp(prefix=func.__name__ + '_',
suffix='.py',
dir=tmp_dir)
with open(filename, 'w') as python_file:
python_file.write(python_cmd)
return sys.executable + " " + filename + ('; rm ' + filename if clean else '')
return run.func_to_cmd(func, args, kwargs, tmp_dir, clean, verbose)
......@@ -89,7 +89,35 @@ except ImportError: import Queue as queue
log = logging.getLogger(__name__)
class IdleTask(object):
@functools.lru_cache()
def _canHaveGui():
"""Return ``True`` if wxPython is installed, and a display is available,
``False`` otherwise.
"""
# Determine if a display is available. We do
# this only once, on first call (the result is
# cached), because calling the IsDisplayAvailable
# function will cause the application to steal
# focus under OSX!
try:
import wx
return wx.App.IsDisplayAvailable()
except ImportError:
return False
def _haveGui():
"""Return ``True`` if wxPython is installed, a display is available, and
a ``wx.App`` exists, ``False`` otherwise.
"""
try:
import wx
return _canHaveGui() and (wx.GetApp() is not None)
except ImportError:
return False
class IdleTask:
"""Container object used by the :class:`IdleLoop` class.
Used to encapsulate information about a queued task.
"""
......@@ -111,7 +139,7 @@ class IdleTask(object):
self.kwargs = kwargs
class IdleLoop(object):
class IdleLoop:
"""This class contains logic for running tasks via ``wx.EVT_IDLE`` events.
A single ``IdleLoop`` instance is created when this module is first
......@@ -370,8 +398,6 @@ class IdleLoop(object):
``timeout``, or ``alwaysQueue``.
"""
from fsl.utils.platform import platform as fslplatform
schedtime = time.time()
timeout = kwargs.pop('timeout', 0)
after = kwargs.pop('after', 0)
......@@ -380,18 +406,15 @@ class IdleLoop(object):
skipIfQueued = kwargs.pop('skipIfQueued', False)
alwaysQueue = kwargs.pop('alwaysQueue', False)
canHaveGui = fslplatform.canHaveGui
haveGui = fslplatform.haveGui
# If there is no possibility of a
# gui being available in the future
# (determined by canHaveGui), then
# (determined by _canHaveGui), then
# alwaysQueue is ignored.
alwaysQueue = alwaysQueue and canHaveGui
alwaysQueue = alwaysQueue and _canHaveGui()
# We don't have wx - run the task
# directly/synchronously.
if self.__neverQueue or not (haveGui or alwaysQueue):
if self.__neverQueue or not (_haveGui() or alwaysQueue):
time.sleep(after)
log.debug('Running idle task directly')
task(*args, **kwargs)
......@@ -611,11 +634,13 @@ def block(secs, delta=0.01, until=None):
determines when calls to ``block`` will return.
"""
havewx = _haveGui()
def defaultUntil():
return False
def tick():
if fslplatform.haveGui:
if havewx:
import wx
wx.YieldIfNeeded()
time.sleep(delta)
......@@ -623,8 +648,6 @@ def block(secs, delta=0.01, until=None):
if until is None:
until = defaultUntil
from fsl.utils.platform import platform as fslplatform
start = time.time()
while (time.time() - start) < secs:
tick()
......@@ -653,12 +676,11 @@ def run(task, onFinish=None, onError=None, name=None):
the return value will be ``None``.
"""
from fsl.utils.platform import platform as fslplatform
if name is None:
name = getattr(task, '__name__', '<unknown>')
haveWX = fslplatform.haveGui
haveWX = _haveGui()
# Calls the onFinish or onError handler
def callback(cb, *args, **kwargs):
......@@ -727,14 +749,12 @@ def wait(threads, task, *args, **kwargs):
a keyword argument called ``wait_direct``.
"""
from fsl.utils.platform import platform as fslplatform
direct = kwargs.pop('wait_direct', False)
if not isinstance(threads, abc.Sequence):
threads = [threads]
haveWX = fslplatform.haveGui
haveWX = _haveGui()
def joinAll():
log.debug('Wait thread joining on all targets')
......@@ -755,14 +775,15 @@ def wait(threads, task, *args, **kwargs):
return None
class Task(object):
class Task:
"""Container object which encapsulates a task that is run by a
:class:`TaskThread`.
"""
def __init__(self, name, func, onFinish, args, kwargs):
def __init__(self, name, func, onFinish, onError, args, kwargs):
self.name = name
self.func = func
self.onFinish = onFinish
self.onError = onError
self.args = args
self.kwargs = kwargs
self.enabled = True
......@@ -774,7 +795,6 @@ class TaskThreadVeto(Exception):
handler (if one has been specified). See the :meth:`TaskThread.enqueue`
method for more details.
"""
pass
class TaskThread(threading.Thread):
......@@ -808,9 +828,16 @@ class TaskThread(threading.Thread):
:arg onFinish: An optional function to be called (via :func:`idle`)
when the task funtion has finished. Must be provided as
a keyword argument. If the ``func`` raises a
:class`TaskThreadVeto` error, this function will not
be called.
a keyword argument, and must itself accept no arguments.
If the ``func`` raises a :class:`TaskThreadVeto` error,
this function will not be called.
:arg onError: An optional function to be called (via :func:`idle`)
if the task function raises an ``Exception``. Must be
provided as a keyword argument, and must itself accept
the raised ``Exception`` object as a single argument.
If the ``func`` raises a :class:`TaskThreadVeto` error,
this function will not be called.
All other arguments are passed through to the task function when it is
executed.
......@@ -821,16 +848,18 @@ class TaskThread(threading.Thread):
results.
.. warning:: Make sure that your task function is not expecting keyword
arguments called ``taskName`` or ``onFinish``!
arguments called ``taskName``, ``onFinish``, or
``onError``!
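A hedged usage sketch (``loadData`` is a hypothetical task function)::

    tt = TaskThread()
    tt.start()
    tt.enqueue(loadData, 'data.nii.gz',
               taskName='load',
               onFinish=lambda:  print('Load finished'),
               onError=lambda e: print('Load failed:', e))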
"""
name = kwargs.pop('taskName', None)
onFinish = kwargs.pop('onFinish', None)
onError = kwargs.pop('onError', None)
log.debug('Enqueueing task: {} [{}]'.format(
name, getattr(func, '__name__', '<unknown>')))
t = Task(name, func, onFinish, args, kwargs)
t = Task(name, func, onFinish, onError, args, kwargs)
self.__enqueued[name] = t
self.__q.put(t)
......@@ -951,6 +980,9 @@ class TaskThread(threading.Thread):
type(e).__name__,
str(e)),
exc_info=True)
if task.onError is not None:
idle(task.onError, e)
finally:
self.__q.task_done()
......@@ -992,7 +1024,7 @@ def mutex(*args, **kwargs):
return MutexFactory(*args, **kwargs)
class MutexFactory(object):
class MutexFactory:
"""The ``MutexFactory`` is a placeholder for methods which have been
decorated with the :func:`mutex` decorator. When the method of a class
is decorated with ``@mutex``, a ``MutexFactory`` is created.
......
......@@ -190,7 +190,7 @@ def resample(image,
if origin not in ('centre', 'corner'):
raise ValueError('Invalid value for origin: {}'.format(origin))
data = np.array(image[sliceobj], dtype=dtype, copy=False)
data = np.asarray(image[sliceobj], dtype=dtype)
if len(data.shape) != len(newShape):
raise ValueError('Data dimensions do not match new shape: '
......@@ -209,7 +209,7 @@ def resample(image,
np.all(np.isclose(matrix, np.eye(len(newShape) + 1))):
return data, image.voxToWorldMat
newShape = np.array(np.round(newShape), dtype=np.int)
newShape = np.array(np.round(newShape), dtype=int)
# Apply smoothing if requested,
# and if not using nn interp
......
......@@ -18,6 +18,7 @@ import os
import os.path as op
import shutil
import numpy as np
import nibabel as nib
import fsl.utils.path as fslpath
......@@ -44,12 +45,13 @@ def imcp(src,
already exist. Defaults to ``False``.
:arg useDefaultExt: Defaults to ``False``. If ``True``, the destination
file type will be set according to the default
extension, specified by
:func:`~fsl.data.image.defaultExt`. If the source
file does not have the same type as the default
file type will be set according to the default file
type, specified by
:func:`~fsl.data.image.defaultOutputType`. If the
source file does not have the same type as the default
extension, it will be converted. If ``False``, the
source file type is not changed.
source file type is used, and the destination file type
(if specified) is ignored.
:arg move: If ``True``, the files are moved, instead of being
copied. See :func:`immv`.
......@@ -57,7 +59,7 @@ def imcp(src,
# special case - non-existent directory
if dest.endswith('/') and not op.isdir(dest):
raise fslpath.PathError('Directory does not exist: {}'.format(dest))
raise fslpath.PathError(f'Directory does not exist: {dest}')
if op.isdir(dest):
dest = op.join(dest, op.basename(src))
......@@ -87,17 +89,22 @@ def imcp(src,
if not op.exists(src):
raise fslpath.PathError('imcp error - source path '
'does not exist: {}'.format(src))
# Figure out the destination file
# extension/type. If useDefaultExt
# is True, we use the default
# extension. Otherwise, if no
# destination file extension is
# provided, we use the source
# extension.
if useDefaultExt: destExt = fslimage.defaultExt()
elif destExt == '': destExt = srcExt
f'does not exist: {src}')
# Infer the image type of the source image. We
# can't just look at the extension, as e.g. an
# .img file can be any of ANALYZE/NIFTI1/NIFTI2
srcType = fslimage.fileType(src)
# Figure out the destination file extension/type.
# If useDefaultExt is True, we use the default
# extension. Otherwise we use the source type
if useDefaultExt:
destType = fslimage.defaultOutputType()
destExt = fslimage.defaultExt()
else:
destType = srcType
destExt = srcExt
# Resolve any file group differences
# e.g. we don't care if the src is
......@@ -116,10 +123,10 @@ def imcp(src,
# Give up if we don't have permission.
if not os.access(op.dirname(dest), os.W_OK | os.X_OK):
raise fslpath.PathError('imcp error - cannot write to {}'.format(dest))
raise fslpath.PathError(f'imcp error - cannot write to {dest}')
if move and not os.access(op.dirname(src), os.W_OK | os.X_OK):
raise fslpath.PathError('imcp error - cannot move from {}'.format(src))
raise fslpath.PathError(f'imcp error - cannot move from {src}')
# If the source file type does not
# match the destination file type,
......@@ -129,14 +136,48 @@ def imcp(src,
# io and cpu, but programmatically
# very easy - nibabel does all the
# hard work.
if srcExt != destExt:
if srcType != destType:
if not overwrite and op.exists(dest):
raise fslpath.PathError('imcp error - destination already '
'exists ({})'.format(dest))
raise fslpath.PathError('imcp error - destination '
f'already exists ({dest})')
img = nib.load(src)
# Force conversion to specific file format if
# necessary. The file format (pair, gzipped
# or not) is taken care of automatically by
# nibabel
if 'ANALYZE' in destType.name: cls = nib.AnalyzeImage
elif 'NIFTI2' in destType.name: cls = nib.Nifti2Image
elif 'NIFTI' in destType.name: cls = nib.Nifti1Image
# The default behaviour of nibabel when saving
# is to rescale the image data to the full range
# of the data type, and then set the scl_slope/
# inter header fields accordingly. This is highly
# disruptive in many circumstances. Fortunately:
# - The nibabel ArrayProxy class provides a
# get_unscaled method, which allows us to
# bypass the rescaling at load time.
# - Explicitly setting the slope and intercept
# on the header allows us to bypass rescaling
# at save time.
#
# https://github.com/nipy/nibabel/issues/1001
# https://neurostars.org/t/preserve-datatype-and-precision-with-nibabel/27641/2
slope = img.dataobj.slope
inter = img.dataobj.inter
data = np.asanyarray(img.dataobj.get_unscaled(),
dtype=img.get_data_dtype())
img = cls(data, None, header=img.header)
img.header.set_slope_inter(slope, inter)
nib.save(img, dest)
# Make sure the image reference is cleared, and
# hopefully GC'd, as otherwise we sometimes get
# errors on Windows (mostly in unit tests) w.r.t.
# attempts to delete files which are still open
img = None
if move:
......@@ -193,7 +234,7 @@ def imcp(src,
# paths already exist
if not overwrite and any([op.exists(d) for d in copyDests]):
raise fslpath.PathError('imcp error - a destination path already '
'exists ({})'.format(', '.join(copyDests)))
f'exists ({",".join(copyDests)})')
# Do the copy/move
for src, dest in zip(copySrcs, copyDests):
......
......@@ -21,7 +21,6 @@ a function:
import logging
import hashlib
import functools
import six
log = logging.getLogger(__name__)
......@@ -171,7 +170,7 @@ def memoizeMD5(func):
# compatible) bytes, and take
# the hash of those bytes.
for arg in args:
if not isinstance(arg, six.string_types):
if not isinstance(arg, str):
arg = str(arg)
arg = arg.encode('utf-8')
hashobj.update(arg)
......@@ -243,8 +242,8 @@ def skipUnchanged(func):
isarray = oldIsArray or newIsArray
if isarray:
a = np.array(oldVal, copy=False)
b = np.array(value, copy=False)
a = np.asarray(oldVal)
b = np.asarray(value)
nochange = (a.shape == b.shape) and np.allclose(a, b)
else:
......
......@@ -7,12 +7,9 @@
"""This module provides the :class:`Meta` class. """
import collections
class Meta(object):
"""The ``Meta`` class is intended to be used as a mixin for other classes. It
is simply a wrapper for a dictionary of key-value pairs.
class Meta:
"""The ``Meta`` class is intended to be used as a mixin for other classes.
It is simply a wrapper for a dictionary of key-value pairs.
It has a handful of methods allowing you to add and access additional
metadata associated with an object.
......@@ -20,6 +17,7 @@ class Meta(object):
.. autosummary::
:nosignatures:
meta
metaKeys
metaValues
metaItems
......@@ -32,11 +30,17 @@ class Meta(object):
"""Initialises a ``Meta`` instance. """
new = super(Meta, cls).__new__(cls)
new.__meta = collections.OrderedDict()
new.__meta = {}
return new
@property
def meta(self):
"""Return a reference to the metadata dictionary. """
return self.__meta
def metaKeys(self):
"""Returns the keys contained in the metadata dictionary
(``dict.keys``).
......
......@@ -14,9 +14,6 @@ import inspect
import contextlib
import collections
import six
import fsl.utils.idle as idle
import fsl.utils.weakfuncref as weakfuncref
......@@ -36,7 +33,7 @@ class Registered(Exception):
pass
class _Listener(object):
class _Listener:
"""This class is used internally by the :class:`.Notifier` class to
store references to callback functions.
"""
......@@ -61,22 +58,48 @@ class _Listener(object):
return self.__callback.function()
@property
def expectsArguments(self):
"""Returns ``True`` if the listener function needs to be passed
arguments, ``False`` otherwise. Listener functions can
be defined to accept either zero arguments, or a set of
positional arguments - see :meth:`Notifier.register` for details.
"""
func = self.callback
# the function may have been GC'd
if func is None:
return False
spec = inspect.signature(func)
posargs = 0
varargs = False
for param in spec.parameters.values():
if param.kind in (inspect.Parameter.POSITIONAL_ONLY,
inspect.Parameter.POSITIONAL_OR_KEYWORD):
posargs += 1
elif param.kind == inspect.Parameter.VAR_POSITIONAL:
varargs = True
return varargs or ((not varargs) and (posargs == 3))
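# Hedged illustration - both of the listener signatures below are accepted
# by Notifier.register (expectsArguments returns False and True respectively):
#
#     def listener(): ...
#     def listener(notifier, topic, value): ...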
def __str__(self):
cb = self.callback
if cb is not None: cbName = getattr(cb, '__name__', '<callable>')
else: cbName = '<deleted>'
if cb is not None: name = getattr(cb, '__name__', '<callable>')
else: name = '<deleted>'
return 'Listener {} [topic: {}] [function: {}]'.format(
self.name, self.topic, cbName)
return f'Listener {self.name} [topic: {self.topic}] [function: {name}]'
def __repr__(self):
return self.__str__()
class Notifier(object):
class Notifier:
"""The ``Notifier`` class is a mixin which provides simple notification
capability. Listeners can be registered/deregistered to listen via the
:meth:`register` and :meth:`deregister` methods, and notified via the
......@@ -119,8 +142,8 @@ class Notifier(object):
:arg name: A unique name for the listener.
:arg callback: The function to call - must accept two positional
arguments:
:arg callback: The function to call - must accept either zero
arguments, or three positional arguments:
- this ``Notifier`` instance.
......@@ -147,12 +170,12 @@ class Notifier(object):
listener = _Listener(name, callback, topic, runOnIdle)
if name in self.__listeners[topic]:
raise Registered('Listener {} is already registered'.format(name))
raise Registered(f'Listener {name} is already registered')
self.__listeners[topic][name] = listener
self.__enabled[ topic] = self.__enabled.get(topic, True)
log.debug('{}: Registered {}'.format(type(self).__name__, listener))
log.debug('%s: Registered %s', type(self).__name__, listener)
def deregister(self, name, topic=None):
......@@ -186,8 +209,8 @@ class Notifier(object):
self.__listeners.pop(topic)
self.__enabled .pop(topic)
log.debug('{}: De-registered listener {}'.format(
type(self).__name__, listener))
log.debug('%s: De-registered listener %s',
type(self).__name__, listener)
def enable(self, name, topic=None, enable=True):
......@@ -297,7 +320,7 @@ class Notifier(object):
:arg topic: Topic or topics that the listener is registered on.
"""
if topic is None or isinstance(topic, six.string_types):
if topic is None or isinstance(topic, str):
topic = [topic]
topics = topic
......@@ -347,12 +370,12 @@ class Notifier(object):
srcMod = '...{}'.format(frame[1][-20:])
srcLine = frame[2]
log.debug('{}: Notifying {} listeners (topic: {}) [{}:{}]'.format(
log.debug('%s: Notifying %s listeners (topic: %s) [%s:%s]',
type(self).__name__,
len(listeners),
topic,
srcMod,
srcLine))
srcLine)
for listener in listeners:
......@@ -363,15 +386,19 @@ class Notifier(object):
# callback function may have been
# gc'd - remove it if this is the case.
if callback is None:
log.debug('Listener {} has been gc\'d - '
'removing from list'.format(name))
log.debug('Listener %s has been gc\'d - '
'removing from list', name)
self.__listeners[listener.topic].pop(name)
continue
elif not listener.enabled:
if not listener.enabled:
continue
elif listener.runOnIdle: idle.idle(callback, self, topic, value)
else: callback( self, topic, value)
if listener.expectsArguments: args = (self, topic, value)
else: args = ()
if listener.runOnIdle: idle.idle(callback, *args)
else: callback( *args)
def __getListeners(self, topic):
......
......@@ -32,16 +32,19 @@ import os.path as op
import os
import glob
import operator
import pathlib
import re
from fsl.utils.platform import platform
from typing import Sequence, Tuple, Union
PathLike = Union[str, pathlib.Path]
class PathError(Exception):
"""``Exception`` class raised by the functions defined in this module
when something goes wrong.
"""
pass
def deepest(path, suffixes):
......@@ -52,12 +55,12 @@ def deepest(path, suffixes):
path = path.strip()
if path == op.sep or path == '':
if path in (op.sep, ''):
return None
path = path.rstrip(op.sep)
if any([path.endswith(s) for s in suffixes]):
if any(path.endswith(s) for s in suffixes):
return path
return deepest(op.dirname(path), suffixes)
......@@ -81,7 +84,7 @@ def shallowest(path, suffixes):
if parent is not None:
return parent
if any([path.endswith(s) for s in suffixes]):
if any(path.endswith(s) for s in suffixes):
return path
return None
......@@ -101,19 +104,23 @@ def allFiles(root):
return files
def hasExt(path, allowedExts):
def hasExt(path : PathLike,
allowedExts : Sequence[str]) -> bool:
"""Convenience function which returns ``True`` if the given ``path``
ends with any of the given ``allowedExts``, ``False`` otherwise.
"""
return any([path.endswith(e) for e in allowedExts])
def addExt(prefix,
allowedExts=None,
mustExist=True,
defaultExt=None,
fileGroups=None,
unambiguous=True):
path = str(path)
return any(path.endswith(e) for e in allowedExts)
def addExt(
prefix : PathLike,
allowedExts : Sequence[str] = None,
mustExist : bool = True,
defaultExt : str = None,
fileGroups : Sequence[Sequence[str]] = None,
unambiguous : bool = True
) -> Union[Sequence[str], str]:
"""Adds a file extension to the given file ``prefix``.
If ``mustExist`` is False, and the file does not already have a
......@@ -148,6 +155,8 @@ def addExt(prefix,
containing *all* matching files is returned.
"""
prefix = str(prefix)
if allowedExts is None: allowedExts = []
if fileGroups is None: fileGroups = {}
......@@ -189,7 +198,8 @@ def addExt(prefix,
# If ambiguity is ok, return
# all matching paths
elif not unambiguous:
if not unambiguous:
return allPaths
# Ambiguity is not ok! More than
......@@ -223,19 +233,29 @@ def addExt(prefix,
return allPaths[0]
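# Hedged usage sketch (behaviour as documented above; the extensions are
# illustrative):
#
#     addExt('image', allowedExts=['.nii', '.nii.gz'],
#            mustExist=False, defaultExt='.nii.gz')   # -> 'image.nii.gz'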
def removeExt(filename, allowedExts=None, firstDot=False):
def removeExt(
filename : PathLike,
allowedExts : Sequence[str] = None,
firstDot : bool = False
) -> str:
"""Returns the base name of the given file name. See :func:`splitExt`. """
return splitExt(filename, allowedExts, firstDot)[0]
def getExt(filename, allowedExts=None, firstDot=False):
def getExt(
filename : PathLike,
allowedExts : Sequence[str] = None,
firstDot : bool = False
) -> str:
"""Returns the extension of the given file name. See :func:`splitExt`. """
return splitExt(filename, allowedExts, firstDot)[1]
def splitExt(filename, allowedExts=None, firstDot=False):
def splitExt(
filename : PathLike,
allowedExts : Sequence[str] = None,
firstDot : bool = False
) -> Tuple[str, str]:
"""Returns the base name and the extension from the given file name.
If ``allowedExts`` is ``None`` and ``firstDot`` is ``False``, this
......@@ -262,6 +282,8 @@ def splitExt(filename, allowedExts=None, firstDot=False):
last period. Ignored if ``allowedExts`` is specified.
"""
filename = str(filename)
# If allowedExts is not specified
# we split on a period character
if allowedExts is None:
......@@ -465,7 +487,7 @@ def removeDuplicates(paths, allowedExts=None, fileGroups=None):
groupFiles = getFileGroup(path, allowedExts, fileGroups)
if not any([p in unique for p in groupFiles]):
if not any(p in unique for p in groupFiles):
unique.append(groupFiles[0])
return unique
......@@ -492,14 +514,13 @@ def uniquePrefix(path):
break
# Should never happen if path is valid
elif len(hits) == 0 or idx >= len(filename) - 1:
if len(hits) == 0 or idx >= len(filename) - 1:
raise PathError('No unique prefix for {}'.format(filename))
# Not unique - continue looping
else:
idx += 1
prefix = prefix + filename[idx]
hits = [h for h in hits if h.startswith(prefix)]
idx += 1
prefix = prefix + filename[idx]
hits = [h for h in hits if h.startswith(prefix)]
return prefix
......@@ -525,54 +546,57 @@ def commonBase(paths):
last = base
if all([p.startswith(base) for p in paths]):
if all(p.startswith(base) for p in paths):
return base
raise PathError('No common base')
def wslpath(winpath):
"""
Convert Windows path (or a command line argument containing a Windows path)
to the equivalent WSL path (e.g. ``c:\\Users`` -> ``/mnt/c/Users``). Also supports
paths in the form ``\\wsl$\\(distro)\\users\\...``
:param winpath: Command line argument which may (or may not) contain a Windows path. It is assumed to be
either of the form <windows path> or --<arg>=<windows path>. Note that we don't need to
handle --arg <windows path> or -a <windows path> since in these cases the argument
and the path will be parsed as separate entities.
:return: If ``winpath`` matches a Windows path, the converted argument (including the --<arg>= portion).
Otherwise returns ``winpath`` unchanged.
def wslpath(path):
"""Convert Windows path (or a command line argument containing a Windows
path) to the equivalent WSL path (e.g. ``c:\\Users`` -> ``/mnt/c/Users``).
Also supports paths in the form ``\\wsl$\\(distro)\\users\\...``
:param winpath: Command line argument which may (or may not) contain a
Windows path. It is assumed to be either of the form
<windows path> or --<arg>=<windows path>. Note that we
don't need to handle --arg <windows path> or -a <windows
path> since in these cases the argument and the path will
be parsed as separate entities.
:return: If ``winpath`` matches a Windows path, the converted
argument (including the --<arg>= portion). Otherwise
returns ``winpath`` unchanged.
"""
match = re.match(r"^(--[\w-]+=)?\\\\wsl\$[\\\/][^\\^\/]+(.*)$", winpath)
match = re.match(r"^(--[\w-]+=)?\\\\wsl\$[\\\/][^\\^\/]+(.*)$", path)
if match:
arg, path = match.group(1, 2)
if arg is None:
arg = ""
return arg + path.replace("\\", "/")
match = re.match(r"^(--[\w-]+=)?([a-zA-z]):(.+)$", winpath)
match = re.match(r"^(--[\w-]+=)?([a-zA-z]):(.+)$", path)
if match:
arg, drive, path = match.group(1, 2, 3)
if arg is None:
arg = ""
return arg + "/mnt/" + drive.lower() + path.replace("\\", "/")
return winpath
return path
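# Hedged examples of the conversion described above:
#
#     wslpath(r'c:\Users\data.nii')              # -> '/mnt/c/Users/data.nii'
#     wslpath(r'--in=\\wsl$\Ubuntu\home\x.nii')  # -> '--in=/home/x.nii'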
def winpath(wslpath):
"""
Convert a WSL-local filepath (for example ``/usr/local/fsl/``) into a path that can be used from
Windows.
def winpath(path):
"""Convert a WSL-local filepath (for example ``/usr/local/fsl/``) into a
path that can be used from Windows.
If ``self.fslwsl`` is ``False``, simply returns ``wslpath`` unmodified
Otherwise, uses ``FSLDIR`` to deduce the WSL distro in use for FSL.
This requires WSL2 which supports the ``\\wsl$\`` network path.
This requires WSL2 which supports the ``\\wsl$\\`` network path.
wslpath is assumed to be an absolute path.
"""
from fsl.utils.platform import platform # pylint: disable=import-outside-toplevel # noqa: E501
if not platform.fslwsl:
return wslpath
return path
else:
match = re.match(r"^\\\\wsl\$\\([^\\]+).*$", platform.fsldir)
if match:
......@@ -581,6 +605,7 @@ def winpath(wslpath):
distro = None
if not distro:
raise RuntimeError("Could not identify WSL installation from FSLDIR (%s)" % platform.fsldir)
raise RuntimeError('Could not identify WSL installation from '
'FSLDIR (%s)' % platform.fsldir)
return "\\\\wsl$\\" + distro + wslpath.replace("/", "\\")
return "\\\\wsl$\\" + distro + path.replace("/", "\\")
......@@ -18,7 +18,8 @@ import os.path as op
import sys
import importlib
import fsl.utils.notifier as notifier
import fsl.utils.notifier as notifier
import fsl.utils.deprecated as deprecated
# An annoying consequence of using
# a system-module name for our own
......@@ -87,6 +88,7 @@ class Platform(notifier.Notifier):
frozen
fsldir
fsldevdir
fslVersion
haveGui
canHaveGui
inSSHSession
......@@ -110,26 +112,15 @@ class Platform(notifier.Notifier):
self.WX_MAC_CARBON = WX_MAC_CARBON
self.WX_GTK = WX_GTK
self.__inSSHSession = False
self.__inVNCSession = False
# initialise fsldir - see fsldir.setter
self.fsldir = self.fsldir
# These are all initialised on first access
self.__glVersion = None
self.__glRenderer = None
self.__glIsSoftware = None
self.__fslVersion = None
# initialise fsldir - see fsldir.setter
self.fsldir = self.fsldir
# Determine if a display is available. We do
# this once at init (instead of on-demand in
# the canHaveGui method) because calling the
# IsDisplayAvailable function will cause the
# application to steal focus under OSX!
try:
import wx
self.__canHaveGui = wx.App.IsDisplayAvailable()
except ImportError:
self.__canHaveGui = False
self.__canHaveGui = None
# If one of the SSH_/VNC environment
# variables is set, then we're probably
......@@ -150,6 +141,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def frozen(self):
"""``True`` if we are running in a compiled/frozen application,
``False`` otherwise.
......@@ -158,6 +153,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def haveGui(self):
"""``True`` if we are running with a GUI, ``False`` otherwise.
......@@ -168,7 +167,7 @@ class Platform(notifier.Notifier):
the event loop is called periodically, and so is not always running.
"""
try:
import wx
import wx # pylint: disable=import-outside-toplevel
app = wx.GetApp()
# TODO Previously this conditional
......@@ -201,12 +200,31 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def canHaveGui(self):
"""``True`` if it is possible to create a GUI, ``False`` otherwise. """
# Determine if a display is available. Note that
# calling the IsDisplayAvailable function will
# cause the application to steal focus under OSX!
if self.__canHaveGui is None:
try:
import wx # pylint: disable=import-outside-toplevel
self.__canHaveGui = wx.App.IsDisplayAvailable()
except ImportError:
self.__canHaveGui = False
return self.__canHaveGui
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def inSSHSession(self):
"""``True`` if this application is running over an SSH session,
``False`` otherwise.
......@@ -215,6 +233,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def inVNCSession(self):
"""``True`` if this application is running over a VNC (or similar)
session, ``False`` otherwise. Currently, the following remote desktop
......@@ -228,6 +250,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def wxPlatform(self):
"""One of :data:`WX_UNKNOWN`, :data:`WX_MAC_COCOA`,
:data:`WX_MAC_CARBON`, or :data:`WX_GTK`, indicating the wx platform.
......@@ -236,14 +262,14 @@ class Platform(notifier.Notifier):
if not self.canHaveGui:
return WX_UNKNOWN
import wx
import wx # pylint: disable=import-outside-toplevel
pi = [t.lower() for t in wx.PlatformInfo]
if any(['cocoa' in p for p in pi]): plat = WX_MAC_COCOA
elif any(['carbon' in p for p in pi]): plat = WX_MAC_CARBON
elif any(['gtk' in p for p in pi]): plat = WX_GTK
else: plat = WX_UNKNOWN
if any('cocoa' in p for p in pi): plat = WX_MAC_COCOA
elif any('carbon' in p for p in pi): plat = WX_MAC_CARBON
elif any('gtk' in p for p in pi): plat = WX_GTK
else: plat = WX_UNKNOWN
if plat is WX_UNKNOWN:
log.warning('Could not determine wx platform from '
......@@ -253,6 +279,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def wxFlavour(self):
"""One of :data:`WX_UNKNOWN`, :data:`WX_PYTHON` or :data:`WX_PHOENIX`,
indicating the wx flavour.
......@@ -261,7 +291,7 @@ class Platform(notifier.Notifier):
if not self.canHaveGui:
return WX_UNKNOWN
import wx
import wx # pylint: disable=import-outside-toplevel
pi = [t.lower() for t in wx.PlatformInfo]
isPhoenix = False
......@@ -292,9 +322,20 @@ class Platform(notifier.Notifier):
return os.environ.get('FSLDEVDIR', None)
@property
def wsl(self):
"""Boolean flag indicating whether we are running under Windows
Subsystem for Linux.
"""
plat = builtin_platform.platform().lower()
return 'microsoft' in plat
@property
def fslwsl(self):
"""Boolean flag indicating whether FSL is installed in Windows Subsystem for Linux """
"""Boolean flag indicating whether FSL is installed in Windows
Subsystem for Linux
"""
return self.fsldir is not None and self.fsldir.startswith("\\\\wsl$")
......@@ -314,18 +355,13 @@ class Platform(notifier.Notifier):
if value is None:
os.environ.pop('FSLDIR', None)
else:
os.environ['FSLDIR'] = value
# Set the FSL version field if we can
versionFile = op.join(value, 'etc', 'fslversion')
if op.exists(versionFile):
with open(versionFile, 'rt') as f:
# split string at colon for new hash style versions
# first object in list is the non-hashed version string (e.g. 6.0.2)
# if no ":hash:" then standard FSL version string is still returned
self.__fslVersion = f.read().strip().split(":")[0]
# clear fslversion - it will
# be re-read on next access
self.__fslVersion = None
self.notify(value=value)
......@@ -355,10 +391,31 @@ class Platform(notifier.Notifier):
"""Returns the FSL version as a string, e.g. ``'5.0.9'``. Returns
``None`` if a FSL installation could not be found.
"""
if self.__fslVersion is not None:
return self.__fslVersion
if self.fsldir is None:
return None
# Set the FSL version field if we can
versionFile = op.join(self.fsldir, 'etc', 'fslversion')
if op.exists(versionFile):
with open(versionFile, 'rt') as f:
# split string at colon for new hash style versions
# first object in list is the non-hashed version string
# (e.g. 6.0.2) if no ":hash:" then standard FSL version
# string is still returned
self.__fslVersion = f.read().strip().split(":")[0]
return self.__fslVersion
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def glVersion(self):
"""Returns the available OpenGL version, or ``None`` if it has not
been set.
......@@ -367,12 +424,20 @@ class Platform(notifier.Notifier):
@glVersion.setter
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def glVersion(self, value):
"""Set the available OpenGL version. """
self.__glVersion = value
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def glRenderer(self):
"""Returns the available OpenGL renderer, or ``None`` if it has not
been set.
......@@ -381,6 +446,10 @@ class Platform(notifier.Notifier):
@glRenderer.setter
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def glRenderer(self, value):
"""Set the available OpenGL renderer. """
self.__glRenderer = value
......@@ -398,6 +467,10 @@ class Platform(notifier.Notifier):
@property
@deprecated.deprecated(
'3.6.0',
'4.0.0',
'Equivalent functionality is available in fsleyes-widgets.')
def glIsSoftwareRenderer(self):
"""Returns ``True`` if the OpenGL renderer is software based,
``False`` otherwise, or ``None`` if the renderer has not yet been set.
......
......@@ -15,28 +15,36 @@
run
runfsl
wait
runfunc
func_to_cmd
dryrun
hold
job_output
"""
import io
import sys
import glob
import time
import shlex
import logging
import tempfile
import threading
import contextlib
import collections.abc as abc
import subprocess as sp
import os.path as op
import os
import textwrap as tw
import six
import dill
from fsl.utils.platform import platform as fslplatform
import fsl.utils.fslsub as fslsub
import fsl.utils.tempdir as tempdir
import fsl.utils.path as fslpath
log = logging.getLogger(__name__)
......@@ -45,6 +53,12 @@ DRY_RUN = False
execute them.
"""
DRY_RUN_COMMANDS = None
"""Contains the commands that got logged during a dry run.
Commands will be logged if :data:`DRY_RUN` is true, which can be set using :func:`dryrun`.
"""
FSL_PREFIX = None
"""Global override for the FSL executable location used by :func:`runfsl`. """
......@@ -54,20 +68,25 @@ class FSLNotPresent(Exception):
"""Error raised by the :func:`runfsl` function when ``$FSLDIR`` cannot
be found.
"""
pass
@contextlib.contextmanager
def dryrun(*args):
def dryrun(*_):
"""Context manager which causes all calls to :func:`run` to be logged but
not executed. See the :data:`DRY_RUN` flag.
The returned standard output will be equal to ``' '.join(args)``.
After this function returns, each command that was executed while the
dryrun is active, along with any submission parameters, will be accessible
within a list which is stored as :data:`DRY_RUN_COMMANDS`.
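For example (the command is illustrative)::

    with dryrun():
        run('bet struct struct_brain')

    # each entry is an (args, submit) tuple
    print(DRY_RUN_COMMANDS)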
"""
global DRY_RUN
global DRY_RUN, DRY_RUN_COMMANDS # pylint: disable=global-statement
oldval = DRY_RUN
DRY_RUN = True
DRY_RUN_COMMANDS = []
oldval = DRY_RUN
DRY_RUN = True
try:
yield
......@@ -83,7 +102,7 @@ def prepareArgs(args):
if len(args) == 1:
# Argument was a command string
if isinstance(args[0], six.string_types):
if isinstance(args[0], str):
args = shlex.split(args[0])
# Argument was an unpacked sequence
......@@ -93,7 +112,6 @@ def prepareArgs(args):
return list(args)
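For example (arguments are illustrative), a single command string is tokenised with :func:`shlex.split`, whereas an already-unpacked sequence is simply returned as a list::

    prepareArgs(['ls -l "my file"'])       # ['ls', '-l', 'my file']
    prepareArgs(('ls', '-l', 'my file'))   # ['ls', '-l', 'my file']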
real_stdout = sys.stdout
def _forwardStream(in_, *outs):
"""Creates and starts a daemon thread which forwards the given input stream
to one or more output streams. Used by the :func:`run` function to redirect
......@@ -148,58 +166,69 @@ def run(*args, **kwargs):
:arg submit: Must be passed as a keyword argument. Defaults to ``None``.
If ``True``, the command is submitted as a cluster job via
the :func:`.fslsub.submit` function. May also be a
the :mod:`fsl.wrappers.fsl_sub` function. May also be a
dictionary containing arguments to that function.
:arg log: Must be passed as a keyword argument. An optional ``dict``
which may be used to redirect the command's standard output
and error. The following keys are recognised:
:arg cmdonly: Defaults to ``False``. If ``True``, the command is not
executed, but rather is returned directly, as a list of
arguments.
:arg log: Must be passed as a keyword argument. Defaults to
``{'tee' : True}``. An optional ``dict`` which may be used
to redirect the command's standard output and error. Ignored
if ``submit`` is specified. The following keys are
recognised:
- tee: If ``True``, the command's standard output/error
streams are forwarded to this processes streams.
- tee: If ``True`` (the default), the command's
standard output/error streams are forwarded to
the output streams of this process, in addition
to being captured and returned.
- stdout: Optional file-like object to which the command's
standard output stream can be forwarded.
- stdout: Optional callable or file-like object to which
the command's standard output stream can be
forwarded.
- stderr: Optional file-like object to which the command's
standard error stream can be forwarded.
- stderr: Optional callable or file-like object to which
the command's standard error stream can be
forwarded.
- cmd: Optional file-like object to which the command
itself is logged.
- cmd: Optional callable or file-like object to which
the command itself is logged.
:arg silent: Suppress standard output/error. Equivalent to passing
``log={'tee' : False}``. Ignored if ``log`` is also passed.
All other keyword arguments are passed through to the ``subprocess.Popen``
object (via :func:`_realrun`), unless ``submit=True``, in which case they
are passed through to the :func:`.fslsub.submit` function.
are passed through to the :func:`.fsl_sub` function.
:returns: If ``submit`` is provided, the return value of
:func:`.fslsub` is returned. Otherwise returns a single
value or a tuple, based on the based on the ``stdout``,
``stderr``, and ``exitcode`` arguments.
:returns: If ``submit`` is provided, the ID of the submitted job is
returned as a string. Otherwise returns a single value or a
tuple, based on the ``stdout``, ``stderr``, and
``exitcode`` arguments.
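For example (an illustrative sketch - command names and submission
options below are made up)::

    # capture standard output (the default)
    stdout = run('bet struct struct_brain')

    # capture standard output, standard error, and the exit code
    stdout, stderr, exitcode = run('bet struct struct_brain',
                                   stderr=True, exitcode=True)

    # submit as a cluster job - the job ID is returned
    jobid = run('bet struct struct_brain', submit={'queue' : 'short.q'})

    # redirect output to a log file
    with open('bet.log', 'wt') as f:
        run('bet struct struct_brain', log={'stdout' : f, 'stderr' : f})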
"""
returnStdout = kwargs.pop('stdout', True)
returnStderr = kwargs.pop('stderr', False)
returnExitcode = kwargs.pop('exitcode', False)
submit = kwargs.pop('submit', {})
log = kwargs.pop('log', None)
cmdonly = kwargs.pop('cmdonly', False)
logg = kwargs.pop('log', None)
silent = kwargs.pop('silent', False)
args = prepareArgs(args)
if log is None:
log = {}
if logg is None:
logg = {'tee' : not silent}
tee = log.get('tee', False)
logStdout = log.get('stdout', None)
logStderr = log.get('stderr', None)
logCmd = log.get('cmd', None)
tee = logg.get('tee', True)
logStdout = logg.get('stdout', None)
logStderr = logg.get('stderr', None)
logCmd = logg.get('cmd', None)
if not bool(submit):
submit = None
if submit is not None:
returnStdout = False
returnStderr = False
returnExitcode = False
if submit is True:
submit = dict()
......@@ -207,13 +236,19 @@ def run(*args, **kwargs):
raise ValueError('submit must be a mapping containing '
'options for fsl.utils.fslsub.submit')
if cmdonly:
return args
if DRY_RUN:
return _dryrun(
submit, returnStdout, returnStderr, returnExitcode, *args)
# submit - delegate to fslsub
# submit - delegate to fsl_sub. This will induce a nested
# call back to this run function, which is a bit confusing,
# but harmless, as we've popped the "submit" arg above.
if submit is not None:
return fslsub.submit(' '.join(args), **submit, **kwargs)
from fsl.wrappers import fsl_sub # pylint: disable=import-outside-toplevel # noqa: E501
return fsl_sub(*args, log=logg, **submit, **kwargs)[0].strip()
# Run directly - delegate to _realrun
stdout, stderr, exitcode = _realrun(
......@@ -237,12 +272,18 @@ def _dryrun(submit, returnStdout, returnStderr, returnExitcode, *args):
active.
"""
# Save command/submit parameters -
# see the dryrun ctx manager
if DRY_RUN_COMMANDS is not None:
DRY_RUN_COMMANDS.append((args, submit))
if submit:
return ('0',)
results = []
stderr = ''
stdout = ' '.join(args)
join = getattr(shlex, 'join', ' '.join)
stdout = join(args)
if returnStdout: results.append(stdout)
if returnStderr: results.append(stderr)
......@@ -260,14 +301,14 @@ def _realrun(tee, logStdout, logStderr, logCmd, *args, **kwargs):
streams are forwarded to this process' standard output/
error.
:arg logStdout: Optional file-like object to which the command's standard
output stream can be forwarded.
:arg logStdout: Optional callable or file-like object to which the
command's standard output stream can be forwarded.
:arg logStderr: Optional file-like object to which the command's standard
error stream can be forwarded.
:arg logStderr: Optional callable or file-like object to which the
command's standard error stream can be forwarded.
:arg logCmd: Optional file-like object to which the command itself is
logged.
:arg logCmd: Optional callable or file-like to which the command
itself is logged.
:arg args: Command to run
......@@ -278,6 +319,12 @@ def _realrun(tee, logStdout, logStderr, logCmd, *args, **kwargs):
- the command's standard error as a string.
- the command's exit code.
"""
if fslplatform.fslwsl:
# On Windows this prevents opening of a popup window
startupinfo = sp.STARTUPINFO()
startupinfo.dwFlags |= sp.STARTF_USESHOWWINDOW
kwargs["startupinfo"] = startupinfo
proc = sp.Popen(args, stdout=sp.PIPE, stderr=sp.PIPE, **kwargs)
with tempdir.tempdir(changeto=False) as td:
......@@ -299,14 +346,22 @@ def _realrun(tee, logStdout, logStderr, logCmd, *args, **kwargs):
outstreams.append(sys.stdout)
errstreams.append(sys.stderr)
# And we also duplicate to caller-
# provided streams if they're given.
if logStdout is not None: outstreams.append(logStdout)
if logStderr is not None: errstreams.append(logStderr)
# log the command if requested
if logCmd is not None:
cmd = ' '.join(args) + '\n'
# And we also duplicate to caller-provided
# streams if they are file-likes (if they're
# callables, we call them after the process
# has completed)
if logStdout is not None and not callable(logStdout):
outstreams.append(logStdout)
if logStderr is not None and not callable(logStderr):
errstreams.append(logStderr)
# log the command if requested.
# logCmd can be a callable, or
# can be a file-like.
cmd = ' '.join(args) + '\n'
if callable(logCmd):
logCmd(cmd)
elif logCmd is not None:
if 'b' in getattr(logCmd, 'mode', 'w'):
logCmd.write(cmd.encode('utf-8'))
else:
......@@ -330,6 +385,10 @@ def _realrun(tee, logStdout, logStderr, logCmd, *args, **kwargs):
stdout = stdout.decode('utf-8')
stderr = stderr.decode('utf-8')
# Send stdout/error to logStdout/err callables
if logStdout is not None and callable(logStdout): logStdout(stdout)
if logStderr is not None and callable(logStderr): logStderr(stderr)
return stdout, stderr, exitcode
......@@ -378,26 +437,143 @@ def runfsl(*args, **kwargs):
return run(*args, **kwargs)
def runfunc(func,
args=None,
kwargs=None,
tmp_dir=None,
clean="never",
verbose=False,
**run_kwargs):
"""Run the given python function as a shell command. See
:func:`func_to_cmd` for details on the arguments.
The remaining ``run_kwargs`` arguments are passed through to the
:func:`run` function.
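For example (a sketch - the function and submission options below are
illustrative)::

    def add(a, b):
        return a + b

    # run in a separate python process on the local machine
    runfunc(add, args=(1, 2), clean='always')

    # or submit it as a cluster job via run(..., submit=...)
    jobid = runfunc(add, args=(1, 2), submit={'queue' : 'short.q'})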
"""
cmd = func_to_cmd(func, args, kwargs, tmp_dir, clean, verbose)
return run(cmd, **run_kwargs)
def func_to_cmd(func,
args=None,
kwargs=None,
tmp_dir=None,
clean="never",
verbose=False):
"""Save the given python function to an executable file. Return a string
containing a command that can be used to run the function.
.. warning:: If submitting a function defined in the ``__main__`` script,
the script will be run again to retrieve this function. Make
sure there is a ``if __name__ == '__main__'`` guard to prevent
the full script from being re-run.
:arg func: function to be run
:arg args: positional arguments
:arg kwargs: keyword arguments
:arg tmp_dir: directory where to store the temporary file (default: the
system temporary directory)
:arg clean: Whether the script should be removed after running. There are
three options:
- ``"never"``: (default) Script is kept
- ``"on_success"``: only remove if script successfully
finished (i.e., no error is raised)
- ``"always"``: always remove the script, even if it
raises an error
:arg verbose: If set to True, the script will print its own filename
before running
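For example (a sketch - the function is illustrative)::

    def add(a, b):
        return a + b

    # cmd is the path to a self-contained executable
    # python script which calls add(1, 2)
    cmd = func_to_cmd(add, args=(1, 2), clean='always')

    # it can be run (or submitted) like any other command
    run(cmd)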
"""
script_template = tw.dedent("""
#!{executable}
# This is a temporary file designed to run the python function {funcname},
# so that it can be submitted to the cluster
import os
import dill
from io import BytesIO
from importlib import import_module
if {verbose}:
print('running {filename}')
dill_bytes = BytesIO({dill_bytes})
func, args, kwargs = dill.load(dill_bytes)
clean = {clean}
try:
res = func(*args, **kwargs)
except Exception as e:
if clean == 'on_success':
clean = 'never'
raise e
finally:
if clean in ('on_success', 'always'):
os.remove({filename})
""").strip()
if clean not in ('never', 'always', 'on_success'):
raise ValueError("Clean should be one of 'never', 'always', "
f"or 'on_success', not {clean}")
if args is None: args = ()
if kwargs is None: kwargs = {}
dill_bytes = io.BytesIO()
dill.dump((func, args, kwargs), dill_bytes, recurse=True)
handle, filename = tempfile.mkstemp(prefix=func.__name__ + '_',
suffix='.py',
dir=tmp_dir)
os.close(handle)
python_cmd = script_template.format(
executable=sys.executable,
funcname=func.__name__,
filename=f'"{filename}"',
verbose=verbose,
clean=f'"{clean}"',
dill_bytes=dill_bytes.getvalue())
with open(filename, 'w') as f:
f.write(python_cmd)
os.chmod(filename, 0o755)
return f'{filename}'
def wslcmd(cmdpath, *args):
"""Convert a command + arguments into an equivalent set of arguments that
will run the command under Windows Subsystem for Linux
:param cmdpath: Fully qualified path to the command. This is essentially
a WSL path not a Windows one since FSLDIR is specified
as a WSL path, however it may have backslashes as path
separators due to previous use of ``os.path.join``
:param args: Sequence of command arguments (the first of which is the
unqualified command name)
:return: If ``cmdpath`` exists and is executable in WSL, return a
sequence of command arguments which when executed will run the
command in WSL. Windows paths in the argument list will be
converted to WSL paths. If ``cmdpath`` was not executable in
WSL, returns None
"""
# Check if command exists in WSL (remembering
# that the command path may include FSLDIR
# which is a Windows path)
cmdpath = fslpath.wslpath(cmdpath)
retcode = sp.call(["wsl", "test", "-x", cmdpath])
_stdout, _stderr, retcode = _realrun(
False, None, None, None, "wsl", "test", "-x", cmdpath)
if retcode == 0:
# Form a new argument list and convert any Windows paths in it into WSL paths
# Form a new argument list and convert
# any Windows paths in it into WSL paths
wslargs = [fslpath.wslpath(arg) for arg in args]
wslargs[0] = cmdpath
local_fsldir = fslpath.wslpath(fslplatform.fsldir)
......@@ -405,8 +581,10 @@ def wslcmd(cmdpath, *args):
local_fsldevdir = fslpath.wslpath(fslplatform.fsldevdir)
else:
local_fsldevdir = None
# Prepend important environment variables - note that it seems we cannot
# use WSLENV for this due to its insistance on path mapping. FIXME FSLDEVDIR?
# Prepend important environment variables -
# note that it seems we cannot use WSLENV
# for this due to its insistence on path
# mapping. FIXME FSLDEVDIR?
local_path = "$PATH"
if local_fsldevdir:
local_path += ":%s/bin" % local_fsldevdir
......@@ -425,6 +603,106 @@ def wslcmd(cmdpath, *args):
return None
def wait(job_ids):
"""Proxy for :func:`.fslsub.wait`. """
return fslsub.wait(job_ids)
def hold(job_ids, hold_filename=None, timeout=10):
"""Waits until all specified cluster jobs have finished.
:arg job_ids: Possibly nested sequence of job ids. The job ids
themselves should be strings.
:arg hold_filename: Filename to use as a hold file. The containing
directory should exist, but the file itself should
not. Defaults to ``./.<random characters>.hold`` in
the current directory.
:arg timeout: Number of seconds to sleep between status checks.
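For example (job IDs are illustrative), to block until a set of
previously submitted jobs, possibly nested, has completed::

    hold(['12345', ['12346', '12347']])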
"""
# Returns a potentially nested sequence of
# job ids as a single comma-separated string
def _flatten_job_ids(job_ids):
def unpack(job_ids):
if isinstance(job_ids, str):
return {job_ids}
elif isinstance(job_ids, int):
return {str(job_ids)}
else:
res = set()
for job_id in job_ids:
res.update(unpack(job_id))
return res
return ','.join(sorted(unpack(job_ids)))
if hold_filename is not None:
if op.exists(hold_filename):
raise IOError(f"Hold file ({hold_filename}) already exists")
elif not op.isdir(op.split(op.abspath(hold_filename))[0]):
raise IOError(f"Hold file ({hold_filename}) can not be created "
"in non-existent directory")
# Generate a random file name to use as
# the hold file. Reduce likelihood of
# naming collision by storing file in
# cwd.
if hold_filename is None:
handle, hold_filename = tempfile.mkstemp(prefix='.',
suffix='.hold',
dir='.')
        os.close(handle)
        os.remove(hold_filename)
submit = {
'jobhold' : _flatten_job_ids(job_ids),
'jobtime' : 1,
'name' : '.hold',
}
run(f'touch {hold_filename}', submit=submit, silent=True)
while not op.exists(hold_filename):
time.sleep(timeout)
# remove the hold file and the
# fsl_sub job stdout/err files
os.remove(hold_filename)
for outfile in glob.glob('.hold.[o,e]*'):
os.remove(outfile)
def job_output(job_id, logdir='.', command=None, name=None):
"""Returns the output of the given cluster-submitted job.
On SGE cluster systems, the standard output and error streams of a
submitted job are saved to files named ``<job_name>.o<job_id>`` and ``<job_name>.e<job_id>``.
This function simply reads those files and returns their content.
:arg job_id: String containing job ID.
:arg logdir: Directory containing the log - defaults to
the current directory.
:arg command: Command that was run. Not currently used.
:arg name: Job name if it was specified. Not currently used.
:returns: A tuple containing the standard output and standard error.
"""
stdout = list(glob.glob(op.join(logdir, f'*.o{job_id}')))
stderr = list(glob.glob(op.join(logdir, f'*.e{job_id}')))
if len(stdout) != 1 or len(stderr) != 1:
raise ValueError('No/too many error/output files for job '
f'{job_id}: stdout: {stdout}, stderr: {stderr}')
stdout = stdout[0]
stderr = stderr[0]
if op.exists(stdout):
with open(stdout, 'rt') as f:
stdout = f.read()
else:
stdout = None
if op.exists(stderr):
with open(stderr, 'rt') as f:
stderr = f.read()
else:
stderr = None
return stdout, stderr
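As an illustrative sketch (the job ID and log directory are hypothetical), the output of a previously submitted job could be retrieved like so::

    stdout, stderr = job_output('12345', logdir='logs')
    if stdout is not None:
        print(stdout)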
......@@ -421,7 +421,7 @@ class Settings(object):
try:
with open(configFile, 'wb') as f:
pickle.dump(config, f, protocol=2)
except (IOError, pickle.PicklingError, EOFError):
except (IOError, pickle.PicklingError, EOFError, FileNotFoundError):
log.warning('Unable to save {} configuration file '
'{}'.format(self.__configID, configFile),
exc_info=True)
......@@ -21,7 +21,7 @@ import contextlib
@contextlib.contextmanager
def tempdir(root=None, changeto=True, override=None):
def tempdir(root=None, changeto=True, override=None, prefix=None):
"""Returns a context manager which creates and returns a temporary
directory, and then deletes it on exit.
......@@ -36,14 +36,21 @@ def tempdir(root=None, changeto=True, override=None):
:arg override: Don't create a temporary directory, but use this one
instead. This allows ``tempdir`` to be used as a context
manager when a temporary directory already exists.
:arg prefix: Create the temporary directory with a name starting with
this prefix.
"""
if root is not None:
root = os.path.abspath(root)
if override is None:
testdir = tempfile.mkdtemp(dir=root)
prevdir = os.getcwd()
testdir = tempfile.mkdtemp(dir=root, prefix=prefix)
else:
testdir = override
prevdir = os.getcwd()
try:
if changeto:
os.chdir(testdir)
......@@ -51,6 +58,6 @@ def tempdir(root=None, changeto=True, override=None):
finally:
if override is None:
if changeto:
os.chdir(prevdir)
shutil.rmtree(testdir)
if changeto:
os.chdir(prevdir)
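A brief usage sketch of ``tempdir`` (the prefix and file names are illustrative)::

    import os
    from fsl.utils.tempdir import tempdir

    with tempdir(prefix='scratch') as td:
        # the working directory is now td (changeto defaults to True)
        with open('notes.txt', 'wt') as f:
            f.write('temporary data')
        print(os.listdir(td))

    # td has been deleted, and the previous working directory restored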
......@@ -7,13 +7,12 @@
"""This module provides the :class:`WeakFunctionRef` class. """
import six
import types
import weakref
import inspect
class WeakFunctionRef(object):
class WeakFunctionRef:
"""Class which encapsulates a :mod:`weakref` to a function or method.
This class is used by :class:`.Notifier` instances to reference
......@@ -28,10 +27,10 @@ class WeakFunctionRef(object):
"""
# Bound method
if self.__isMethod(func):
if inspect.ismethod(func):
boundMeth = six.get_method_function(func)
boundSelf = six.get_method_self( func)
boundMeth = func.__func__
boundSelf = func.__self__
# We can't take a weakref of the method
# object, so we have to weakref the object
......@@ -73,35 +72,6 @@ class WeakFunctionRef(object):
return self.__str__()
def __isMethod(self, func):
"""Returns ``True`` if the given function is a bound method,
``False`` otherwise.
This seems to be one of the few areas where python 2 and 3 are
irreconcilably incompatible (or just where :mod:`six` does not have a
function to help us).
In Python 3 there is no difference between an unbound method and a
function. But in Python 2, an unbound method is still a method (and
inspect.ismethod returns True).
"""
ismethod = False
# Therefore, in python2 we need to test
# whether the function is a method, and
# also test whether it is bound.
if six.PY2:
ismethod = (inspect.ismethod(func) and
six.get_method_self(func) is not None)
# But in python3, if the function is a
# method it is, by definition, bound.
elif six.PY3:
ismethod = inspect.ismethod(func)
return ismethod
def __findPrivateMethod(self):
"""Finds and returns the bound method associated with the encapsulated
......@@ -125,8 +95,7 @@ class WeakFunctionRef(object):
att = getattr(obj, name)
if isinstance(att, types.MethodType) and \
six.get_method_function(att) is func:
if isinstance(att, types.MethodType) and att.__func__ is func:
return att
return None
......
......@@ -47,7 +47,7 @@ import re
import string
__version__ = '3.1.0.dev0'
__version__ = '3.23.0.dev0'
"""Current version number, as a string. """
......
#!/usr/bin/env python
#
# pylint: disable=unused-import
# flake8: noqa: F401
#
# __init__.py - Wrappers for FSL command-line tools.
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
......@@ -16,14 +19,12 @@ For example, you can call BET like so::
If you would like a command to be submitted as a cluster job, all wrappers
accept a ``submit`` keyword argument, which may be given a value of ``True``
indicating that the job should be submitted with default settings, or a
dictionary with submission settings::
dictionary with submission settings, which will be passed through to the
``fsl_sub`` command (run ``fsl_sub --help`` for details on all options)::
from fsl.wrappers import fnirt
fnirt('srf', 'ref', 'out', submit=True)
fnirt('srf', 'ref', 'out', submit={'queue' : 'long.q', 'ram' : '4GB'})
See the :mod:`.fslsub` module for more details.
fnirt('srf', 'ref', 'out', submit={'queue' : 'long.q', 'jobram' : '4'})
Most of these wrapper functions strive to provide an interface which is as
......@@ -73,38 +74,109 @@ Similarly, we can run a ``fslmaths`` command on in-memory images::
output = fslmaths(image).mas(mask).bin().run()
It is possible to run a Python script in Windows, and call FSL commands which
are installed in a WSL environment. When specifying inputs/outputs as
file/directory paths, the safest option is to use ``pathlib.Path`` objects
to ensure that they are correctly translated bewteen Windows and Linux-style
paths, e.g.::
from pathlib import Path
from fsl.wrappers import bet
bet(Path('T1\\T1.nii.gz'), Path('T1_brain'))
If you use strings to specify inputs/outputs, they must be absolute paths, as
they may otherwise not be translated correctly.
If you are *writing* wrapper functions, take a look at the
:mod:`.wrapperutils` module - it contains several useful functions and
decorators.
"""
from .wrapperutils import (LOAD,) # noqa
from .bet import (bet, # noqa
robustfov)
from .eddy import (eddy_cuda, # noqa
topup,
applytopup)
from .fast import (fast,) # noqa
from .fsl_anat import (fsl_anat,) # noqa
from .flirt import (flirt, # noqa
invxfm,
applyxfm,
applyxfm4D,
concatxfm,
mcflirt)
from .fnirt import (fnirt, # noqa
applywarp,
invwarp,
convertwarp)
from .fslmaths import (fslmaths,) # noqa
from .fslstats import (fslstats,) # noqa
from .fugue import (fugue, # noqa
prelude,
sigloss)
from .melodic import (melodic, # noqa
fsl_regfilt)
from .misc import (fslreorient2std, # noqa
fslroi,
slicer,
cluster)
from fsl.wrappers.wrapperutils import (LOAD,
wrapperconfig,
cmdwrapper,
fslwrapper,
funcwrapper)
from fsl.wrappers import (tbss,)
from fsl.wrappers.cluster_commands import (cluster,
smoothest)
from fsl.wrappers.bet import (bet,
robustfov)
from fsl.wrappers.eddy import (eddy,
eddy_cuda,
topup,
applytopup)
from fsl.wrappers.epi_reg import epi_reg
from fsl.wrappers.fast import (fast,)
from fsl.wrappers.avwutils import (fslmerge,
fslselectvols,
fslsplit,
fslcpgeom,)
from fsl.wrappers.first import (concat_bvars,
first,
first_flirt,
first_utils,
run_first,
run_first_all)
from fsl.wrappers.flirt import (flirt,
invxfm,
applyxfm,
applyxfm4D,
concatxfm,
fixscaleskew,
mcflirt,
standard_space_roi,
makerot,
midtrans)
from fsl.wrappers.fnirt import (fnirt,
applywarp,
invwarp,
convertwarp)
from fsl.wrappers.fsl_anat import (fsl_anat,)
from fsl.wrappers.fsl_sub import (fsl_sub,)
from fsl.wrappers.fslmaths import (fslmaths,)
from fsl.wrappers.fslstats import (fslstats,)
from fsl.wrappers.fugue import (fugue,
prelude,
sigloss,
fsl_prepare_fieldmap)
from fsl.wrappers.get_standard import (get_standard,)
from fsl.wrappers.melodic import (melodic,
fsl_regfilt,
fsl_glm)
from fsl.wrappers.misc import (fslreorient2std,
fslorient,
fslswapdim,
fslroi,
slicer,
gps)
from fsl.wrappers.bianca import (bianca,
bianca_cluster_stats,
bianca_overlap_measures,
bianca_perivent_deep,
make_bianca_mask)
from fsl.wrappers.feat import (feat,
featquery)
from fsl.wrappers.fdt import (dtifit,
vecreg)
from fsl.wrappers.bedpostx import (xfibres,
xfibres_gpu,
split_parts_gpu,
bedpostx,
bedpostx_gpu,
bedpostx_postproc_gpu,
probtrackx,
probtrackx2,
probtrackx2_gpu)
from fsl.wrappers.oxford_asl import (oxford_asl,
asl_file)
from fsl.wrappers.randomise import randomise
from fsl.wrappers.fslio import (imcp,
imglob,
imln,
immv,
imrm,
imtest)