#!/usr/bin/env bash
set -e
source /test.venv/bin/activate
pip install ".[doc]"
sphinx-build doc public
#!/usr/bin/env bash
set -e
pip install --upgrade pip wheel setuptools twine build
python -m build
twine check dist/*
# do a test install from both source and wheel
sdist=$(find dist -maxdepth 1 -name "*.tar.gz")
wheel=$(find dist -maxdepth 1 -name "*.whl")
for target in $sdist $wheel; do
python -m venv test.venv
. test.venv/bin/activate
pip install --upgrade pip setuptools
pip install $target
deactivate
rm -r test.venv
done
#!/usr/bin/env bash
set -e
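# For a release tag, $CI_COMMIT_REF_NAME is the tag name, so this check only
# passes if fsl/version.py contains a matching line, e.g.
# __version__ = '3.19.0' for a tag named "3.19.0" (version number illustrative).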
cat fsl/version.py | egrep "^__version__ += +'$CI_COMMIT_REF_NAME' *$"
#!/usr/bin/env bash
set -e
pip install setuptools wheel twine
twine upload dist/*
#!/usr/bin/env bash
set -e
##########################################################
# The setup_ssh script does the following:
#
# - Sets up key-based SSH login, and
# installs the private keys, so
# we can connect to servers.
#
# - Configures git, and adds the
# upstream repo as a remote
#
# (see https://docs.gitlab.com/ce/ci/ssh_keys/README.html)
#
# NOTE: It is assumed that non-docker
# executors are already configured
# (or don't need any configuration).
##########################################################
if [[ -f /.dockerenv ]]; then
# We have to use different host names to connect
# to the docker daemon host on mac as opposed
# to on linux.
#
# On linux (assuming the docker job is running
# with --net=host), we can connect via
# username@localhost.
#
# On mac, we have to connect via
# username@host.docker.internal
if [[ "$CI_RUNNER_TAGS" == *"macOS"* ]]; then
if [[ "$FSL_HOST" == *"@localhost" ]]; then
FSL_HOST=${FSL_HOST/localhost/host.docker.internal}
fi
fi
apt-get update -y || yum -y check-update || true;
apt-get install -y openssh-client rsync git || yum install -y openssh-client rsync git || true;
eval $(ssh-agent -s);
mkdir -p $HOME/.ssh;
# for downloading FSL atlases/standards
echo "$SSH_PRIVATE_KEY_FSL_DOWNLOAD" > $HOME/.ssh/id_fsl_download;
chmod go-rwx $HOME/.ssh/id_*;
ssh-add $HOME/.ssh/id_fsl_download;
ssh-keyscan ${FSL_HOST##*@} >> $HOME/.ssh/known_hosts;
touch $HOME/.ssh/config;
echo "Host fsldownload" >> $HOME/.ssh/config;
echo " HostName ${FSL_HOST##*@}" >> $HOME/.ssh/config;
echo " User ${FSL_HOST%@*}" >> $HOME/.ssh/config;
echo " IdentityFile $HOME/.ssh/id_fsl_download" >> $HOME/.ssh/config;
echo "Host *" >> $HOME/.ssh/config;
echo " IdentitiesOnly yes" >> $HOME/.ssh/config;
git config --global user.name "Gitlab CI";
git config --global user.email "gitlabci@localhost";
if [[ `git remote -v` == *"upstream"* ]]; then
git remote remove upstream;
fi;
git remote add upstream "$UPSTREAM_URL:$UPSTREAM_PROJECT";
fi
#!/usr/bin/env bash
set -e
source /test.venv/bin/activate
pip install ".[extra,test,style]"
# style stage
if [ "$TEST_STYLE"x != "x" ]; then flake8 fsl || true; fi;
if [ "$TEST_STYLE"x != "x" ]; then pylint --output-format=colorized fsl || true; fi;
if [ "$TEST_STYLE"x != "x" ]; then exit 0; fi;
# We need the FSL atlases for the atlas
# tests, and need $FSLDIR to be defined
export FSLDIR=/fsl/
mkdir -p $FSLDIR/data/
rsync -rv "fsldownload:$FSL_ATLAS_DIR" "$FSLDIR/data/atlases/"
# Run the tests. Suppress coverage
# reporting until after we're finished.
TEST_OPTS="--cov-report= --cov-append"
# pytest struggles with my organisation of
# the fslpy package, where all tests are in
# fsl.tests, and fsl is a namespace package
touch fsl/__init__.py
# We run some tests under xvfb-run
# because they invoke wx. Sleep in
# between, otherwise xvfb gets upset.
xvfb-run -a pytest $TEST_OPTS fsl/tests/test_idle.py
sleep 5
xvfb-run -a pytest $TEST_OPTS fsl/tests/test_platform.py
# We run the immv/imcp tests as the nobody
# user because some tests expect permission
# denied errors when looking at files, and
# root never gets denied. Make everything in
# this directory writable by anybody (which,
# unintuitively, includes nobody)
chmod -R a+w `pwd`
cmd="source /test.venv/bin/activate && pytest"
cmd="$cmd $TEST_OPTS fsl/tests/test_scripts/test_immv_imcp.py fsl/tests/test_immv_imcp.py"
su -s /bin/bash -c "$cmd" nobody
# All other tests can be run as normal.
pytest $TEST_OPTS -m 'not longtest' \
--ignore=fsl/tests/test_idle.py \
--ignore=fsl/tests/test_platform.py \
--ignore=fsl/tests/test_immv_imcp.py \
--ignore=fsl/tests/test_scripts/test_immv_imcp.py
# Long tests are only run on release branches
if [[ $CI_COMMIT_REF_NAME == v* ]]; then
pytest $TEST_OPTS -m 'longtest'
fi
python -m coverage report -i
#!/usr/bin/env python
#
# Deposit a new version of something on zenodo.
#
# It is assumed that a deposit already exists on zenodo - you must
# specify the deposit ID of that original deposit.
#
# http://developers.zenodo.org/#rest-api
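#
# Expected usage (argument order matches the sys.argv handling below):
#
#   zenodo.py <zenodo_url> <access_token> <deposit_id> \
#             <upload_file> <metadata_template> <version> <date>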
import os.path as op
import sys
import json
import jinja2 as j2
import requests


def deposit(zenodo_url, access_token, dep_id, upload_file, meta):

    urlbase = '{}/api/deposit/depositions'.format(zenodo_url)
    headers = {'Content-Type': 'application/json'}
    params  = {'access_token' : access_token}

    # Create a new deposit
    url = '{}/{}/actions/newversion'.format(urlbase, dep_id)
    print('Creating new deposit: {}'.format(url))
    r = requests.post(url, params=params)
    if r.status_code != 201:
        raise RuntimeError('POST {} failed: {}'.format(url, r.status_code))
    newurl = r.json()['links']['latest_draft']
    dep_id = newurl.split('/')[-1]
    print("New deposition ID: {}".format(dep_id))

    # Upload the file
    data  = {'filename': op.basename(upload_file)}
    files = {'file': open(upload_file, 'rb')}
    url   = '{}/{}/files'.format(urlbase, dep_id)
    print('Uploading file: {}'.format(url))
    r = requests.post(url, params=params, data=data, files=files)
    if r.status_code != 201:
        raise RuntimeError('POST {} failed: {}'.format(url, r.status_code))

    # Upload the metadata
    url = '{}/{}?access_token={}'.format(urlbase, dep_id, access_token)
    print('Uploading metadata: {}'.format(url))
    r = requests.put(url, data=json.dumps(meta), headers=headers)
    if r.status_code != 200:
        print(r.json())
        raise RuntimeError('PUT {} failed: {}'.format(url, r.status_code))

    # Publish
    url = '{}/{}/actions/publish'.format(urlbase, dep_id)
    print('Publishing: {}'.format(url))
    r = requests.post(url, params=params)
    if r.status_code != 202:
        raise RuntimeError('POST {} failed: {}'.format(url, r.status_code))


def make_meta(templatefile, version, date):
    with open(templatefile, 'rt') as f:
        template = f.read()

    template = j2.Template(template)

    env = {
        'VERSION' : version,
        'DATE'    : date,
    }

    return json.loads(template.render(**env))


if __name__ == '__main__':
    zurl     = sys.argv[1]
    tkn      = sys.argv[2]
    depid    = sys.argv[3]
    upfile   = sys.argv[4]
    metafile = sys.argv[5]
    version  = sys.argv[6]
    date     = sys.argv[7]

    meta = make_meta(metafile, version, date)
    deposit(zurl, tkn, depid, upfile, meta)
#!/bin/bash
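# Resolve the absolute path of the directory containing this script, so that
# zenodo.py can be found regardless of the current working directory.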
tmp=`dirname $0`
pushd $tmp > /dev/null
thisdir=`pwd`
popd > /dev/null
zenodo_url=$1
zenodo_tkn=$2
zenodo_depid=$3
version=$(cat fsl/version.py |
egrep '^__version__ +=' |
cut -d "=" -f 2 |
tr -d "'" |
tr -d ' ')
upfile=$(pwd)/dist/fslpy-"$version".tar.gz
metafile=$(pwd)/.ci/zenodo_meta.json.jinja2
date=$(date +"%Y-%m-%d")
pip install --retries 10 requests jinja2
python "$thisdir"/zenodo.py \
"$zenodo_url" \
"$zenodo_tkn" \
"$zenodo_depid" \
"$upfile" \
"$metafile" \
"$version" \
"$date"
{
"metadata" : {
"title" : "fslpy",
"upload_type" : "software",
"version" : "{{VERSION}}",
"publication_date" : "{{DATE}}",
"description" : "<p>The fslpy project is a <a href=\"https://fsl.fmrib.ox.ac.uk/fsl/fslwiki\">FSL</a> programming library written in Python. It is used by <a href=\"http://git.fmrib.ox.ac.uk/fsl/fsleyes/fsleyes/\">FSLeyes</a>.</p>\n\n<p>The fslpy library is developed at the Wellcome Centre for Integrative Neuroimaging (FMRIB), at the University of Oxford. It is hosted at <a href=\"https://git.fmrib.ox.ac.uk/fsl/fslpy/\">https://git.fmrib.ox.ac.uk/fsl/fslpy/</a>.</p>",
"keywords" : ["python", "mri", "neuroimaging", "neuroscience"],
"access_right" : "open",
"license" : "Apache-2.0",
"creators" : [
{ "name" : "McCarthy, Paul" },
{ "name" : "Cottaar, Michiel" },
{ "name" : "Webster, Matthew" },
{ "name" : "Fitzgibbon, Sean" },
{ "name" : "Craig, Martin" }
]
}
}
###########################################################################
#############################################################################
# This file defines the build process for fslpy, as hosted at:
#
# https://git.fmrib.ox.ac.uk/fsl/fslpy
@@ -9,24 +9,29 @@
#
# 2. style: Check coding style
#
# 3. doc: Building API documentation
# 3. doc: Building and upload API documentation using GitLab Pages.
#
# 4. build: Building source distributions and wheels
# 4. build: Building source and wheel distributions
#
# 5. deploy: Uploading the build outputs to pypi, and the documentation
# to a hosting server.
# 5. deploy: Uploading the build outputs to pypi/hosting servers.
#
#
# Custom docker images are used for several jobs - these images are
# available at:
#
# https://hub.docker.com/u/pauldmccarthy/
#
# The test and style stages are executed on all branches of upstream and fork
# repositories.
#
# The doc stage, and the deploy-doc job, is executed on all branches of the
# upstream repository.
# The doc stage is executed on release branches of the upstream repository.
#
# The build stage, and the remaining jobs in the deploy stage, are only
# executed on the upstream repository, and only for release tags.
# The build and deploy stages are executed on tags on the upstream
# repository, and the deploy stage must be manually instantiated.
#
# The deploy stages are manually instantiated.
###########################################################################
# Most of the logic for each job is defined in shell scripts in the .ci
# sub-directory.
#############################################################################
stages:
@@ -53,16 +58,13 @@ stages:
# - SSH_PRIVATE_KEY_FSL_DOWNLOAD - private key for downloading some FSL
# files from a remote server (FSL_HOST)
#
# - SSH_PRIVATE_KEY_DOC_DEPLOY - private key for rsyncing documentation
# to remote host (DOC_HOST)
#
# - SSH_SERVER_HOSTKEYS - List of trusted SSH hosts
#
# - DOC_HOST: - Username@host to upload documentation to
# (e.g. "paulmc@jalapeno.fmrib.ox.ac.uk")
#
# - FSL_HOST: - Username@host to download FSL data from
# (e.g. "paulmc@jalapeno.fmrib.ox.ac.uk")
# (most likely "paulmc@localhost")
#
# - FSL_ATLAS_DIR: - Location of the FSL atlas data on
# FSL_HOST.
#
# - TWINE_USERNAME: - Username to use when uploading to pypi
#
@@ -70,18 +72,17 @@
#
# - TWINE_REPOSITORY_URL: - Pypi repository to upload to
#
# - WXPYTHON_UBUNTU1404_URL: - URL to a location containing binary wheels
# for wxPython/ubuntu 14.04
# - ZENODO_URL: - Zenodo URL to deposit release file to.
#
# - WXPYTHON_UBUNTU1604_URL: - URL to a location containing binary wheels
# for wxPython/ubuntu 16.04
# - ZENODO_TOKEN: - Zenodo access token.
#
# - ZENODO_DEPOSIT_ID: - Deposit ID of previous Zenodo deposit.
###############################################################################
variables:
UPSTREAM_PROJECT: "fsl/fslpy"
UPSTREAM_URL: "git@git.fmrib.ox.ac.uk"
TEST_OPTS: "--cov-report= --cov-append"
UPSTREAM_PROJECT: "fsl/fslpy"
UPSTREAM_URL: "git@git.fmrib.ox.ac.uk"
####################################
@@ -90,14 +91,9 @@ variables:
####################################
.only_upstream: &only_upstream
.only_release_branches: &only_release_branches
only:
- branches@fsl/fslpy
.only_master: &only_master
only:
- master@fsl/fslpy
- /^v.+$/@fsl/fslpy
.only_releases: &only_releases
@@ -105,233 +101,95 @@ variables:
- tags@fsl/fslpy
.except_releases: &except_releases
except:
- tags
##########################################################
# The setup_ssh anchor contains a before_script section
# which does the following:
#
# - Sets up key-based SSH login, and
# installs the private keys, so
# we can connect to servers.
#
# - Configures git, and adds the
# upstream repo as a remote
#
# (see https://docs.gitlab.com/ce/ci/ssh_keys/README.html)
#
# NOTE: It is assumed that non-docker
# executors are already configured
# (or don't need any configuration).
##########################################################
.setup_ssh: &setup_ssh
before_script:
- if [[ -f /.dockerenv ]]; then
apt-get update -y || yum -y check-update || true;
apt-get install -y openssh-client rsync git || yum install -y openssh-client rsync git || true;
eval $(ssh-agent -s);
mkdir -p $HOME/.ssh;
echo "$SSH_PRIVATE_KEY_GIT" > $HOME/.ssh/id_git;
echo "$SSH_PRIVATE_KEY_FSL_DOWNLOAD" > $HOME/.ssh/id_fsl_download;
if [[ "$CI_PROJECT_PATH" == "$UPSTREAM_PROJECT" ]]; then
echo "$SSH_PRIVATE_KEY_DOC_DEPLOY" > $HOME/.ssh/id_doc_deploy;
fi;
chmod go-rwx $HOME/.ssh/id_*;
ssh-add $HOME/.ssh/id_git;
ssh-add $HOME/.ssh/id_fsl_download;
if [[ "$CI_PROJECT_PATH" == "$UPSTREAM_PROJECT" ]]; then
ssh-add $HOME/.ssh/id_doc_deploy;
fi
echo "$SSH_SERVER_HOSTKEYS" > $HOME/.ssh/known_hosts;
touch $HOME/.ssh/config;
echo "Host ${UPSTREAM_URL##*@}" >> $HOME/.ssh/config;
echo " User ${UPSTREAM_URL%@*}" >> $HOME/.ssh/config;
echo " IdentityFile $HOME/.ssh/id_git" >> $HOME/.ssh/config;
echo "Host docdeploy" >> $HOME/.ssh/config;
echo " HostName ${DOC_HOST##*@}" >> $HOME/.ssh/config;
echo " User ${DOC_HOST%@*}" >> $HOME/.ssh/config;
echo " IdentityFile $HOME/.ssh/id_doc_deploy" >> $HOME/.ssh/config;
echo "Host fsldownload" >> $HOME/.ssh/config;
echo " HostName ${FSL_HOST##*@}" >> $HOME/.ssh/config;
echo " User ${FSL_HOST%@*}" >> $HOME/.ssh/config;
echo " IdentityFile $HOME/.ssh/id_fsl_download" >> $HOME/.ssh/config;
echo "Host *" >> $HOME/.ssh/config;
echo " IdentitiesOnly yes" >> $HOME/.ssh/config;
git config --global user.name "Gitlab CI";
git config --global user.email "gitlabci@localhost";
if [[ `git remote -v` == *"upstream"* ]]; then
git remote remove upstream;
fi;
git remote add upstream "$UPSTREAM_URL:$UPSTREAM_PROJECT";
fi
- bash ./.ci/setup_ssh.sh
###################################################
# The patch_version anchor contains a before_script
# The check_version anchor contains a before_script
# section which is run on release builds, and makes
# sure that the version in the code is up to date
# (i.e. equal to the tag name).
###################################################
.patch_version: &patch_version
.check_version: &check_version
before_script:
- if [[ "x$CI_COMMIT_TAG" != "x" ]]; then
echo "Release detected - patching version - $CI_COMMIT_REF_NAME";
python -c "import fsl.version as v; v.patchVersion('fsl/version.py', '$CI_COMMIT_REF_NAME')";
fi
- bash ./.ci/check_version.sh
############
# Test stage
############
.test_rules: &test_rules
# We only run tests on MRs, and on release branches
# (a more substantial test suite is run on release
# branches - see .ci/test_template.sh). We don't run
# on upstream/main, as all merges are fast-forwards,
# so the tests will have already been run on the MR
# branch. We also allow manually running a pipeline
# via the web interface.
rules:
- if: $SKIP_TESTS != null
when: never
- if: $CI_COMMIT_MESSAGE =~ /\[skip-tests\]/
when: never
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
when: on_success
- if: $CI_PROJECT_PATH == $UPSTREAM_PROJECT && $CI_COMMIT_BRANCH =~ /^v.+$/
when: on_success
- if: $CI_PIPELINE_SOURCE == "web"
when: on_success
- when: never
.test: &test_template
<<: *setup_ssh
# Releases are just tags on a release
# branch, so we don't need to test them.
<<: *except_releases
<<: *test_rules
tags:
- docker
script:
- bash ./.ci/test_template.sh
# Install $PY_VERSION, xvfb, and all
# of the wxpython dependencies.
- apt-get update -y || true
- apt-get install -y software-properties-common python-software-properties xvfb libgtk2.0-0 libnotify4 freeglut3 libsdl1.2debian
- add-apt-repository -y ppa:deadsnakes/ppa
- apt-get update -y || true
- apt-get install -y $PY_VERSION "$PY_VERSION"-dev $PY_PACKAGES
- $PY_VENV test.venv
- source test.venv/bin/activate
- pip install --upgrade pip setuptools
# If running on a fork repository, we merge in the
# upstream/master branch. This is done so that merge
# requests from fork to the parent repository will
# have unit tests run on the merged code, something
# which gitlab CE does not currently do for us.
- if [[ "$CI_PROJECT_PATH" != "$UPSTREAM_PROJECT" ]]; then
git fetch upstream;
git merge --no-commit --no-ff upstream/master;
fi;
- $INSTALL_WX
# All other deps can be installed as normal.
# We install test dependencies through pip,
# because if we let setuptools do it, it
# will build/install everything from source,
# rather than using wheels.
- pip install -r requirements.txt
- pip install sphinx sphinx-rtd-theme
- pip install pytest pytest-cov pytest-html pytest-runner mock coverage
# style stage
- if [ "$TEST_STYLE"x != "x" ]; then pip install pylint flake8; fi;
- if [ "$TEST_STYLE"x != "x" ]; then flake8 fsl || true; fi;
- if [ "$TEST_STYLE"x != "x" ]; then pylint --output-format=colorized fsl || true; fi;
- if [ "$TEST_STYLE"x != "x" ]; then exit 0; fi
# We need the FSL atlases for the atlas
# tests, and need $FSLDIR to be defined
- export FSLDIR=/fsl/
- mkdir -p $FSLDIR/data/
- rsync -rv "fsldownload:data/atlases/" "$FSLDIR/data/atlases/"
# Finally, run the damned tests.
# We run some tests under xvfb-run
# because they invoke wx. Sleep in
# between, otherwise xvfb gets upset.
- xvfb-run python setup.py test --addopts="$TEST_OPTS tests/test_async.py"
- sleep 5
- xvfb-run python setup.py test --addopts="$TEST_OPTS tests/test_platform.py"
# We run the immv/imcp tests as the nobody
# user because some tests expect permission
# denied errors when looking at files, and
# root never gets denied. Make everything in
# this directory writable by anybody (which,
# unintuitively, includes nobody)
- chmod -R a+w `pwd`
- su -s /bin/bash -c 'source test.venv/bin/activate && python setup.py test --addopts="$TEST_OPTS tests/test_immv_imcp.py"' nobody
# All other tests can be run as normal
- python setup.py test --addopts="$TEST_OPTS --ignore=tests/test_async.py --ignore=tests/test_platform.py --ignore=tests/test_immv_imcp.py"
- python -m coverage report
test:2.7:
test:3.10:
stage: test
image: ubuntu:16.04
image: pauldmccarthy/fsleyes-py310-wxpy4-gtk3
<<: *test_template
variables:
PY_VERSION: "python2.7"
PY_PACKAGES: "python-pip python-virtualenv"
PY_VENV: "virtualenv"
INSTALL_WX: "pip install --only-binary wxpython -f $WXPYTHON_UBUNTU1604_URL wxpython"
# we use ubuntu:14.04 for python 3.4, because
# wxpython is not available for 16.04/3.4
test:3.4:
test:3.11:
stage: test
image: ubuntu:14.04
image: pauldmccarthy/fsleyes-py311-wxpy4-gtk3
<<: *test_template
variables:
PY_VERSION: "python3.4"
PY_PACKAGES: "python3-pip python3.4-venv"
PY_VENV: "python3.4 -m venv"
INSTALL_WX: "pip install --pre -f $WXPYTHON_UBUNTU1404_URL wxpython"
test:3.5:
test:3.12:
stage: test
image: ubuntu:16.04
image: pauldmccarthy/fsleyes-py312-wxpy4-gtk3
<<: *test_template
variables:
PY_VERSION: "python3.5"
PY_PACKAGES: "python3-pip python3-venv"
PY_VENV: "python3.5 -m venv"
INSTALL_WX: "pip install --only-binary wxpython -f $WXPYTHON_UBUNTU1604_URL wxpython"
test:3.6:
test:3.13:
stage: test
image: ubuntu:16.04
image: pauldmccarthy/fsleyes-py313-wxpy4-gtk3
<<: *test_template
variables:
PY_VERSION: "python3.6"
PY_PACKAGES: "python3-pip python3.6-venv"
PY_VENV: "python3.6 -m venv"
INSTALL_WX: "pip install --only-binary wxpython -f $WXPYTHON_UBUNTU1604_URL wxpython"
test:build-pypi-dist:
stage: test
image: python:3.10
<<: *test_rules
tags:
- docker
script:
- bash ./.ci/build_pypi_dist.sh
#############
@@ -341,37 +199,37 @@ test:3.6:
style:
stage: style
image: ubuntu:16.04
image: pauldmccarthy/fsleyes-py310-wxpy4-gtk3
<<: *test_template
variables:
PY_VERSION: "python3.5"
PY_PACKAGES: "python3-pip python3-venv"
PY_VENV: "python3.5 -m venv"
INSTALL_WX: "pip install --only-binary wxpython -f $WXPYTHON_UBUNTU1604_URL wxpython"
TEST_STYLE: "true"
###########
# Doc stage
###########
#############
# Pages stage
#############
# I would like to have separate doc deploys for
# both the main and latest release branches,
# but this is awkward with gitlab pages. So
# currently the most recently executed pages
# job is the one that gets deployed.
build-doc:
<<: *only_upstream
<<: *patch_version
pages:
<<: *only_release_branches
tags:
- docker
stage: doc
image: python:3.5
image: pauldmccarthy/fsleyes-py310-wxpy4-gtk3
script:
- python setup.py doc
- mv doc/html doc/"$CI_COMMIT_REF_NAME"
- bash ./.ci/build_doc.sh
artifacts:
expire_in: 1 day
paths:
- doc/$CI_COMMIT_REF_NAME
- public
#############
@@ -379,20 +237,18 @@ build-doc:
#############
build-dist:
build-pypi-dist:
<<: *only_releases
<<: *patch_version
<<: *check_version
stage: build
image: python:3.5
image: python:3.10
tags:
- docker
script:
- pip install wheel
- python setup.py sdist
- python setup.py bdist_wheel
- bash ./.ci/build_pypi_dist.sh
artifacts:
expire_in: 1 day
@@ -405,36 +261,35 @@ build-dist:
##############
deploy-doc:
<<: *only_upstream
deploy-pypi:
<<: *only_releases
<<: *setup_ssh
stage: deploy
when: manual
image: python:3.5
image: python:3.10
tags:
- docker
dependencies:
- build-doc
- build-pypi-dist
script:
- rsync -rv doc/"$CI_COMMIT_REF_NAME" "docdeploy:"
- bash ./.ci/deploy_pypi.sh
deploy-pypi:
deploy-zenodo:
<<: *only_releases
<<: *setup_ssh
stage: deploy
when: manual
image: python:3.5
image: python:3.10
tags:
- docker
dependencies:
- build-dist
- build-pypi-dist
script:
- pip install setuptools wheel twine
- twine upload dist/*
- bash ./.ci/zenodo_deposit.sh "$ZENODO_URL" "$ZENODO_TOKEN" "$ZENODO_DEPOSIT_ID"
Paul McCarthy <pauldmccarthy@gmail.com>
Michiel Cottaar <michiel.cottaar@ndcn.ox.ac.uk>
Matthew Webster <matthew.webster@ndcn.ox.ac.uk>
Sean Fitzgibbon <sean.fitzgibbon@ndcn.ox.ac.uk>
Martin Craig <martin.craig@eng.ox.ac.uk>
Taylor Hanayik <taylor.hanayik@ndcn.ox.ac.uk>
Evan Edmond <evan.edmond@ndcn.ox.ac.uk>
Christoph Arthofer <christoph.arthofer@ndcn.ox.ac.uk>
Fidel Alfaro Almagro <fidel.alfaroalmagro@ndcn.ox.ac.uk>
\ No newline at end of file
Copyright 2016-2017 University of Oxford, Oxford, UK
Copyright 2016-2023 University of Oxford, Oxford, UK
The fslpy library
Copyright 2016-2017 University of Oxford, Oxford, UK.
Copyright 2016-2023 University of Oxford, Oxford, UK.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
include LICENSE
include AUTHOR
include CHANGELOG.rst
include COPYRIGHT
include requirements.txt
include pytest.ini
recursive-include doc *
recursive-exclude doc/html *
recursive-include tests *
include LICENSE
include README.rst
include conftest.py
recursive-include doc *
recursive-include fsl/tests *
fslpy
=====
.. image:: https://img.shields.io/pypi/v/fslpy.svg
:target: https://pypi.python.org/pypi/fslpy/
.. image:: https://git.fmrib.ox.ac.uk/fsl/fslpy/badges/master/build.svg
:target: https://git.fmrib.ox.ac.uk/fsl/fslpy/commits/master/
.. image:: https://anaconda.org/conda-forge/fslpy/badges/version.svg
:target: https://anaconda.org/conda-forge/fslpy
.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.1470750.svg
:target: https://doi.org/10.5281/zenodo.1470750
.. image:: https://git.fmrib.ox.ac.uk/fsl/fslpy/badges/master/coverage.svg
:target: https://git.fmrib.ox.ac.uk/fsl/fslpy/commits/master/
.. image:: https://img.shields.io/pypi/v/fslpy.svg
:target: https://pypi.python.org/pypi/fslpy/
The ``fslpy`` project is a `FSL <http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/>`_
programming library written in Python. It is used by `FSLeyes
<https://git.fmrib.ox.ac.uk/paulmc/fsleyes/>`_.
<https://git.fmrib.ox.ac.uk/fsl/fsleyes/fsleyes/>`_.
``fslpy`` is tested against Python versions 3.10, 3.11, 3.12, and 3.13.
Installation
------------
Install ``fslpy`` and its core dependencies via pip::
pip install fslpy
``fslpy`` is also available on `conda-forge <https://conda-forge.org/>`_::
conda install -c conda-forge fslpy
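After installation, a quick way to confirm that ``fslpy`` is importable (the
version string is defined in ``fsl/version.py``)::

python -c "import fsl.version; print(fsl.version.__version__)"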
Dependencies
------------
All of the dependencies of ``fslpy`` are listed in the `requirements.txt
<requirements.txt>`_ file. Some ``fslpy`` modules require `wxPython
<http://www.wxpython.org>`_ 3.0.2.0 or higher.
All of the core dependencies of ``fslpy`` are listed in the
`pyproject.toml <pyproject.toml>`_ file.
Some optional dependencies (labelled ``extra`` in ``pyproject.toml``) provide
additional functionality:
- ``wxPython``: The `fsl.utils.idle <fsl/utils/idle.py>`_ module has
functionality to schedule functions on the ``wx`` idle loop.
- ``indexed_gzip``: The `fsl.data.image.Image <fsl/data/image.py>`_ class
can use ``indexed_gzip`` to keep large compressed images on disk instead
of decompressing and loading them into memory.
- ``trimesh``/``rtree``: The `fsl.data.mesh.TriangleMesh <fsl/data/mesh.py>`_
class has some methods which use ``trimesh`` to perform geometric queries
on the mesh.
- ``Pillow``: The `fsl.data.bitmap.Bitmap <fsl/data/bitmap.py>`_ class uses
``Pillow`` to load image files.
If you are using Linux, you need to install wxPython first, as binaries are
not available on PyPI. Install wxPython like so, changing the URL for your
specific platform::
pip install -f https://extras.wxpython.org/wxPython4/extras/linux/gtk2/ubuntu-16.04/ wxpython
Once wxPython has been installed, you can type the following to install the
remaining optional dependencies::
pip install "fslpy[extra]"
Dependencies for testing and documentation are also listed in ``pyproject.toml``,
and are respectively labelled as ``test`` and ``doc``.
Non-Python dependencies
^^^^^^^^^^^^^^^^^^^^^^^
The `fsl.data.dicom <fsl/data/dicom.py>`_ module requires the presence of
Chris Rorden's `dcm2niix <https://github.com/rordenlab/dcm2niix>`_ program.
The ``rtree`` library assumes that ``libspatialindex`` is installed on
your system.
The `fsl.transform.x5 <fsl/transform/x5.py>`_ module uses `h5py
<https://www.h5py.org/>`_, which requires ``libhdf5``.
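A rough way to check for these non-Python dependencies (illustrative only; it
does not cover every installation layout)::

command -v dcm2niix > /dev/null || echo "dcm2niix not found on PATH"
python -c "import rtree" 2> /dev/null || echo "rtree/libspatialindex not usable"
python -c "import h5py"  2> /dev/null || echo "h5py/libhdf5 not usable"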
Documentation
-------------
API documentation for ``fslpy`` is hosted at
https://open.win.ox.ac.uk/pages/fsl/fslpy/.
``fslpy`` is documented using `sphinx <http://sphinx-doc.org/>`_. You
can build the API documentation by running::
python setup.py doc
pip install ".[doc]"
sphinx-build doc html
The HTML documentation will be generated and saved in the ``doc/html/``
The HTML documentation will be generated and saved in the ``html/``
directory.
Tests
-----
Run the test suite via::
pip install ".[test]"
pytest
Some tests will only pass if the test environment meets certain criteria -
refer to the ``tool.pytest.init_options`` section of
`pyproject.toml <pyproject.toml>`_ for a list of the
`pytest marks <https://docs.pytest.org/en/7.1.x/example/markers.html>`_ which
can be selectively enabled or disabled.
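For example, the long-running tests (marked ``longtest``) can be skipped
with::

pytest -m "not longtest"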
Contributing
------------
If you are interested in contributing to ``fslpy``, check out the
`contributing guide <doc/contributing.rst>`_.
Tests
-----
Credits
-------
Run the test suite via::
python setup.py test
The `fsl.data.dicom <fsl/data/dicom.py>`_ module is little more than a thin
wrapper around Chris Rorden's `dcm2niix
<https://github.com/rordenlab/dcm2niix>`_ program.
A test report will be generated at ``report.html``, and a code coverage report
will be generated in ``htmlcov/``.
The `example.mgz <tests/testdata/example.mgz>`_ file, used for testing,
originates from the ``nibabel`` test data set.
@@ -14,42 +14,24 @@ import numpy as np
def pytest_addoption(parser):
parser.addoption('--niters',
type=int,
action='store',
default=50,
help='Number of test iterations for imagewrapper')
parser.addoption('--testdir',
action='store',
help='FSLeyes test data directory')
parser.addoption('--seed',
type=int,
help='Seed for random number generator')
help='Seed for random number generator')
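# The options above can be given on the pytest command line, e.g.
# pytest --niters=100 --seed=12345 (values are illustrative).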
@pytest.fixture
def testdir(request):
"""FSLeyes test data directory."""
return op.expanduser(request.config.getoption('--testdir'))
@pytest.fixture
def niters(request):
"""Number of test iterations."""
return request.config.getoption('--niters')
@pytest.fixture
def seed(request):
seed = request.config.getoption('--seed')
if seed is None:
seed = np.random.randint(2 ** 32)
seed = np.random.randint(2 ** 30)
np.random.seed(seed)
random .seed(seed)
print('Seed for random number generator: {}'.format(seed))
/* override table width restrictions */
.wy-table-responsive table td, .wy-table-responsive table th {
white-space: normal;
}
.wy-table-responsive {
margin-bottom: 24px;
max-width: 100%;
overflow: visible;
}
@@ -12,12 +12,70 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import glob
import itertools as it
import os
import os.path as op
import sys
import datetime
date = datetime.date.today()
def check_for_missing_stubs():
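    # Walk the fsl/ package and compare the modules found there against the
    # .rst stub files in the doc/ directory, reporting any mismatch in either
    # direction.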
    docdir  = op.dirname(__file__)
    basedir = op.join(docdir, '..')
    modules = []

    def tomodname(f):
        if f.endswith('.py'):
            f = f[:-3]
        return op.relpath(op.join(dirpath, f), basedir).replace(op.sep, '.')

    for dirpath, dirnames, filenames in os.walk(op.join(basedir, 'fsl')):

        for d in dirnames:
            if d == '__pycache__':
                continue
            if len(glob.glob(op.join(dirpath, d, '**', '*.py'), recursive=True)) == 0:
                continue
            modules.append(tomodname(d))

        for f in filenames:
            if not f.endswith('.py'):
                continue
            if f in ('__init__.py', '__main__.py'):
                continue
            modules.append(tomodname(f))

    modules = [m for m in modules if not m.startswith('fsl.tests')]

    # import fsl
    # modules = recurse(fsl)
    # modules = [m.name for m in modules]
    # print()
    # print()
    # print()

    for mod in modules:
        docfile = op.join(docdir, f'{mod}.rst')
        if not op.exists(docfile):
            print(f'No doc file found for module: {mod}')

    for docfile in glob.glob(op.join(docdir, '*.rst')):
        docfile = op.relpath(docfile, basedir)
        mod     = op.splitext(op.basename(docfile))[0]
        if mod not in modules:
            print(f'No module found for doc file: {docfile}')


if __name__ == '__main__':
    check_for_missing_stubs()
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
@@ -33,7 +91,8 @@ date = datetime.date.today()
# ones.
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.autosummary',
'sphinx.ext.viewcode',
'sphinx.ext.autosummary',
'sphinx.ext.mathjax',
'sphinx.ext.graphviz',
'sphinx.ext.todo',
@@ -55,13 +114,13 @@ master_doc = 'index'
# General information about the project.
project = u'fslpy'
copyright = u'{}, Paul McCarthy, University of Oxford, Oxford, UK'.format(
copyright = u'{}, FMRIB Centre, University of Oxford, Oxford, UK'.format(
date.year)
# Links to other things
rst_epilog = """
.. |fsleyes_apidoc| replace:: FSLeyes
.. _fsleyes_apidoc: http://users.fmrib.ox.ac.uk/~paulmc/fsleyes_apidoc/index.html
.. _fsleyes_apidoc: http://users.fmrib.ox.ac.uk/~paulmc/fsleyes/userdoc/latest/index.html
"""
@@ -121,6 +180,7 @@ pygments_style = 'sphinx'
# a list of builtin themes.
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
@@ -148,7 +208,11 @@ html_theme = 'sphinx_rtd_theme'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = []
html_static_path = ['_static']
html_css_files = [
'theme_overrides.css', # overrides for wide tables in RTD theme
]
# Add any extra paths that contain custom files (such as robots.txt or
# .htaccess) here, relative to this directory. These files are copied
@@ -353,26 +417,12 @@ epub_exclude_files = ['search.html']
# special-members flag)
autoclass_content = 'class'
# Document private members and special members (e.g. __init__)
autodoc_default_flags = ['private-members', 'special-members']
# Documentation for python modules is in the same order
# as the source code.
autodoc_member_order = 'bysource'

def autodoc_skip_member(app, what, name, obj, skip, options):
    # Do not document the _sync_* properties
    # that are added by the props package to
    # all SyncableHasProperties classes.
    if what == 'class':
        attName = name.split('.')[-1]
        return skip or attName.startswith('_sync_')
    return skip or False

def setup(app):
    app.connect('autodoc-skip-member', autodoc_skip_member)
autodoc_default_options = {
    'special-members' : True,
    'private-members' : True,
    'undoc-members'   : True,
    'member-order'    : 'bysource',
}
graphviz_output_format = 'svg'
@@ -9,10 +9,10 @@ Development model
-----------------
- The master branch should always be stable and ready to release. All
development occurs on the master branch.
- The main branch should always be stable and ready to release. All
development occurs on the main branch.
- All changes to the master branch occur via merge requests. Individual
- All changes to the main branch occur via merge requests. Individual
developers are free to choose their own development workflow in their own
repositories.
@@ -24,6 +24,27 @@ Development model
- Coding conventions are adhered to (unless there is good reason not to).
Commit messages
---------------
To aid readability, all commit messages should be prefixed with one or more of
the following labels (this convention has been inherited from `nibabel
<https://github.com/nipy/nibabel>`_):
* *BF* : bug fix
* *RF* : refactoring
* *ENH*: enhancement/new feature
* *BW* : addresses backward-compatibility
* *OPT* : optimization
* *BK* : breaks something and/or tests fail
* *PL* : making pylint happier
* *DOC* : for all kinds of documentation related commits
* *TEST*: for adding or changing tests
* *MNT* : for administrative/maintenance changes
* *CI* : for continuous-integration changes
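For example, a bug fix accompanied by a regression test might be committed
with a (hypothetical) message along these lines::

git commit -m "BF,TEST: fix <short description of bug>; add regression test"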
Version number
--------------
@@ -45,10 +66,10 @@ numbers::
backwards-incompatible changes.
The version number in the ``master`` branch should be of the form
``major.minor.patch.dev``, to indicate that any releases made from this branch
are development releases (although development releases are not part of the
release model).
The version number in the ``main`` branch should be of the form
``major.minor.patch.dev0``, to indicate that any releases made from this
branch are development releases (although development releases are not part of
the release model).
Releases
@@ -62,7 +83,7 @@ name for minor release ``1.0`` would be ``v1.0``.
Patches and bugfixes may be added to these release branches as ``patch``
releases. These changes should be made on the master branch like any other
releases. These changes should be made on the main branch like any other
change (i.e. via merge requests), and then cherry-picked onto the relevant
release branch(es).
@@ -73,6 +94,44 @@ example, the first release off the ``v1.0`` branch would be tagged with
``1.0.1``, ``1.0.2``, etc.
Major/minor releases
^^^^^^^^^^^^^^^^^^^^
Follow this process for major and minor releases. Steps 1 and 2 should be
performed via a merge request onto the main branch, and step 4 via a merge
request onto the relevant minor branch.
1. Update the changelog on the main branch to include the new version number
and release date.
2. On the main branch, update the version number in ``fsl/version.py`` to
a development version of **the next** minor release number. For example,
if you are about to release version ``1.3.0``, the version in the main
branch should be ``1.4.0.dev0``.
3. Create the new minor release branch off the main branch.
4. Update the version number on the release branch. If CI tests fail on the
release branch, postpone the release until they are fixed.
5. Tag the new release on the minor release branch.
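A rough command-line sketch of steps 3-5 for a hypothetical ``1.3.0`` release
(the commands are illustrative, not a prescribed procedure)::

git checkout main
git checkout -b v1.3   # step 3: create the minor release branch
# step 4: update the version in fsl/version.py to 1.3.0 on the v1.3 branch
git tag 1.3.0          # step 5: tag the release on the release branch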
Bugfix/patch releases
^^^^^^^^^^^^^^^^^^^^^
Follow this process for patch releases. Step 1 should be performed via
a merge request onto the main branch, and step 2 via a merge request onto
the relevant minor branch.
1. Add the fix to the main branch, along with an updated changelog including
the version number and date for the bugfix release.
2. Cherry-pick the relevant commit(s) from the main branch onto the minor
release branch, and update the version number on the minor release branch.
If CI tests fail on the release branch, go back to step 1.
3. Tag the new release on the minor release branch.
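Again as a rough, illustrative sketch, step 2 for a hypothetical ``1.3.1``
patch release might look like::

git checkout v1.3
git cherry-pick <commit-id-of-the-fix-on-main>
# update the version in fsl/version.py to 1.3.1
git tag 1.3.1          # step 3: tag the release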
Testing
-------
@@ -81,7 +140,7 @@ Unit and integration tests are currently run with ``py.test`` and
``coverage``.
- Aim for 100% code coverage.
- Tests must pass on python 2.7, 3.4, 3.5, and 3.6
- Tests must pass on python 3.5, 3.6, and 3.7.
Coding conventions
@@ -123,6 +182,7 @@ for a list of error codes):
- E302: expected 2 blank lines, found 0
- E303: too many blank lines (3)
- E701: multiple statements on one line (colon)
- W504: line break after binary operator
The ``pylint`` tool can be *very* opinionated about how you write your code,
@@ -141,7 +201,7 @@ refactoring and convention messages, and a few select warnings (type ``pylint
To check code with ``flake8`` and ``pylint``, I use the following commands::
flake8 --ignore=E127,E201,E203,E221,E222,E241,E271,E272,E301,E302,E303,E701 fsl
flake8 --ignore=E127,E201,E203,E221,E222,E241,E271,E272,E301,E302,E303,E701,W504 fsl
pylint --extension-pkg-whitelist=numpy,wx \
--generated-members=np.int8,np.uint8,np.int16,np.uint16,np.int32,np.uint32,np.int64,np.uint64,np.float32,np.float64,np.float128,wx.PyDeadObjectError \
--disable=R,C,W0511,W0703,W1202 fsl