Commit 49370d4a authored by Paul McCarthy 🚵

RF: Make logic clearer? I think?

parent 578cbe3f
@@ -4,34 +4,39 @@
###############################################################################
##################################
# JOBS RUN ON PROJECT REPOSITORIES
##################################
################################################
# JOBS RUN ON PROJECT REPOSITORIES
#
# Most work is performed by jobs running on the
# recipe repository. The only jobs which are run
# on the project repository are:
#
# - Trigger a pipeline on the recipe repository
# - Open an MR on the recipe repository
#
################################################
# Trigger building of a new conda package from the
# current state of the project repository. This job
# is run when commits are added to any branch of
# the project repository.
trigger-staging-conda-package:
# Trigger a test conda package build when commits
# are added to any branch of the project repository.
# This will induce a call to build-conda-package
# on the recipe repository.
trigger-package-build:
stage: fsl-ci-build
image: continuumio/miniconda3
tags:
- docker
# This job is only run on branches of a project repository.
# This job is only run on branches
# of a project repository.
rules:
- if: '$CI_PROJECT_PATH != "fsl/fsl-ci-rules" &&
$FSLCONDA_RECIPE == null &&
$CI_COMMIT_TAG == null &&
$CI_PIPELINE_SOURCE != "merge_request_event"'
# The trigger_staging_conda_package.py script passes some
# variables through to the triggered build-conda-package
# job, so it knows that this is a staging build. See the
# script, and the job below, for more details.
script:
- python3 /ci_rules/scripts/trigger_staging_conda_package.py
- python3 /ci_rules/scripts/trigger_package_build.py
# Open an MR on a recipe repository to update the
@@ -54,46 +59,56 @@ update-conda-recipe:
- python3 /ci_rules/scripts/update_conda_recipe.py
#################################
# JOBS RUN ON RECIPE REPOSITORIES
#################################
#####################################################
# JOBS RUN ON RECIPE REPOSITORIES
#
# Up to three jobs may be run on a recipe repository:
#
# - Building a new conda package either for testing,
# or to be deployed to the staging or production
# channel.
# - Checking downstream compatibility - a test build
# is triggered on the repositories of recipes which
# list this recipe as a dependency.
# - Deployment of the built package to the staging or
# production channel.
#
######################################################
# Build a conda package. This job is run on any branch of a
# recipe repository, and may be triggered by the
# trigger-staging-conda-package job when commits are added
# to the project repository.
# Build a conda package. This job is run on any branch of a recipe repository,
# or may be invoked via the trigger-package-build or
# trigger-check-downstream-packages jobs.
#
# By default, the package will be built from the project gitref specified in
# the recipe meta.yaml, dependencies will be downloaded from the production
# channel, and the built package will be deployed to production.
#
# By default, the package will be built from the project gitref
# specified in the recipe meta.yaml, dependencies will be
# downloaded from the production channel, and the built package
# will be deployed to production.
# This behaviour is controlled by the following variables, which have default
# values below, but may be overridden by API-triggered pipelines
# (e.g. trigger-package-build and trigger-check-downstream-packages; see the
# sketch after this job):
#
# This behaviour is controlled by the following variables, which
# have default values below, but may be overridden by API-triggered
# pipelines (e.g. trigger-staging-conda-recipe, and
# check-downstream-packages):
# - PROJECT_REF: If non-empty, the package is built from this git ref
# on the project repository. Otherwise the package is
# built from the git ref specified in the recipe
# meta.yaml file.
#
# - STAGING: If non-empty, the package is built using
# dependencies from the staging channel. Otherwise
# the package is built using dependencies from the
# production channel.
# - PREBUILD: List of packages to pre-build and install locally,
# rather than installing from the conda channel.
#
# - DEPLOY: Used by the deploy-conda-package job.
# - STAGING: If true, the package is built using dependencies from
# the staging channel. Otherwise the package is built
# using dependencies from the production channel.
#
# - PROJECT_REF: If non-empty, the package is built from this git
# ref on the project repository. Otherwise the
# package is built from the git ref specified in the
# recipe meta.yaml file.
# - CHECK_DOWNSTREAM: Toggle the downstream check - see the
# trigger-check-downstream-packages job.
#
# - PREBUILD: Passed to this job via the
# trigger-check-downstream-packages job, to trigger a
# build of the upstream package within the downstream build
# jobs.
# - DEPLOY: Toggle deployment - see the deploy-conda-package
# job.
#
# With the exception of PREBUILD, these variables are forwarded to the
# deploy-conda-package job and check-downstream-packages jobs, which are
# both potentially run after this job.
# deploy-conda-package job and trigger-check-downstream-packages jobs, which
# are both potentially invoked after this job.
build-conda-package:
stage: fsl-ci-build
image: continuumio/miniconda3
@@ -101,28 +116,35 @@ build-conda-package:
- docker
# This job is only run on branches
# of a recipe repository.
# of a recipe repository, either
# by commits being added to them,
# or via an API call.
rules:
- if: '$FSLCONDA_RECIPE != null &&
$CI_COMMIT_TAG == null &&
$CI_PIPELINE_SOURCE != "merge_request_event"'
# Default behaviour - build from production using
# gitref in meta.yaml, and deploy to production.
# Default behaviour - build from production
# using gitref in meta.yaml, perform a
# downstream compatibility check, and deploy
# to production. These variables are
# overridden by API-triggered invocations.
variables:
STAGING: ""
DEPLOY: "true"
PROJECT_REF: ""
PREBUILD: ""
PROJECT_REF: ""
PREBUILD: ""
STAGING: ""
CHECK_DOWNSTREAM: "true"
DEPLOY: "true"
script:
- python3 /ci_rules/scripts/build_conda_package.py
# We propagate the variables on
# to jobs in the deploy stage.
- echo "STAGING=$STAGING" > build.env
- echo "DEPLOY=$DEPLOY" >> build.env
- echo "PROJECT_REF=$PROJECT_REF" >> build.env
- echo "STAGING=$STAGING" > build.env
- echo "CHECK_DOWNSTREAM=$CHECK_DOWNSTREAM" >> build.env
- echo "DEPLOY=$DEPLOY" >> build.env
- echo "PROJECT_REF=$PROJECT_REF" >> build.env
artifacts:
expire_in: 1 hr
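For illustration, here is a minimal sketch of how an API-triggered pipeline might override the variables documented above, using GitLab's pipeline trigger endpoint (POST /projects/:id/trigger/pipeline). The recipe project path below is hypothetical, and FSL_CI_API_TOKEN is assumed to hold a valid trigger token:

import os
import requests

# Hypothetical example: trigger a staging (non-deploying) test build
# on a recipe repository, overriding the default build variables.
server = os.environ['CI_SERVER_URL']
token  = os.environ['FSL_CI_API_TOKEN']
resp   = requests.post(
    # project path is URL-encoded, e.g. "fsl/conda/some-recipe"
    f'{server}/api/v4/projects/fsl%2Fconda%2Fsome-recipe/trigger/pipeline',
    data={'token'                       : token,
          'ref'                         : 'master',
          'variables[PROJECT_REF]'      : 'master',
          'variables[STAGING]'          : 'true',
          'variables[CHECK_DOWNSTREAM]' : '',
          'variables[DEPLOY]'           : ''})
resp.raise_for_status()
print(resp.json()['web_url'])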
@@ -145,16 +167,11 @@ trigger-check-downstream-packages:
# This job is run on the master
# branch of a recipe repository,
# and only on pipelines which may
# result in a deployment to the
# staging or production channels.
# when CHECK_DOWNSTREAM is true
rules:
- if: ' $FSLCONDA_RECIPE != null &&
$CI_COMMIT_TAG == null &&
$CI_COMMIT_REF == "master" &&
$TRIGGERED == null &&
($PROJECT_REF == "" ||
$PROJECT_REF == "master")'
- if: '$FSLCONDA_RECIPE != null &&
$CI_COMMIT_REF_NAME == "master" &&
$CHECK_DOWNSTREAM == "true"'
dependencies:
- build-conda-package
@@ -184,7 +201,7 @@ deploy-conda-package:
rules:
- if: '$FSLCONDA_RECIPE != null &&
$CI_COMMIT_REF_NAME == "master" &&
$DEPLOY != ""'
$DEPLOY == "true"'
when: manual
dependencies:
......
@@ -8,7 +8,6 @@
import os
import sys
from fsl_ci_utils import sprun, tempdir, fprint
@@ -67,15 +66,17 @@ def main():
output_dir = './conda_build'
project_ref = os.environ['PROJECT_REF']
recipe_name = os.environ['CI_PROJECT_NAME']
recipe_url = os.environ['CI_PROJECT_URL']
staging = os.environ['STAGING']
prebuild = os.environ['PREBUILD']
# assumption: PREBUILD is a space-separated list of ref:url entries
prebuild = [p.split(':', 1) for p in prebuild.split()]
local_channel = './local_channel'
if staging == '':
channel_url = os.environ['FSLCONDA_PRODUCTION_CHANNEL_URL']
else:
if staging == 'true':
channel_url = os.environ['FSLCONDA_STAGING_CHANNEL_URL']
else:
channel_url = os.environ['FSLCONDA_PRODUCTION_CHANNEL_URL']
os.mkdir(local_channel)
for ref, url in prebuild:
@@ -84,8 +85,8 @@ def main():
sprun('conda install -y -c conda-forge conda-build')
fprint('************************************')
fprint(f'Building conda recipe for: {env["CI_PROJECT_NAME"]}')
fprint(f'Recipe URL: {env["CI_PROJECT_URL"]}')
fprint(f'Building conda recipe for: {recipe_name}')
fprint(f'Recipe URL: {recipe_url}')
fprint(f'Pre-built packages: {prebuild}')
fprint( 'Revision (empty means to build release')
fprint(f' specified in meta.yaml): {project_ref}')
......
@@ -8,18 +8,18 @@
pkgdir=./conda_build
if [[ -z $STAGING ]]; then
if [ "$STAGING" == "true" ]; then
channel_dir=$FSLCONDA_STAGING_CHANNEL_DIRECTORY
else
channel_dir=$FSLCONDA_PRODUCTION_CHANNEL_DIRECTORY
fi
echo "************************************"
echo "Deploying built conda package for: $CI_PROJECT_NAME"
echo "Recipe URL: $CI_PROJECT_URL"
echo "Revision (empty means last release): $PROJECT_REF"
echo "FSL conda channel: $channel_dir"
echo "Deploying built conda package for: $CI_PROJECT_NAME"
echo "Recipe URL: $CI_PROJECT_URL"
echo "Revision (empty means to build release"
echo " specified in meta.yaml): $PROJECT_REF"
echo "FSL conda channel: $channel_dir"
echo "************************************"
# copy all package files from the conda-build
@@ -28,8 +28,8 @@ for platform in noarch linux-64 osx-64 win-32 win64; do
if ls $pkgdir/$platform/*.tar.bz2 &> /dev/null; then
mkdir -p $channeldir/$platform
cp $pkgdir/$platform/*.tar.bz2 $channeldir/$platform
mkdir -p $channel_dir/$platform
cp $pkgdir/$platform/*.tar.bz2 $channel_dir/$platform
fi
done
@@ -45,4 +45,4 @@ find . -name "*.json" -delete
find . -name "*.json.bz2" -delete
# re-generate the channel index
conda index $channeldir
conda index $channel_dir
#!/usr/bin/env python
#
# This script is executed when new commits are merged into any branch of a FSL
# conda project or recipe repository (except for the master branch of the
# latter). Pipelines are triggered to re-build all "downstream" packages
# (those which are dependent on the updated package).
#
# The URL of the channel which houses FSL conda packages is passed as the
# first argument to this script. All other information is read from
# environment variables.
# This script is executed on an FSL recipe repository. All "downstream"
# packages (those which are dependent on this package) are identified, and a
# CI pipeline is triggered on each of them to perform a test build.
#
# Author: Paul McCarthy <pauldmccarthy@gmail.com>
#
# read channel repodata, build dependency graph, etc etc
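The dependency-graph machinery itself is not shown in this excerpt, but the idea can be illustrated directly from a channel's repodata. A minimal sketch (assuming the standard conda repodata.json layout; this helper is illustrative, not the fsl_ci_utils implementation):

import json
import urllib.request

def direct_dependants(channel_url, upstream_name):
    # Scan each platform's repodata.json, collecting packages which
    # list upstream_name as a dependency. conda "depends" entries
    # look like "name [version [build]]", so the dependency name is
    # the first whitespace-separated token.
    dependants = set()
    for platform in ('noarch', 'linux-64', 'osx-64'):
        url = f'{channel_url}/{platform}/repodata.json'
        try:
            with urllib.request.urlopen(url) as f:
                repodata = json.load(f)
        except Exception:
            continue
        for pkg in repodata.get('packages', {}).values():
            deps = [d.split()[0] for d in pkg.get('depends', [])]
            if upstream_name in deps:
                dependants.add(pkg['name'])
    return dependants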
import os
import sys
from fsl_ci_utils.conda import (load_meta_yaml,
read_channel_repodata,
@@ -35,25 +28,26 @@ def trigger_pipeline_on_packages(upstream, packages, server, token, staging):
# build-conda-package job on the downstream
# recipe repository:
#
# - STAGING: whether to build from staging or
# production channels
# - DEPLOY: Built package is *not* for deployment
# - PREBUILD: Prebuild this recipe locally, rather
# than installing it from the channel.
# - PROJECT_REF: Build recipe from this project
# repository ref (empty means ref
# specified in recipe meta.yaml)
# - TRIGGERED: prevent recursive calls to
# trigger-check-downstream-packages all
# the way down the dependency tree.
if staging: ref = 'master'
else: ref = ''
variables = {
'STAGING' : staging,
'DEPLOY' : '',
'PROJECT_REF' : ref,
'PREBUILD' : f'{ref}:{upstream}',
'TRIGGERED' : 'true'
# - PROJECT_REF: Build recipe from the project repository
# master branch, or from the ref specified
# in the recipe meta.yaml
# - PREBUILD: Prebuild this recipe locally, rather
# than installing it from the channel.
# - STAGING: whether to build from staging or
# production channels
# - DEPLOY: Built package is *not* for deployment
# - CHECK_DOWNSTREAM: Do *not* perform downstream check -
# we only check direct dependants, and
# do not recursively run
# trigger-check-downstream-packages all
# the way down the dependency tree.
ref = 'master' if (staging == 'true') else ''
variables = {
'PROJECT_REF' : ref,
'PREBUILD' : f'{ref}:{upstream}',
'STAGING' : staging,
'DEPLOY' : '',
'CHECK_DOWNSTREAM' : '',
}
for pkg in packages:
@@ -71,18 +65,16 @@ def trigger_pipeline_on_packages(upstream, packages, server, token, staging):
def main():
"""
"""
staging = os.environ['STAGING']
gitlab_url = os.environ['CI_SERVER_URL']
token = os.environ['FSL_CI_API_TOKEN']
upstream = os.environ['CI_PROJECT_URL']
if staging == '':
channel_url = os.environ['FSLCONDA_PRODUCTION_CHANNEL_URL']
else:
if staging == 'true':
channel_url = os.environ['FSLCONDA_STAGING_CHANNEL_URL']
else:
channel_url = os.environ['FSLCONDA_PRODUCTION_CHANNEL_URL']
# load package information from the
# channel, and build a dependency graph
......
@@ -15,27 +15,35 @@ from fsl_ci_utils.conda_api import (trigger_pipeline,
wait_on_pipeline)
GITLAB_URL = os.environ['CI_SERVER_URL']
TOKEN = os.environ['FSL_CI_API_TOKEN']
PROJECT_PATH = os.environ['CI_PROJECT_PATH']
PROJECT_REF = os.environ['CI_COMMIT_REF_NAME']
def main():
recipe = get_recipe_url(GITLAB_URL, PROJECT_PATH)
server = os.environ['CI_SERVER_URL']
token = os.environ['FSL_CI_API_TOKEN']
project_path = os.environ['CI_PROJECT_PATH']
project_ref = os.environ['CI_COMMIT_REF_NAME']
recipe = get_recipe_url(server, project_path)['path']
# These variables are passed to the
# build-conda-package job on the
# recipe repository. They control
# how the package is built, whether
# a downstream compatibility check
# is performed, and whether the
# built package is deployed to a
# conda channel.
variables = {
'PROJECT_REF' : PROJECT_REF,
'STAGING' : 'true',
'DEPLOY' : 'true' if (PROJECT_REF == 'master') else ''
'PROJECT_REF' : project_ref,
'STAGING' : 'true',
'CHECK_DOWNSTREAM' : 'true' if (project_ref == 'master') else '',
'DEPLOY' : 'true' if (project_ref == 'master') else ''
}
pipeline = trigger_pipeline(
recipe['path'], 'master', GITLAB_URL, TOKEN, variables)
pid = pipeline['id']
url = pipeline['web_url']
status = wait_on_pipeline(recipe['path'], pid, GITLAB_URL, TOKEN)
# start the pipeline, wait for it to finish
pipeline = trigger_pipeline(recipe, 'master', server, token, variables)
pid = pipeline['id']
url = pipeline['web_url']
status = wait_on_pipeline(recipe, pid, server, token)
if status in ('failed', 'cancelled', 'skipped'):
print(f'Pipeline {status}! Check {url} for details.')
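wait_on_pipeline is imported from fsl_ci_utils.conda_api and is not shown in this diff; presumably it polls the GitLab pipelines API until the pipeline reaches a terminal state. A rough sketch of that idea (an assumption about the helper, not its actual implementation):

import time
import urllib.parse
import requests

def wait_on_pipeline(project_path, pid, server, token, interval=30):
    # Poll GET /projects/:id/pipelines/:pipeline_id until the
    # pipeline leaves the created/pending/running states.
    project = urllib.parse.quote(project_path, safe='')
    url     = f'{server}/api/v4/projects/{project}/pipelines/{pid}'
    headers = {'PRIVATE-TOKEN': token}
    while True:
        status = requests.get(url, headers=headers).json()['status']
        if status not in ('created', 'pending', 'running'):
            return status
        time.sleep(interval)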
......