- Removes telemetry
- Replaces `@requires` import to use `ploomber_core` instead of `ploomber`
- Adds `--skip-docker` argument in `soopervisor export` to skip docker build (#103)
- Optimizes the generated AWS Batch `Dockerfile` so dependencies are only installed when the requirements are modified
- Allows using multiple `requirements.txt` files, generating one Docker image for each one (#86)
- Allows execution of single tasks via `soopervisor export --task {task-name}` (see the CLI sketch after this list)
- Allows bundling custom libraries via `lib/` in the Docker image (#87)
- Fixes DAG loading when passing the `--lazy` argument in pipelines with a `File` client (#105)
- Displays a warning if source code contains a file over 10MB (#81)
- Drops support for Python 3.6
- Drops support for Python 3.6
- Adds `task_resources` to AWS Batch (see the configuration sketch after this list)
- AWS Batch exporter now generates a unique job definition name on each submission
- Adds `--lazy` option to `soopervisor export`
- Allows passing custom arguments to `docker build` via the `DOCKER_ARGS` environment variable
- Fixes an error that caused `soopervisor export` to fail due to a non-serializable object
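
For reference, a rough sketch of how the export options listed above combine on the command line. The target name (`training`) and task name (`fit-model`) are placeholders, and the assumption that `DOCKER_ARGS` is forwarded verbatim to `docker build` is mine, not taken from these notes:

```bash
# Export a single task instead of the whole pipeline (names are placeholders)
soopervisor export training --task fit-model

# Skip the docker build step and load the DAG lazily
soopervisor export training --skip-docker --lazy

# Pass extra flags to `docker build` (assumes DOCKER_ARGS is forwarded verbatim)
DOCKER_ARGS="--build-arg MY_TOKEN=abc123" soopervisor export training
```

And a minimal sketch of the `task_resources` entry for AWS Batch in `soopervisor.yaml`; the `backend` key and the nested key names (`vcpus`, `memory`) are assumptions, so check the documentation for the exact schema:

```yaml
# soopervisor.yaml (sketch; nested key names are assumptions)
training:
  backend: aws-batch        # assumed exporter name
  task_resources:
    fit-model:              # placeholder task name
      vcpus: 4
      memory: 8192
```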
Important: We detected a problem with this release when running `soopervisor export`. Please use 0.7.2
- Fixes error that caused docker image building to fail if the repository didn't have a version
Important: We detected a problem with this release when running `soopervisor export`. Please use 0.7.2
- Improves CLI documentation
- Adds `--git-ignore` to documentation
- Various documentation improvements
- Changes short version of `--until-build` to `-u`
- Adds "Task Communication" user guide
- Displays warnings if passing CLI options that do not apply to SLURM
- SLURM exporter raises an error if `sbatch` isn't installed
- Shows a warning if the source distribution is >5MB (#53)
- `soopervisor add` adds a default `exclude` value by extracting product paths from `pipeline.yaml`
- Copies user settings when generating the Docker image
- Experimental Kubeflow integration
- Airflow integration allows choosing between `BashOperator`, `KubernetesPodOperator`, and `DockerOperator` using `--preset`
- Many modules and test cases re-written for better code quality and maintainability
- Fixes output message after exporting to Argo
- Adds flag to `source.copy` to ignore git
- `soopervisor export` raises an error if `git` isn't tracking any files
- Adds `--git-ignore` to `soopervisor export`
- Adds support for SLURM
- `AirflowExporter` uses `KubernetesPodOperator` by default (#33)
- Simplified Airflow and Argo/k8s tutorials
- Clearer error message when there is a pending `git commit`
- Clearer error message when the user does not change docker repository default value (#29)
- Argo spec sets `imagePullPolicy` to `Never` if repository is `null`
- Documents Kubernetes/Argo configuration schema
- General documentation improvements
- Better error message when lock files do not exist
- Documentation note on using shared disks (must pass `--skip-tests`)
- Adds `build` as a dependency
- Checks for lock files before creating `soopervisor.yaml`
- Fixes `docker.build` issue that caused a `pipeline.yaml` to be used even when the environment required one with another name
- Fixes error that caused AWS Batch args to be passed as a single `str`
- `load_tasks` tries to initialize a spec matching the target env name (e.g., a `training` target looks for `pipeline.train.yaml`)
- Compatibility fixes with Ploomber (requires >=0.12.1)
- Fixes an error that caused the Dockerfile to include a line to install the project as a package even if there was no `setup.py` file
- Adds `exclude` to ignore files/directories from the Docker image (see the configuration sketch after this list)
- Adds user guide section to documentation
- Adds `--mode` option to `soopervisor export`
- Batch export stops if there are no tasks to execute
- Adds `--skip-tests` option to skip tests before submitting
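
As a rough illustration of the configuration entries mentioned in this list (`exclude`, and the `repository: null` behavior noted for Argo), assuming a target named `training`; the surrounding keys and layout are a sketch, not the authoritative schema:

```yaml
# soopervisor.yaml (sketch; key placement is an assumption)
training:
  backend: argo-workflows   # assumed exporter name
  repository: null          # per the note above, the Argo spec then sets imagePullPolicy to Never
  exclude:
    - output/               # placeholder paths to leave out of the Docker image
    - data/raw/
```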
Important: Soopervisor was re-written. Some modules were deprecated and the API changed. This new architecture allows us to greatly simplify user experience and easily incorporate more platforms in the future.
- New CLI
- New documentation
- New (simplified) `soopervisor.yaml` configuration schema
- Support for non-packaged projects (i.e., the ones without a `setup.py` file)
- Support for AWS Batch
- Support for AWS Lambda
- Argo Workflows integration builds a docker image
- Airflow integration produces a DAG with `DockerOperator` tasks
- Deprecates `build` module
- Deprecates `script` module
- Deprecates Box integration
- Exports projects compatible with `ploomber.OnlineModel` to AWS Lambda
- Allows initialization from an empty `soopervisor.yaml`
- Support for passing extra CLI args to `ploomber task` (via `args` in `soopervisor.yaml`) when running in Argo and Airflow
- Adds `--root` arg to `soopervisor export-airflow` to select an alternative project's root (see the sketch at the end of this list)
- Determines default entry point using Ploomber's API to allow automated discovery of `pipeline.yaml` in package layouts (e.g., `src/package/pipeline.yaml`)
- Changes to the Airflow generated DAG
- Fixes a bug when initializing configuration from projects whose root is not the current directory
- `env.airflow.yaml` is now optional when exporting to Airflow (#17)
- Validates exported Argo YAML spec
- Output Argo YAML spec displays the script in literal mode to make it readable
- Fixes extra whitespace in the generated script
- Refactors `ArgoMountedVolume` to provide flexibility for different types of k8s volumes
- Adds section in the documentation to run examples using minikube
- Adds a few `echo` statements to the generated script to provide better status feedback
- Adds ability to skip dag loading during project validation
- Box uploader imported only if needed
- Exposes option to skip dag loading from the CLI
- Adds Airflow DAG export
- Adds Argo/Kubernetes DAG export
- Support for uploading products to Box
- Adds `DockerExecutor`
- Products are saved in a folder with the name of the current commit by default
- Conda environments are created locally in a `.soopervisor/` folder
- Conda environments are cached by default
- Ability to customize arguments to `ploomber build`
- First release
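
For the pre-rewrite CLI referenced in the older entries above, the `--root` flag looked roughly like this; the project path is a placeholder:

```bash
# Export to Airflow from a project whose root is not the current directory
# (pre-rewrite CLI; later replaced by `soopervisor export`)
soopervisor export-airflow --root path/to/project
```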