This release fixes an issue with merging broken log lines.
Handling LD_LIBRARY_PATH with Conda environments
In a Conda environment, it is sometimes necessary to set LD_LIBRARY_PATH so that the Conda environment's libraries come first, before anything else. Prior to this release, this caused issues with the escape hatch.
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.13...2.9.14
The recent annotations feature introduced an issue where the project, flow_name, or user annotations were not being populated for Kubernetes. This release reverts the change.
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.12...2.9.13
The annotations feature introduced in this release has an issue where the project, flow_name, or user annotations are not populated for Kubernetes. It has been reverted in the next release.
This release enables users to add custom annotations to the Kubernetes resources that flows create. The annotations can be configured in much the same way as custom labels, either globally through an environment variable:
export METAFLOW_KUBERNETES_ANNOTATIONS="first=A,second=B"
or per step with the @kubernetes decorator:
@kubernetes(annotations={"first": "A", "second": "B"})
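As a quick illustration, a minimal flow sketch (flow and step names are hypothetical) where the annotations end up on the pod that runs the step:

from metaflow import FlowSpec, kubernetes, step

class AnnotatedFlow(FlowSpec):

    # The annotations below should appear on the Kubernetes pod
    # that runs this step.
    @kubernetes(annotations={"first": "A", "second": "B"})
    @step
    def start(self):
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    AnnotatedFlow()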
executable by @romain-intel in https://github.com/Netflix/metaflow/pull/1454
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.11...2.9.12
This release reverts a validation fix introduced in 2.9.10, which prevented Metaflow tasks from executing on AWS Batch.
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.10...2.9.11
With this release, Metaflow users can get events on PagerDuty when their workflows succeed or fail on Argo Workflows. Setting up the notifications is similar to the existing Slack notifications support:
python flow.py argo-workflows create --notify-on-error --notify-on-success --notify-pager-duty-integration-key <pager-duty-integration-key>
or by setting the environment variable:
METAFLOW_ARGO_WORKFLOWS_CREATE_NOTIFY_PAGER_DUTY_INTEGRATION_KEY=<pager-duty-integration-key>
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.9...2.9.10
Fix @conda bootstrapping with some S3 providers
This release fixes a bug in the @conda bootstrapping process. There was an issue with the ServerSideEncryption support that affected some S3 operations when using S3 providers that do not implement the encryption headers (for example, MinIO).
The affected operations were all those that handle multiple files at once, namely get_many, get_all, get_recursive, put_many, and info_many, which are used as part of bootstrapping a @conda environment when executing remotely.
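These are operations on Metaflow's S3 client. As a quick illustration, a minimal sketch of get_many with a hypothetical bucket and keys:

from metaflow import S3

# Hypothetical bucket and keys, purely for illustration.
with S3(s3root="s3://my-bucket/packages/") as s3:
    # Download several objects in parallel; each result exposes
    # the key and a local temporary path for the fetched file.
    for obj in s3.get_many(["pkg-a.tar.bz2", "pkg-b.tar.bz2"]):
        print(obj.key, obj.path)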
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.8...2.9.9
This release fixes an issue with mapping values with spaces from the Argo events payload to flow parameters.
@secrets by @oavdeev in https://github.com/Netflix/metaflow/pull/1474
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.7...2.9.8
This release includes new commands for managing workflows on Argo Workflows.
When needed, commands can be authorized by supplying a production token with --authorize.
argo-workflows delete
A deployed workflow can be deleted through the CLI with:
python flow.py argo-workflows delete
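If you do not hold the deployment's production token locally, you can supply it explicitly (the token shown is hypothetical):
python flow.py argo-workflows delete --authorize MY_PRODUCTION_TOKEN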
argo-workflows terminate
A run can be terminated mid-execution through the CLI with:
python flow.py argo-workflows terminate RUN_ID
argo-workflows suspend/unsuspend
A run can be suspended temporarily with:
python flow.py argo-workflows suspend RUN_ID
Note that a suspended flow will show up as failed in the Metaflow UI after a period, because suspending also pauses the heartbeat process. Unsuspending resumes the flow, and its status will show as running again. This can be done with:
python flow.py argo-workflows unsuspend RUN_ID
Previously, the status of tasks running on Kubernetes was determined from the pod status, which can take a while to update after the last container finishes. This release changes the status checks to use container statuses directly instead.
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.6...2.9.7
This release introduces the command step-functions delete for deleting state machines through the CLI:
python flow.py step-functions delete
To delete a deployment by its full name, comment out the @project decorator from the flow file, as using --name together with projects is not allowed:
python project_flow.py step-functions --name project_a.user.saikonen.ProjectFlow delete
For flows deployed with the @project decorator, use the corresponding project options:
python project_flow.py --production step-functions delete
# or
python project_flow.py --branch custom step-functions delete
Add --authorize PRODUCTION_TOKEN to the command if you do not have the correct production token locally.
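For example, combining the two (the token shown is hypothetical):
python project_flow.py --production step-functions delete --authorize MY_PRODUCTION_TOKEN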
This release fixes an issue with the S3 server-side encryption support, where some S3-compliant providers do not respond with the expected encryption method in the payload. This bug specifically affected regular operation when using MinIO.
Fix --with environment in Airflow
Fixes a bug with the Airflow support for environment variables, where values set in the environment decorator could get overwritten.
--with environment in Airflow by @valayDave in https://github.com/Netflix/metaflow/pull/1459
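To illustrate the behavior the fix protects, here is a minimal sketch (flow and variable names are hypothetical) where a value set via the environment decorator should remain visible inside the step, including when the flow runs on Airflow:

from metaflow import FlowSpec, environment, step

class EnvFlow(FlowSpec):

    # The decorator-provided value should not be overwritten by
    # other environment variables injected at scheduling time.
    @environment(vars={"GREETING": "hello"})
    @step
    def start(self):
        import os
        print(os.environ["GREETING"])  # expected to print "hello"
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == "__main__":
    EnvFlow()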
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.5...2.9.6
It is now possible to choose which server-side encryption method to use for S3 uploads by setting the environment variable METAFLOW_S3_SERVER_SIDE_ENCRYPTION to an appropriate value, for example aws:kms or AES256.
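A minimal sketch of how this could be used (the bucket name and key are hypothetical; the variable can equally be exported in the shell, as with the other METAFLOW_* settings above):

import os
# Must be set before Metaflow reads its configuration on import.
os.environ["METAFLOW_S3_SERVER_SIDE_ENCRYPTION"] = "aws:kms"

from metaflow import S3

with S3(s3root="s3://my-bucket/data/") as s3:
    # This upload should request aws:kms server-side encryption.
    s3.put("greeting", "hello world")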
This release fixes an issue where using parameters on Argo Workflows caused the values to be unnecessarily quoted.
In case you need any assistance or have feedback for us, ping us at chat.metaflow.org or open a GitHub issue.
Full Changelog: https://github.com/Netflix/metaflow/compare/2.9.4...2.9.5