Trains Versions

ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling & Serving in one MLOps/LLMOps solution

v1.7.2

1 year ago

New Features and Improvements

  • Support running Jupyter Notebook inside a git repository (the repository will be referenced without uncommitted changes, and the Jupyter Notebook will be stored as plain code in the uncommitted changes)
  • Add Jupyter Notebook fail warning
  • Allow pipeline steps to return string paths without them being treated as a folder artifact and zipped (#780) (see the sketch after this list)
  • Remove future from Python 3 requirements
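
As a minimal illustration of the string-path change above (#780): a pipeline step can now return a plain path string without it being zipped as a folder artifact. The sketch below assumes the standard PipelineDecorator flow; the step, file, and project names are illustrative.

    from clearml import PipelineDecorator

    @PipelineDecorator.component(return_values=["result_path"])
    def produce_path():
        # Hypothetical step that writes a file and returns its path as a plain string;
        # per this release, the string is no longer treated as a folder artifact and zipped
        out_path = "/tmp/results.txt"
        with open(out_path, "w") as f:
            f.write("done")
        return out_path

    @PipelineDecorator.pipeline(name="path-return-example", project="examples", version="1.0")
    def run_pipeline():
        print(produce_path())

    if __name__ == "__main__":
        PipelineDecorator.run_locally()
        run_pipeline()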

Bug Fixes

  • Fix exception raised when using ThreadPool (#790)
  • Fix Pyplot/Matplotlib binding reports incorrect line labels and colors (#791)
  • Pipelines
    • Fix crash when running cloned pipeline that invokes a step twice (#770, related to #769, thanks @tonyd!)
    • Fix pipeline argument becomes None if default value is not set
    • Fix retry_on_failure callback does nothing when specified on PipelineController.add_step()
    • Fix pipeline clone logic
  • Jupyter Notebook
    • Fix support for multiple Jupyter servers running on the same machine
    • Fix issue with old/new notebook packages installed
  • Fix local cache with access rules disabling partial local access
  • Fix Task.upload_artifact() fails uploading pandas DataFrame
  • Fix relative paths in examples (#787, thanks @mendrugory!)

v1.7.1

1 year ago

New Features and Improvements

  • Add callback option for pipeline step retry
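
A minimal sketch of the retry callback, assuming retry_on_failure on PipelineController.add_step() accepts a callable that receives the controller, the failed node, and the current retry count, and returns True to retry; the project and task names are illustrative.

    from clearml import PipelineController

    def retry_twice(pipeline, node, retry_count):
        # Retry a failed step at most twice
        print("Step '%s' failed (retry %d)" % (node.name, retry_count))
        return retry_count < 2

    pipe = PipelineController(name="retry-example", project="examples", version="1.0")
    pipe.add_step(
        name="train",
        base_task_project="examples",
        base_task_name="training task",  # hypothetical pre-existing task to clone
        retry_on_failure=retry_twice,
    )
    pipe.start_locally(run_pipeline_steps_locally=True)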

Bug Fixes

  • Fix Python Fire binding
  • Fix crash when Dataset fails to load helper packages (loading failure should not crash)
  • Fix Dataset.get_local_copy() is allowed for a non-finalized dataset
  • Fix Task.upload_artifact() does not upload empty lists/tuples
  • Fix pipeline retry mechanism interface
  • Fix Python <3.5 compatibility
  • Fix local cache warning (should be a debug message)

v1.7.0

1 year ago

New Features and Improvements

  • ClearML Data: Support providing list of links
  • Upload artifacts with a custom serializer (#689) (see the sketch after this list)
  • Allow users to specify the file extension when using custom serializer functions (for artifacts)
  • Skip server URL verification in clearml-init wizard process
  • When calling Dataset.get() without an "alias" field, inform the user that an alias can be used to log the dataset in the UI
  • Add mmcv support for logging models
  • Add support for Azure and GCP storage in Task.setup_upload()
  • Support pipelines retrying tasks that fail due to suspected transient (non-stable) failures
  • Better storage (AWS, GCP) internal load balancing and configurations
  • Add Task.register_abort_callback
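
A hedged sketch combining two of the items above: uploading an artifact with a custom serializer (#689) and registering an abort callback. The serialization_function and extension_name parameter names follow the SDK at the time of this release and may differ in other versions; the project and task names are illustrative.

    import json

    from clearml import Task

    task = Task.init(project_name="examples", task_name="serializer-demo")

    def to_json_bytes(obj):
        # Custom serialization function: turn the artifact object into bytes
        return json.dumps(obj, indent=2).encode("utf-8")

    task.upload_artifact(
        name="config",
        artifact_object={"lr": 0.001, "epochs": 10},
        serialization_function=to_json_bytes,
        extension_name=".json",  # custom extension for the serialized artifact
    )

    def on_abort():
        # Called if the task is aborted externally (e.g. from the UI)
        print("Task aborted - cleaning up")

    task.register_abort_callback(on_abort)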

Bug Fixes

  • Allow getting datasets with non-semantic versioning (#776)
  • Fix interactive plots (instead of a generated png)
  • Fix Python 2.7 support
  • Fix clearml datasets list functionality
  • Fix Dataset.init() modifies task (moved to Dataset.create())
  • Fix failure with large files upload on HTTPS
  • Fix 3D plots created with plt showing as 2D plots on the task results page
  • Fix uploading files with project's default_upload_destination (#734)
  • Fix broken Matplotlib reporting when using a logarithmic scale
  • Fix wildcard support in the clearml-data CLI
  • Fix report_histogram - does not show "horizontal" orientation (#699)
  • Fix table reporting 'series' arg does not appear on UI when using logger.report_table(title, series, iteration...) (#684)
  • Fix artifacts (and models) using the task's original name instead of its new name
  • Fix very long filenames from S3 can't be downloaded (with get_local_copy())
  • Fix overwrite of existing output models on pipeline task with monitor_models (#758)

v1.6.4

1 year ago

Bug Fixes

  • Fix APIClient fails when calling get_all endpoints with API 2.20 (affects CLI tools such as clearml-session)

v1.6.3

1 year ago

New Features and Improvements

  • Add option to specify an endpoint URL when creating S3 resource service (#679, thanks @AndolsiZied!)
  • Add support for providing ExtraArgs to boto3 when uploading files using the sdk.aws.s3.extra_args configuration option
  • Add support for Server API 2.20
  • Add Task.get_num_enqueued_tasks() to get the number of tasks enqueued in a specific queue
  • Add support for updating model metadata using Model.set_metadata(), Model.get_metadata(), Model.get_all_metadata(), Model.get_all_metadata_casted() and Model.set_all_metadata() (see the sketch after this list)
  • Add Task.get_reported_single_value()
  • Add a retry mechanism for models and artifacts upload
  • Pipelines with an empty configuration take it from code
  • Add support for running pipeline steps on preemptible instances
  • Datasets
    • Add description to Datasets
    • Add wild-card support in clearml-data
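
A rough sketch of the new model-metadata and single-value APIs listed above. The exact signature of Model.set_metadata() (in particular the value-type argument) may vary between versions, and the model ID and metric name below are placeholders.

    from clearml import Model, Task

    task = Task.init(project_name="examples", task_name="metadata-demo")

    # Attach and read metadata on an existing model (the model ID is a placeholder)
    model = Model(model_id="<your-model-id>")
    model.set_metadata("input_size", "224", "int")
    print(model.get_metadata("input_size"))
    print(model.get_all_metadata())

    # Retrieve a previously reported single-value metric by name
    print(task.get_reported_single_value("accuracy"))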

Bug Fixes

  • Fix dataset download (#713, thanks @dankirsdot!)
  • Fix lock is not released after dataset cache is downloaded (#708, thanks @mralgos!)
  • Fix deadlock that might occur when using a process pool with a large number of processes (#674)
  • Fix 'series' not appearing on UI when using logger.report_table() (#684)
  • Fix Task.init() docstring to include behavior when executing remotely (#737, thanks @mmiller-max!)
  • Fix KeyError when running remotely and no params were passed to click (https://github.com/allegroai/clearml-agent/issues/111)
  • Fix full path is stored when uploading a single artifact file
  • Fix passing non-alphanumeric filename in sdk.development.detect_with_pip_freeze
  • Fix Python 3.6 and 3.10 support
  • Fix mimetype cannot be None when uploading to S3
  • Pipelines
    • Fix pipeline DAG
    • Add support for pipelines with spot instances
    • Fix pipeline proxy object is always resolved in main pipeline logic
    • Fix pipeline steps with an empty configuration not taking it from code
    • Fix waiting for jobs based on local/remote pool frequency
    • Fix UniformIntegerParameterRange.to_list() ignores min value
    • Fix pipeline component returning a list of length 1
  • Datasets
    • Fix Dataset.get() does not respect auto_create
    • Fix getting datasets fails with new ClearML Server v1.6
    • Fix datasets can't be queried by project/name alone
    • Fix adding child dataset to older parent dataset without stats
  • Fix error when connecting an input model
  • Fix deadlocks, including:
    • Change thread Event/Lock to process-fork-safe threading objects
    • Use a file lock instead of a process lock to avoid future deadlocks, since the Python process lock is not process-safe (killing a process holding a lock will not release the lock)
  • Fix StorageManager.list() on a local Windows path
  • Fix model not created in the current project
  • Fix keras_tuner_cifar example raises DeprecationWarning and ValueError

v1.6.2

1 year ago

Bug Fixes

  • Fix format string construction sometimes causing delayed evaluation errors (#706)

v1.6.1

1 year ago

Bug Fixes

  • Fix Task.get_tasks() fails when sending search_hidden=False
  • Fix LightGBM example shows UserWarning

v1.6

1 year ago

New Features and Improvements

  • New HyperParameter Optimization CLI clearml-param-search
  • Improvements to ClearML Data
    • Add support for a new ClearML Data UI in the ClearML WebApp
    • Add new clearml-data options set-description and rename
  • Add random seed control using Task.set_random_seed(), allowing a new random seed to be set for task initialization or disabling it entirely (see the sketch after this list)
  • Improve error messages when failing to download an artifact
  • Improve error messages when testing for permissions
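
A minimal sketch of the seed control mentioned above; per the notes, the seed should be set before Task.init(), and passing None disables it. The project and task names are illustrative.

    from clearml import Task

    # Set the random seed used for task initialization (call before Task.init)
    Task.set_random_seed(42)
    # Task.set_random_seed(None)  # or disable seed control entirely

    task = Task.init(project_name="examples", task_name="seed-demo")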

Bug Fixes

  • Fix axis range settings when logging plots
  • Fix Task.get_projects() to return more than 500 entries (#612)
  • Fix pipeline progress calculation
  • Fix StorageManager.upload_folder() returns None for both successful and unsuccessful uploads
  • Fix script path capturing stores a relative path and not an absolute path
  • Fix HTML debug samples are saved incorrectly on S3
  • Fix Hydra deprecation warning in examples
  • Fix missing requirement for tensorboardx example

Known issues

  • When removing an image from a Dataset, its preview image won't be removed
  • Moving Datasets between projects still shows the Dataset in the old project

v1.5.0

1 year ago

New Features and Improvements

  • Add support for single value metric reporting (#400) (see the sketch after this list)
  • Add support for specifying parameter sections in PipelineDecorator (#629)
  • Add support for parallel uploads and downloads (upload/download and zip/unzip of artifacts) (ClearML Slack)
  • Add support for specifying execution details (repository, branch, commit, packages, image) in PipelineDecorator
  • Bump PyJWT version due to "Key confusion through non-blocklisted public key formats" vulnerability
  • Add support for AWS Session Token (using boto3's aws_session_token argument)
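
A short sketch of single-value metric reporting (#400) using Logger.report_single_value(); the metric names and values are illustrative. The AWS session token is picked up from the SDK/boto3 credentials configuration rather than set in code.

    from clearml import Task

    task = Task.init(project_name="examples", task_name="single-value-demo")
    logger = task.get_logger()

    # Report standalone scalar values (no iteration axis)
    logger.report_single_value(name="test_accuracy", value=0.923)
    logger.report_single_value(name="total_runtime_sec", value=184)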

Bug Fixes

  • Fix Task.get_projects() retrieves only the first 500 results (#612)
  • Fix failure to delete artifacts stored in Azure (#660)
  • Fix Process Pool hangs at exit (#674)
  • Fix number of unpacked values when syncing a dataset (#682)
  • Fix FastAI DeprecationWarning (#683)
  • Fix StorageManager.download_folder() crash
  • Fix pipelines can't handle None return value
  • Fix pre-existing pipeline raises an exception
  • Fix deprecation warning in the image_reporting example
  • Fix patches remaining bound after Task.close() is called
  • Fix running pipeline code remotely without first running it locally (i.e. no configuration on the Task)
  • Fix local task execution with empty working directory
  • Fix permission check fails when using local storage folder that does not exist
  • Fix pipeline add_function_step breaks in remote execution
  • Fix wrong mimetype used for any file or folder uploaded to S3 using StorageManager
  • Add missing default value for default_cache_manager_size in configuration files

v1.4.1

2 years ago

Bug Fixes

  • Fix Process Pool hangs at exit (#674)