Trains Versions

ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling & Serving in one MLOps/LLMOps solution

v1.9.3

1 year ago

Bug Fixes

  • Fix broken Task._get_status() which was breaking clearml-session in the latest version
  • Fix path substitution, making it now possible to store unsubstituted URLs for models (#935, thanks @john-zielke-snkeos!)

v1.9.2

1 year ago

New Features and Improvements

  • Support parsing queue name when providing execution queue in pipelines code (#857, thanks @Anton-Cherepkov!)
  • Ignore None values for keys in the click argument parser (#903, thanks @chengzegang!)
  • Improve docstrings for Task.mark_completed() and Task.close() (#921, #920, thanks @Make42!)
  • Add pre/post execution callbacks to pipeline steps using @PipelineDecorator.component
  • Add status-change callback to pipeline steps using PipelineController.add_step(), PipelineController.add_function_step(), and @PipelineDecorator.component (see the sketch below)
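
A minimal sketch of the new step callbacks, assuming @PipelineDecorator.component exposes the same pre_execute_callback, post_execute_callback, and status_change_callback keyword arguments (and callback signatures) as PipelineController.add_step(); verify the exact names against your installed version:

```python
from clearml.automation.controller import PipelineDecorator


def before_step(pipeline, node, parameters):
    # Called just before the step is launched; returning False would skip the step
    print(f"launching {node.name} with parameters: {parameters}")
    return True


def after_step(pipeline, node):
    # Called once the step has completed
    print(f"{node.name} finished")


def on_status_change(pipeline, node, previous_status):
    # Called whenever the step's backend status changes (signature assumed to mirror add_step())
    print(f"{node.name} changed status (was: {previous_status})")


@PipelineDecorator.component(
    return_values=["result"],
    pre_execute_callback=before_step,
    post_execute_callback=after_step,
    status_change_callback=on_status_change,
)
def add_one(x: int):
    # The step body runs as its own task when the pipeline executes
    return x + 1
```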

Bug Fixes

  • Fix missing debug samples when reporting using TensorBoard (#924, thanks @jday1!)
  • Fix wrong Jupyter API token during repository detection (#904, thanks @RoseGoldIsntGay!)
  • Fix typo in the warning for very large git diffs (#932, thanks @yiftachbeer!)
  • Fix pipelines from tasks don't propagate parameter_override values in PipelineController.add_step()
  • Fix folders and files uploaded to S3 and Azure with StorageManager.upload_file() have wrong MIME types
  • Fix CSV file preview in Datasets
  • Fix Task.connect_configuration() doesn't work with non-string dictionary keys
  • Fix LightGBM example deprecation warning
  • Fix potential race condition in get_or_create_project()

v1.9.1

1 year ago

New Features and Improvements

  • Add signature version to boto3 configuration (#884, thanks @cgaudreau-ubisoft!)
  • Allow requesting custom token expiration using the api.auth.req_token_expiration_sec configuration setting
  • Add Python 3.11 support

Bug Fixes

  • Fix UniformParameterRange.tolist() throws error when step size is not defined (#859, thanks @davyx8!)
  • Fix StorageManager.list() does not return size metadata (#865)
  • Fix storage with path substitutions (#877, thanks @john-zielke-snkeos!)
  • Fix extras in ClearML installation prevents clearml from being included in requirements (#885, thanks @cajewsa!)
  • Fix metadata set on an uploaded model object is not accessible (#890, thanks @supritmkulkarni!)
  • Fix TriggerScheduler docstrings (#881)
  • Fix Azure storage upload not working (#868)
  • Fix connected list of dicts being parsed incorrectly in remote execution
  • Fix casting None to int fails uploads and permission checks
  • Fix numpy 1.24 support
  • Fix clearml-data previews are saved on file server even when output_uri is specified
  • Fix connecting a dictionary to task sometimes raises an exception
  • Fix authentication headers are not set on substituted fileserver URLs
  • Fix Task.get_project_id() cannot find hidden projects

v1.9.0

1 year ago

New Features and Improvements

  • Add r prefix to re.match() strings (#837, thanks @daugihao!)
  • Add path_substitution to clearml.conf example file (#842)
  • Clarify deferred_init usage in Task.init() (#855)
  • Add pipeline decorator argument to control docker image (#856)
  • Add StorageManager.set_report_upload_chunk_size() and StorageManager.set_report_download_chunk_size() to set chunk size for upload and download
  • Add allow_archived argument in Task.get_tasks()
  • Support querying model metadata in Model.query_models()
  • Add Dataset.set_metadata() and Dataset.get_metadata() (see the sketch after this list)
  • Add delete_from_storage (default True) to Task.delete_artifacts()
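
The snippet below sketches several of these additions together (Dataset metadata, model metadata queries, archived-task queries, report chunk sizes, and artifact deletion). Project, dataset, and task names are made up, and the exact keyword names (e.g. the metadata argument of Model.query_models()) should be checked against the SDK reference:

```python
from clearml import Dataset, Model, StorageManager, Task

# Dataset.set_metadata() / Dataset.get_metadata() -- dataset names are hypothetical
ds = Dataset.create(dataset_name="demo-dataset", dataset_project="demo")
ds.set_metadata({"source": "s3", "rows": 12345})
print(ds.get_metadata())

# Query models by metadata (keyword name assumed to be `metadata`)
models = Model.query_models(project_name="demo", metadata={"source": "s3"})

# Include archived tasks in the query via the new allow_archived argument
tasks = Task.get_tasks(project_name="demo", allow_archived=True)

# Tune report upload/download chunk sizes (values are arbitrary; check the docstring for units)
StorageManager.set_report_upload_chunk_size(16)
StorageManager.set_report_download_chunk_size(16)

# Delete artifacts and also remove them from storage (delete_from_storage defaults to True)
task = Task.get_task(project_name="demo", task_name="training")
task.delete_artifacts(artifact_names=["checkpoint"], delete_from_storage=True)
```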

Bug Fixes

  • Fix jsonargparse and pytorch lightning integration broken for remote execution (#403)
  • Fix error when using TaskScheduler with 'limit_execution_time' (#648)
  • Fix dataset not synced if the changes are only modified files (#835, thanks @fjean!)
  • Fix StorageHelper.delete() does not respect path substitutions (#838)
  • Fix can't write more than 2 GB to a file
  • Fix StorageManager.get_file_size_bytes() returns ClientError instead of None for invalid S3 links
  • Fix Dataset lineage view is broken with multiple dataset dependencies
  • Fix tensorflow_macos support
  • Fix crash when calling task.flush(wait_for_uploads=True) while executing remotely
  • Fix None values get casted to empty strings when connecting a dictionary

v1.8.3

1 year ago

Bug Fixes

  • Set GCS credentials to None if invalid service account credentials are provided (#841, thanks @freddessert!)
  • Fix a sync issue when loading deferred configuration

v1.8.2

1 year ago

New Features and Improvements

  • Add VCS_ENTRY_POINT environment variable that overrides ClearML's entrypoint auto-detection

Bug Fixes

  • Fix all parameters returned from a pipeline are considered strings
  • Fix Task.set_parameters() does not add parameter type when parameter exists but does not have a type

v1.8.1

1 year ago

New Features and Improvements

  • Raise error on failed uploads (#820, thanks @shpigi!)
  • Add hyperdataset examples (#823)
  • Change report_event_flush_threshold default to 100
  • Add ModelInfo.weights_object() so storage callbacks can access the actual model object being stored (valid for both pre/post save calls, otherwise None; see the sketch after this list)
  • Support num_workers in dataset operations
  • Support max connections setting for Azure storage using the sdk.azure.storage.max_connection configuration option
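
A sketch of reading the new weights_object attribute from a model storage callback. It assumes the WeightsFileHandler pre-callback registration (clearml.binding.frameworks.WeightsFileHandler.add_pre_callback) used for model save/load hooks; only the weights_object attribute itself comes from this changelog entry:

```python
from clearml.binding.frameworks import WeightsFileHandler


def log_weights_object(operation_type, model_info):
    # operation_type is "save" or "load"; returning model_info lets the operation
    # proceed unchanged, returning None would skip it
    obj = getattr(model_info, "weights_object", None)  # the actual framework model object (may be None)
    print(f"{operation_type}: weights_object is {type(obj).__name__ if obj is not None else None}")
    return model_info


WeightsFileHandler.add_pre_callback(log_weights_object)
```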

Bug Fixes

  • Fix clearml logger default level cannot be changed (#741)
  • Fix Hydra does not get overridden information from ClearML (#751)
  • Fix StorageManager.list("s3://..", with_metadata=True) doesn't work
  • Fix ModelsList.keys() is missing
  • Fix CLEARML_DEFERRED_TASK_INIT=1 doesn't work
  • Fix default API method does not work when set in configuration

v1.8.0

1 year ago

New Features and Improvements

  • Add tarfile member sanitization to extractall() (#803, thanks @TrellixVulnTeam!)
  • Add Task.delete_artifacts() with raise_on_errors argument (#806, thanks @frolovconst!)
  • Add CI/CD example (#815, thanks @thepycoder!)
  • Limit number of _serialize requests when adding list of links with add_external_files() (#813)
  • Add support for connecting Enum values as parameters
  • Improve Colab integration (store the entire notebook, not just the history)
  • Add clearml.browser_login to authenticate browser online sessions such as Colab, Jupyter Notebooks, etc. (see the sketch after this list)
  • Remove import_bind from stack trace of import errors
  • Add sdk.development.worker.report_event_flush_threshold configuration option to control the number of events to trigger a report
  • Return stub object from Task.init() if no clearml.conf file is found
  • Improve manual model uploading example
  • Remove deprecated demo server
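
A short sketch combining two of the items above: browser-based login for hosted notebooks and connecting an Enum value as a task parameter. The Enum, project, and task names are invented for illustration:

```python
from enum import Enum

import clearml
from clearml import Task

# In a hosted notebook (Colab, Jupyter, etc.) this starts a browser-based login
# flow instead of requiring a pre-existing clearml.conf
clearml.browser_login()


class Optimizer(Enum):
    SGD = "sgd"
    ADAM = "adam"


task = Task.init(project_name="demo", task_name="enum-params")
params = task.connect({"optimizer": Optimizer.ADAM, "lr": 0.001})
print(params["optimizer"])
```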

Bug Fixes

  • Fix passing compression=ZIP_STORED (or 0) to Dataset.upload() uses ZIP_DEFLATED and overrides the user-supplied argument (#812, thanks @doronser!)
  • Fix unique_selector is not applied properly on batches after the first batch. Remove default selector value since it does not work for all event types (and we always specify it anyway)
  • Fix clearml-init colab detection
  • Fix cloning pipelines run with start_locally() doesn't work
  • Fix inability to disable a project's default output URI in development (manual) mode; allow passing output_uri=False to disable it
  • Fix git remote repository detection when remote is not "origin"
  • Fix reported images might not all be reported when waiting to complete the task
  • Fix Dataset.get_local_copy() deletes the source archive if it is stored locally
  • Fix previews with too many parts inflating the Task object beyond its 16 MB limit; set a total preview limit of 320 KB
  • Fix media preview is created instead of a table preview
  • Fix task.update_output_model() should always upload local models to a remote server
  • Fix broken pip package might mess up requirements detection

v1.7.2

1 year ago

New Features and Improvements

  • Support running Jupyter Notebook inside a git repository (the repository will be referenced without uncommitted changes, and the notebook will be stored as plain code in the uncommitted changes)
  • Add jupyter notebook fail warning
  • Allow pipeline steps to return string paths without them being treated as a folder artifact and zipped (#780; see the sketch after this list)
  • Remove future from Python 3 requirements
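
A sketch of a function step returning a plain string path, which per #780 is now passed downstream as a string instead of being zipped as a folder artifact. Step and project names are invented, and the example assumes the steps share a filesystem (e.g. when run locally):

```python
from clearml.automation.controller import PipelineDecorator


@PipelineDecorator.component(return_values=["csv_path"])
def produce_csv():
    import os
    path = "/tmp/output/data.csv"
    os.makedirs("/tmp/output", exist_ok=True)
    with open(path, "w") as f:
        f.write("a,b\n1,2\n")
    return path  # returned as a plain string, not zipped as a folder artifact


@PipelineDecorator.component(return_values=["row_count"])
def count_rows(csv_path: str):
    with open(csv_path) as f:
        return sum(1 for _ in f)


@PipelineDecorator.pipeline(name="path-demo", project="demo", version="0.1")
def run():
    path = produce_csv()
    print(count_rows(path))


if __name__ == "__main__":
    PipelineDecorator.run_locally()  # execute the steps in the local process for a quick test
    run()
```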

Bug Fixes

  • Fix exception raised when using ThreadPool (#790)
  • Fix Pyplot/Matplotlib binding reports incorrect line labels and colors (#791)
  • Pipelines
    • Fix crash when running cloned pipeline that invokes a step twice (#770, related to #769, thanks @tonyd!)
    • Fix pipeline argument becomes None if default value is not set
    • Fix retry_on_failure callback does nothing when specified on PipelineController.add_step()
    • Fix pipeline clone logic
  • Jupyter Notebook
    • Fix support for multiple jupyter servers running on the same machine
    • Fix issue with old/new notebook packages installed
  • Fix local cache with access rules disabling partial local access
  • Fix Task.upload_artifact() fails uploading pandas DataFrame
  • Fix relative paths in examples (#787, thanks @mendrugory!)

v1.7.1

1 year ago

New Features and Improvements

  • Add callback option for pipeline step retry (see the sketch below)
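
A sketch of the retry callback, assuming retry_on_failure on PipelineController.add_step() accepts either an integer retry count or a callable of the form (pipeline, node, retries) -> bool; the project and base task names are placeholders:

```python
from clearml import PipelineController


def retry_twice(pipeline, node, retries):
    # Return True to retry the failed step, False to give up
    print(f"step {node.name} failed (retry count: {retries})")
    return retries < 2


pipe = PipelineController(name="retry-demo", project="demo", version="0.1")
pipe.add_step(
    name="train",
    base_task_project="demo",
    base_task_name="training",  # placeholder base task
    retry_on_failure=retry_twice,
)
pipe.start_locally()
```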

Bug Fixes

  • Fix Python Fire binding
  • Fix crash when a Dataset fails to load helper packages (failure to load should not crash)
  • Fix Dataset.get_local_copy() is allowed for a non-finalized dataset
  • Fix Task.upload_artifact() does not upload empty lists/tuples
  • Fix pipeline retry mechanism interface
  • Fix Python <3.5 compatibility
  • Fix local cache warning (should be a debug message)