ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling & Serving in one MLOps/LLMOps solution
- Add `always_create_from_code` argument to `PipelineController` (default `True`) to allow remote execution to create the pipeline DAG at runtime (as opposed to adhering to the existing structure stored on the task when running locally)
- Fix `OutputModel` reporting and other methods fail if `OutputModel.update_weights()` is not called before that (#1078)
- Add `--overrides` support
- Add support for the `spawn` start method for Python multiprocessing. This should help circumvent issues like this
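The `spawn` start method mentioned above is standard Python `multiprocessing`; a minimal, ClearML-free sketch of selecting it through a context (the pool and worker function here are illustrative):

```python
import multiprocessing as mp

def square(x):
    # Trivial worker function; "spawn" children import this module fresh,
    # so anything they need must be importable at module level.
    return x * x

def run_with_spawn():
    # Request the "spawn" start method via a context instead of the global
    # set_start_method(), so the choice stays local to this pool.
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=2) as pool:
        return pool.map(square, [1, 2, 3])

if __name__ == "__main__":
    print(run_with_spawn())
```

`spawn` starts each child from a fresh interpreter, which avoids the fork-related deadlocks this entry alludes to, at the cost of slower worker start-up.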
- Fix `LazyEvalWrapper` type error (#1081)
- Add `include_archive` parameter to `Dataset.list_datasets()`: it is now possible to include archived datasets in the returned list (#1069, thanks @natephysics!)
- Add `aws.boto3.multipart_chunksize` and `aws.boto3.multipart_threshold` configuration options (#1059, thanks @cgaudreau-ubisoft!)
- Add `PipelineController.get_pipeline()` for retrieving previously run pipelines
- Fix `continue_last_task=0` is ignored in pipelines run with `retry_on_failure` (#1054)
- Fix `Task.connect_configuration()` doesn't handle dictionaries with special characters
- Fix `Task.get_by_name()` doesn't return the most recent task when multiple tasks have the same name
- Fix `boto3` certificate verification ignores custom self-signed certificates
- Add support for `lightning>=2.0` (#1033, thanks @aweinmann!)
- Add `clearml-init` support for standard HTTP and HTTPS ports in the webapp, in conjunction with non-default api/files server ports (#1031, thanks @pktiuk!)
- Add support for `jsonargparse` configuration files
- Fix `ValueError` when setting task properties at the start of a pipeline component
- Fix `Path` passed where `str` is expected
- Fix `dict` added to task with `Task.connect()`
- Fix `Task.connect` race condition overwriting task description with connected input model
- Improve `Dataset.add_external_files()` (#962, thanks @john-zielke-snkeos!)
- Add `Task.launch_multi_node()` for distributed experiment execution
- Add `Task.get_all_reported_scalars()` to fetch all available scalar data
- Add `Task.remove_input_models()` to disassociate input models from a task
- Update `Dataset.list_datasets()` to include the dataset version in the result
- Fix `joblib` hangs (#1009)
- Add `gs://` support (#1018, thanks @pzarfos!)
- Fix `Task.report_text()` sometimes reporting to an incorrect task when multiple tasks run simultaneously
- Fix `Task.set_offline(offline_mode=False)` raising an `UnboundLocalError`
- Fix `scikit-image` package (`skimage`) is sometimes not detected as a dependency
- Fix `PipelineDecorator` sometimes causing a race condition when starting a remote execution
- Fix `GetAllRequest` in `Task` docstrings (#982, thanks @Make42!)
- Fix `jsonargparse` handling during remote execution (clearml-agent #153 and #1010)
- Fix `ProxyDictPostWrite.update()` not triggering a write-back to the backend (#985)
- Fix `TypeError` when using `Task.query_tasks()` without specifying the `task_filter` parameter
- Fix `Task.init(continue_last_task=0)` when running remotely; it is now no longer ignored
- Fix `urllib3` import error when using `urllib3>=2.0.0`
- Fix `Task.query_tasks()` using `TaskTypes` (#938)
- Fix `PipelineController` `Dataset.create()` usage for old server API versions
- Fix `Task.get_tasks` docstring (#937)
- Fix `Task.get_reported_scalars()` docstring for the `x_axis` parameter when set to `"timestamp"` (#964, thanks @jday1!)
- Update `APIClient.events.debug_images()` for the latest server API version
- Add `cast` parameter to the `Task.get_parameter` method (#958, thanks @harry-optimised!)
- Update `APIClient().models.get_all` and `APIClient().tasks.get_all` requests for `Model` and `Task`
- Update `StorageManager.download_file()` and `StorageManager.download_folder()` so they will not create a subfolder with the bucket name in the generated download path (#709)
- Update `Task.mark_completed`, `Task.close`, and the hyperparameter example (#927, thanks @Make42!)
- Add `Task.get_debug_samples()` to retrieve a task's debug samples (#761)
- Add `artifact_serialization_function` and `artifact_deserialization_function` parameters to `PipelineController` and `PipelineDecorator` (#689)
- Update `Logger.report_table()` to accept extra data through the `extra_data` parameter, solving #796
- Add `Model.report_*` methods
- Add `force_download` argument to `Model.get_local_copy()` to force downloading a new copy of a model even if it is already in the cache
- Add `PipelineController.connect_configuration()` to add configuration dictionaries to a pipeline
- Add `Dataset.delete` configurable hosted contents deletion support, allowing deletion of both ClearML file server hosted files and external storage hosted files
- Add `PipelineController.is_successful()` criteria customization
- Update `CLEARML_VCS_DIFF` environment variable behavior: setting it to an empty string now forces the task to not log uncommitted changes
- Add `output_uri=True` argument in the hook config for the OpenMMLab example
- Add `--local` argument
- Add support for the `subdirectory` argument when `pip install`ing from a git repo (#947, thanks @jday1!)
- Fix `SyntaxError` (#959, thanks @jday1!)
- Fix `clearml-init` (clearml-server #181 and #910)
- Fix `task.connect()` docstring (ClearML Docs issue #473)
- Fix `Dataset.finalize()` raises an exception due to a backward compatibility issue (#908)
- Fix error when `gradio` is not imported
- Add `python-fire` support
- Fix `Task.connect()` inside `PipelineDecorator`
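The `aws.boto3.multipart_chunksize` and `aws.boto3.multipart_threshold` options listed above belong in `clearml.conf`. A sketch, assuming they sit under the existing `sdk.aws.boto3` section and map onto `boto3` `TransferConfig` sizes; the values are illustrative, so check the ClearML configuration reference for the exact units:

```
sdk {
    aws {
        boto3 {
            # Multipart transfer tuning (illustrative values)
            multipart_chunksize: 8388608   # size of each uploaded/downloaded part
            multipart_threshold: 8388608   # file size above which multipart transfer is used
        }
    }
}
```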
- Fix `get_or_create_project` crashes when run in parallel
- Fix `task.get_user_properties(value_only=True)` raising an exception
- Fix `silence_errors` parameter not working as expected in `StorageManager.get_file_size_bytes()`
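The `artifact_serialization_function` and `artifact_deserialization_function` pipeline parameters mentioned above take a pair of inverse callables. A standalone sketch using JSON; the byte-level scheme and function names here are illustrative, not ClearML's defaults:

```python
import json
from typing import Any

def serialize_artifact(obj: Any) -> bytes:
    # Object -> bytes; sort_keys keeps the serialized form deterministic.
    return json.dumps(obj, sort_keys=True).encode("utf-8")

def deserialize_artifact(blob: bytes) -> Any:
    # Bytes -> object; exact inverse of serialize_artifact.
    return json.loads(blob.decode("utf-8"))

payload = {"split": "train", "rows": [1, 2, 3]}
roundtrip = deserialize_artifact(serialize_artifact(payload))
```

Such a pair would be passed as the `artifact_serialization_function` and `artifact_deserialization_function` arguments of `PipelineController` or `PipelineDecorator.pipeline`; the key property is that deserialization inverts serialization, as the round trip above demonstrates.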
- Fix setting a `Dataset` metadata name to a string prefixed with `data_` breaking `Dataset.get`
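The `CLEARML_VCS_DIFF` behavior described above (an empty string means "do not log uncommitted changes") can be modeled with a small helper. `should_log_uncommitted_changes` is a hypothetical illustration of the documented semantics, not part of the ClearML API:

```python
import os

def should_log_uncommitted_changes(env=None) -> bool:
    # Hypothetical helper mirroring the CLEARML_VCS_DIFF entry above:
    # an empty string explicitly disables logging of uncommitted changes;
    # an unset variable leaves the default behavior (diff is logged).
    if env is None:
        env = os.environ
    return env.get("CLEARML_VCS_DIFF") != ""

print(should_log_uncommitted_changes({}))                          # default behavior
print(should_log_uncommitted_changes({"CLEARML_VCS_DIFF": ""}))    # explicitly disabled
```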