openEO Python Client Versions

Python client API for openEO

v0.18.0

9 months ago

Added

  • Support OIDC client credentials grant from a generic connection.authenticate_oidc() call through environment variables #419
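
A minimal sketch of this flow, assuming the OPENEO_AUTH_* environment variable names documented for the client credentials feature; the provider id, client id/secret and back-end URL are placeholders:

    import os
    import openeo

    # Assumed environment variable names for the client credentials flow;
    # the values below are placeholders.
    os.environ["OPENEO_AUTH_METHOD"] = "client_credentials"
    os.environ["OPENEO_AUTH_PROVIDER_ID"] = "oidc-provider-id"
    os.environ["OPENEO_AUTH_CLIENT_ID"] = "my-client-id"
    os.environ["OPENEO_AUTH_CLIENT_SECRET"] = "my-client-secret"

    connection = openeo.connect("openeo.example.com")
    # The generic call picks up the client credentials configuration from the environment.
    connection.authenticate_oidc()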

Fixed

  • Fixed UDP parameter conversion issue in build_process_dict when using parameter in context of run_udf #431

v0.17.1

10 months ago

This is a fixup release, identical to the v0.17.0 release (except for a minor test fix).

v0.17.0

10 months ago

Added

  • Connection.authenticate_oidc(): add argument to set maximum device code flow poll time
  • Show a progress bar while waiting for OIDC authentication with the device code flow, including a special mode for Jupyter notebooks. (#237)
  • Basic support for the load_stac process with Connection.load_stac() (#425); see the example below
  • Add DataCube.aggregate_spatial_window()
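
A minimal sketch of Connection.load_stac() usage; the STAC URL, extents and band names are placeholders:

    import openeo

    connection = openeo.connect("openeo.example.com").authenticate_oidc()

    # Load a data cube from a STAC Collection/Item URL (placeholder URL).
    cube = connection.load_stac(
        "https://example.com/stac/collections/sentinel-2-l2a",
        spatial_extent={"west": 5.0, "south": 51.0, "east": 5.1, "north": 51.1},
        temporal_extent=["2023-05-01", "2023-06-01"],
        bands=["B04", "B08"],
    )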

Fixed

  • Include "scope" parameter in OIDC token request with client credentials grant.
  • Support fractional seconds in Rfc3339.parse_datetime (#418)

v0.16.0

11 months ago

Added

  • Full support for user-uploaded files (/files endpoints) (#377); see the example below
  • Initial, experimental "local processing" feature to use openEO Python Client Library functionality on local GeoTIFF/NetCDF files and also do the processing locally using the openeo_processes_dask package (#338)
  • Add BatchJob.get_results_metadata_url().
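
A minimal sketch of the user file workflow; the file path and back-end URL are placeholders:

    import openeo

    connection = openeo.connect("openeo.example.com").authenticate_oidc()

    # Upload a local file to the user workspace on the back-end.
    user_file = connection.upload_file("data/input.geojson")

    # List files in the user workspace; each entry is a UserFile object.
    for f in connection.list_files():
        print(f.path, f.metadata)

    # Remove the file again when it is no longer needed.
    user_file.delete()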

Changed

  • Connection.list_files() returns a list of UserFile objects instead of a list of metadata dictionaries. Use UserFile.metadata to get the original dictionary. (#377)
  • DataCube.aggregate_spatial() returns a VectorCube now, instead of a DataCube (#386). The (experimental) fit_class_random_forest() and fit_regr_random_forest() methods moved accordingly to the VectorCube class.
  • Improved documentation on openeo.processes and ProcessBuilder (#390).
  • DataCube.create_job() and Connection.create_job() now require keyword arguments for all but the first argument, for clarity (#412); see the example below.
  • Pass minimum log level to backend when retrieving batch job and secondary service logs. (Open-EO/openeo-api#485, Open-EO/openeo-python-driver#170)
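
For example, job metadata now has to be passed by name (a sketch, assuming cube is an existing DataCube; title and description are arbitrary):

    # Only the output format may still be passed positionally;
    # title, description, etc. must be keyword arguments.
    job = cube.create_job(
        out_format="GTiff",
        title="NDVI time series",
        description="Example job",
    )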

Removed

  • Dropped support for pre-1.0.0 versions of the openEO API (#134):
    • Remove ImageCollectionClient and related helpers (now unused leftovers from version 0.4.0 and earlier). (Also #100)
    • Drop support for pre-1.0.0 job result metadata
    • Require at least version 1.0.0 of the openEO API for a back-end in Connection and all its methods.

Fixed

  • Reinstated old behavior of authentication-related user files (e.g. refresh token store) on Windows: when PrivateJsonFile may be readable by others, just log a message instead of raising PermissionError (#387)
  • VectorCube.create_job() and MlModel.create_job() are properly aligned with DataCube.create_job() regarding setting job title, description, etc. (#412).
  • More robust handling of billing currency/plans in capabilities (#414)
  • Avoid blindly adding a save_result node from DataCube.execute_batch() when there is already one (#401)

v0.15.0

1 year ago

Added

  • The openeo Python client library can now also be installed with conda (conda-forge channel) (#176)
  • Allow using a custom requests.Session in openeo.rest.auth.oidc logic

Changed

  • Less verbose log printing on failed batch job #332
  • Improve (UTC) timezone handling in openeo.util.Rfc3339 and add rfc3339.today()/rfc3339.utcnow().
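
A small illustration of the new helpers (the output values in the comments are examples, not guaranteed formats):

    from openeo.util import rfc3339

    print(rfc3339.utcnow())  # e.g. "2022-07-01T12:34:56Z"
    print(rfc3339.today())   # e.g. "2022-07-01"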

v0.14.1

1 year ago

Fixed

  • Fine-tuned XarrayDataCube tests for conda building and packaging (#176)

v0.14.0

1 year ago

Added

  • Jupyter integration: show process graph visualization of DataCube objects instead of generic repr. (#336)
  • Add Connection.vectorcube_from_paths() to load a vector cube from files (on back-end) or URLs with load_uploaded_files process.
  • Python 3.10 and 3.11 are now officially supported (tests now also run on 3.10 and 3.11 in GitHub Actions, #346)
  • Support for simplified OIDC device code flow (#335)
  • Added MultiBackendJobManager, based on the implementation from the openeo-classification project (#361); see the example below
  • Added resilience to MultiBackendJobManager for backend failures (#365)
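
A minimal sketch of MultiBackendJobManager usage; the back-end name/URL, collection id, dataframe columns and output file are placeholders:

    import pandas as pd
    import openeo
    from openeo.extra.job_management import MultiBackendJobManager

    manager = MultiBackendJobManager()
    manager.add_backend("example", connection=openeo.connect("openeo.example.com"), parallel_jobs=2)

    # One row per batch job to run.
    df = pd.DataFrame({"year": [2020, 2021, 2022]})

    def start_job(row, connection, **kwargs):
        # Build and return a batch job for the given dataframe row (sketch).
        cube = connection.load_collection(
            "SENTINEL2_L2A",
            temporal_extent=[f"{row['year']}-01-01", f"{row['year'] + 1}-01-01"],
        )
        return cube.create_job(title=f"Job for {row['year']}")

    manager.run_jobs(df=df, start_job=start_job, output_file="job_tracking.csv")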

Changed

  • execute_batch now also skips temporary 502 Bad Gateway errors (#352)

Fixed

  • Fixed/improved math operator/process support for DataCubes in "apply" mode (non-"band math"), allowing expressions like 10 * cube.log10() and ~(cube == 0) (#123)
  • Support PrivateJsonFile permissions properly on Windows, using oschmod library. (#198)
  • Fixed some broken unit tests on Windows related to path (separator) handling. (#350)

v0.13.0

1 year ago

Added

  • Add max_cloud_cover argument to load_collection() to simplify setting maximum cloud cover (property eo:cloud_cover) (#328)
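
For example (a sketch, assuming an authenticated connection; collection id, extents and bands are placeholders):

    cube = connection.load_collection(
        "SENTINEL2_L2A",
        spatial_extent={"west": 5.0, "south": 51.0, "east": 5.1, "north": 51.1},
        temporal_extent=["2022-05-01", "2022-06-01"],
        bands=["B04", "B08"],
        max_cloud_cover=20,  # shorthand for a property filter on eo:cloud_cover
    )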

Changed

  • Improve default dimension metadata of a datacube created with openeo.rest.datacube.DataCube.load_disk_collection
  • DataCube.download(): only automatically add save_result node when there is none yet.
  • Deprecation warnings: make sure they are shown by default and can be hidden when necessary.
  • Rework and improve the openeo.UDF helper class for UDF usage (#312); see the example after this list.
    • allow loading directly from local file or URL
    • autodetect runtime from file/URL suffix or source code
    • hide implementation details around the data argument (e.g. data={"from_parameter": "x"})
    • old usage patterns of openeo.UDF and DataCube.apply_dimension() still work but trigger deprecation warnings
  • Show warning when using load_collection property filters that are not defined in the collection metadata (summaries).
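
A minimal sketch of the reworked helper, assuming cube is an existing DataCube; the UDF file name is a placeholder:

    import openeo

    # Load UDF source code directly from a local file;
    # the runtime is auto-detected from the file suffix or source code.
    udf = openeo.UDF.from_file("my_udf.py")

    # The data={"from_parameter": "x"} plumbing is handled internally.
    cube = cube.apply_dimension(process=udf, dimension="t")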

v0.12.1

1 year ago

Changed

  • Eliminate dependency on distutils.version.LooseVersion which started to trigger deprecation warnings (#316).

Removed

  • Remove old Connection.oidc_auth_user_id_token_as_bearer workaround flag (#300)

Fixed

  • Fix refresh token handling in case of OIDC token request with refresh token grant (#326)

v0.12.0

1 year ago

Added

  • Allow passing raw JSON string, JSON file path or URL to Connection.download(), Connection.execute() and Connection.create_job()
  • Add support for reverse math operators on DataCube in apply mode (#323)
  • Add DataCube.print_json() to simplify exporting process graphs in Jupyter or other interactive environments (#324); see the example below
  • Raise DimensionAlreadyExistsException when trying to add_dimension() a dimension with an existing name (Open-EO/openeo-geopyspark-driver#205)
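
For example (a sketch, assuming cube is a DataCube and connection an authenticated Connection; the JSON file path and output file are placeholders):

    # Print the process graph as indented JSON, e.g. in a Jupyter notebook.
    cube.print_json()

    # Process graphs can also be passed as a raw JSON string, local file path or URL.
    connection.download("process_graph.json", outputfile="result.tiff")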

Changed

  • DataCube.execute_batch() now also guesses the output format from the filename, and allows using a format argument next to the current out_format, to align with the DataCube.download() method (#240); see the example below.
  • Better client-side handling of merged band name metadata in DataCube.merge_cubes()
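
For example (a sketch, assuming cube is an existing DataCube; the output file names are placeholders):

    # The output format is guessed from the filename extension (GeoTIFF here) ...
    cube.execute_batch("result.tiff")

    # ... or it can be specified explicitly with the format argument.
    cube.execute_batch("result.nc", format="netCDF")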

Removed

  • Remove legacy DataCube.graph and DataCube.flatten() to prevent usage patterns that cause interoperability issues (#155, #209, #324)