Python client API for OpenEO
This is a fixup release, identical to the v0.17.0 release (except for a minor test fix)
- `Connection.authenticate_oidc()`: add argument to set maximum device code flow poll time
- Support for the `load_stac` process with `Connection.load_stac()` (#425)
- Add `DataCube.aggregate_spatial_window()`
- `Rfc3339.parse_datetime` fix (#418)
- Full support for user-uploaded files (`/files` endpoints) (#377)
- Initial, experimental "local processing" feature, using the `openeo_processes_dask` package (#338)
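The maximum device code flow poll time mentioned above caps a polling loop against the token endpoint. A generic sketch of such a loop (illustrative only, not the client's actual implementation; `fetch_token_status` and the injectable `clock`/`sleep` parameters are stand-ins):

```python
import time

def poll_device_code(fetch_token_status, interval=5, max_poll_time=300,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll `fetch_token_status` until it returns a token,
    or give up after `max_poll_time` seconds."""
    deadline = clock() + max_poll_time
    while clock() < deadline:
        token = fetch_token_status()
        if token is not None:
            return token
        sleep(interval)  # wait before the next poll, per device flow rules
    raise TimeoutError(f"Device code flow: no token after {max_poll_time} seconds")
```

Injecting `clock` and `sleep` makes the timeout behavior testable without real waiting.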
- Add `BatchJob.get_results_metadata_url()`.
- `Connection.list_files()` returns a list of `UserFile` objects instead of a list of metadata dictionaries. Use `UserFile.metadata` to get the original dictionary. (#377)
- `DataCube.aggregate_spatial()` returns a `VectorCube` now, instead of a `DataCube` (#386). The (experimental) `fit_class_random_forest()` and `fit_regr_random_forest()` methods moved accordingly to the `VectorCube` class.
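The `UserFile` change above follows the common wrap-the-dict pattern: typed accessors on top, with the original metadata dictionary still reachable. An illustrative stand-in (hypothetical class and fields, not the client's actual `UserFile`):

```python
class UserFileSketch:
    """Illustrative wrapper: a typed view over a raw metadata dictionary."""

    def __init__(self, metadata: dict):
        self.metadata = metadata  # original dictionary stays available

    @property
    def path(self) -> str:
        return self.metadata["path"]

    @property
    def size(self) -> int:
        return self.metadata.get("size", 0)
```

Callers that relied on plain dictionaries can keep working via the `.metadata` attribute.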
- Improved documentation on `openeo.processes` and `ProcessBuilder` (#390).
- `DataCube.create_job()` and `Connection.create_job()` now require keyword arguments for all but the first argument, for clarity. (#412)
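Requiring keyword arguments "for all but the first argument", as in the `create_job()` change above, is expressed in Python with a bare `*` in the signature. A minimal illustration (the function body and parameter names here are hypothetical, not the client's actual signature):

```python
def create_job(process_graph, *, title=None, description=None, plan=None):
    """Everything after the bare `*` must be passed by keyword."""
    return {"process_graph": process_graph, "title": title,
            "description": description, "plan": plan}

# create_job({"process_graph": {}}, "My job")  would raise TypeError:
# positional use of `title` is rejected, keeping call sites self-documenting.
job = create_job({"process_graph": {}}, title="My job")
```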
- Remove `ImageCollectionClient` and related helpers (now unused leftovers from version 0.4.0 and earlier). (Also #100)
- Require at least version 1.0.0 of the openEO API for a back-end in `Connection` and all its methods.
- When `PrivateJsonFile` may be readable by others, just log a message instead of raising `PermissionError` (#387)
- `VectorCube.create_job()` and `MlModel.create_job()` are properly aligned with `DataCube.create_job()` regarding setting job title, description, etc. (#412)
- Avoid adding a duplicate `save_result` node from `DataCube.execute_batch()` when there is already one (#401)
- Allow using a custom `requests.Session` in `openeo.rest.auth.oidc` logic
- Improve (UTC) timezone handling in `openeo.util.Rfc3339` and add `rfc3339.today()`/`rfc3339.utcnow()`.
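Helpers like `rfc3339.today()`/`rfc3339.utcnow()` boil down to UTC-aware formatting with the standard library. A minimal sketch (illustrative, not the client's actual implementation):

```python
from datetime import datetime, timezone

def rfc3339_today() -> str:
    """Current UTC date as an RFC 3339 date string, e.g. '2023-05-16'."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%d")

def rfc3339_utcnow() -> str:
    """Current UTC timestamp as an RFC 3339 string, e.g. '2023-05-16T12:34:56Z'."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
```

Using `datetime.now(timezone.utc)` (instead of the naive `datetime.utcnow()`) keeps the values timezone-aware.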
- Jupyter integration: show process graph visualization of `DataCube` objects instead of generic `repr`. (#336)
- Add `Connection.vectorcube_from_paths()` to load a vector cube from files (on back-end) or URLs with the `load_uploaded_files` process.
- `execute_batch` also skips temporary 502 Bad Gateway errors. (#352)
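Skipping temporary 502 Bad Gateway errors during batch job polling amounts to treating them as retryable. A generic sketch of the pattern (illustrative stand-ins, not the client's actual code):

```python
class HTTPStatusError(Exception):
    """Stand-in for an HTTP error carrying a status code."""

    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status

def poll_with_retries(fetch_status, max_tries=10, retryable=(502,)):
    """Call `fetch_status`, retrying on transient errors like 502 Bad Gateway."""
    last_error = None
    for _ in range(max_tries):
        try:
            return fetch_status()
        except HTTPStatusError as error:
            if error.status not in retryable:
                raise  # permanent error: don't mask it
            last_error = error  # temporary gateway hiccup: try again
    raise last_error
```

Non-retryable statuses (e.g. 403) still surface immediately, so real failures are not hidden.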
- Improved math operator/process support for `DataCube`s in "apply" mode (non-"band math"), allowing expressions like `10 * cube.log10()` and `~(cube == 0)` (#123)
- Handle `PrivateJsonFile` permissions properly on Windows, using the `oschmod` library. (#198)
- Add `max_cloud_cover` argument to `load_collection()` to simplify setting maximum cloud cover (property `eo:cloud_cover`) (#328)
- Improve default dimension metadata of a datacube created with `openeo.rest.datacube.DataCube.load_disk_collection`
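Expressions such as `10 * cube.log10()` and `~(cube == 0)` work because the cube type implements Python's reflected and unary operator hooks (`__rmul__`, `__invert__`, `__eq__`, …), each returning a new node in the process graph. A tiny stand-in class to show the mechanism (illustrative only, not openeo's actual `DataCube`):

```python
class Cube:
    """Tiny expression-tree stand-in: each operator returns a new node."""

    def __init__(self, graph):
        self.graph = graph

    def log10(self):
        return Cube(("log10", self.graph))

    def __rmul__(self, other):  # enables `10 * cube` (reflected multiply)
        return Cube(("multiply", other, self.graph))

    def __eq__(self, other):    # enables `cube == 0` (builds a node, not a bool)
        return Cube(("eq", self.graph, other))

    def __invert__(self):       # enables `~cube` (logical "not" node)
        return Cube(("not", self.graph))
```

Note the design choice: `__eq__` deliberately returns a new node instead of a boolean, which is what makes `~(cube == 0)` composable.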
- `DataCube.download()`: only automatically add a `save_result` node when there is none yet.
- Add `openeo.UDF` helper class for UDF usage (#312):
  - autodetect `runtime` from file/URL suffix or source code
  - hides the `data` argument (e.g. `data={"from_parameter": "x"}`)
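Autodetecting the UDF `runtime` from a file/URL suffix, as listed above, can be as simple as a suffix lookup. A simplified sketch (the mapping and fallback here are assumptions for illustration, not the client's exact logic):

```python
from pathlib import PurePosixPath

# Assumed suffix-to-runtime mapping, for illustration only.
_SUFFIX_RUNTIMES = {".py": "Python", ".r": "R"}

def guess_udf_runtime(path_or_url: str, default: str = "Python") -> str:
    """Guess a UDF runtime from the file/URL suffix, falling back to a default."""
    # Strip any query string before inspecting the suffix.
    suffix = PurePosixPath(path_or_url.split("?")[0]).suffix.lower()
    return _SUFFIX_RUNTIMES.get(suffix, default)
```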
- Older `openeo.UDF` and `DataCube.apply_dimension()` usage patterns still work but trigger deprecation warnings
- Warn when using `load_collection` property filters that are not defined in the collection metadata (summaries).
- Allow passing a raw JSON string, JSON file path or URL to `Connection.download()`, `Connection.execute()` and `Connection.create_job()`
- Add support for reverse math operators on `DataCube` in `apply` mode (#323)
- Add `DataCube.print_json()` to simplify exporting process graphs in Jupyter or other interactive environments (#324)
- Raise `DimensionAlreadyExistsException` when trying to `add_dimension()` a dimension with existing name (Open-EO/openeo-geopyspark-driver#205)
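The `add_dimension()` guard above is a simple name check before mutating dimension metadata. An illustrative sketch with a stand-in exception class and function (not the client's actual implementation):

```python
class DimensionAlreadyExistsException(ValueError):
    """Raised when adding a dimension whose name is already in use."""

def add_dimension(dimensions: list, name: str) -> list:
    """Return a new dimension list with `name` appended, refusing duplicates."""
    if name in dimensions:
        raise DimensionAlreadyExistsException(f"Dimension with name {name!r} already exists")
    return dimensions + [name]
```

Failing fast with a dedicated exception gives the user a clearer signal than a generic back-end error later on.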
- `DataCube.execute_batch()` now also guesses the output format from the filename, and allows using a `format` argument next to the current `out_format`, to align with the `DataCube.download()` method. (#240)
- `DataCube.merge_cubes()`
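Guessing the output format from the filename, as `DataCube.execute_batch()` now does, is typically a suffix-to-format lookup. A simplified sketch (the mapping below is an assumption for illustration, not the client's exact table):

```python
from pathlib import Path

# Assumed suffix-to-format mapping, for illustration only.
_FORMAT_BY_SUFFIX = {".tif": "GTiff", ".tiff": "GTiff", ".nc": "netCDF",
                     ".json": "JSON", ".csv": "CSV"}

def guess_output_format(filename):
    """Guess an output format from the filename suffix; None if unknown."""
    return _FORMAT_BY_SUFFIX.get(Path(filename).suffix.lower())
```

An explicit `format` argument would still take precedence over the guess, so callers keep full control.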