Python client API for openEO
- Use a `requests.Session` in `openeo.rest.auth.oidc` logic.
- `openeo.util.Rfc3339`: add `rfc3339.today()`/`rfc3339.utcnow()`.
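The new helpers presumably produce standard RFC 3339 date and timestamp strings; the following is a minimal stand-alone sketch of that behavior (function names and exact formats here are assumptions for illustration, not the client's implementation):

```python
from datetime import datetime, timezone

def rfc3339_today() -> str:
    # Date-only RFC 3339 string, e.g. "2022-04-08" (illustrative stand-in)
    return datetime.now(timezone.utc).strftime("%Y-%m-%d")

def rfc3339_utcnow() -> str:
    # Full RFC 3339 UTC timestamp with "Z" suffix, e.g. "2022-04-08T12:34:56Z"
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

print(rfc3339_today(), rfc3339_utcnow())
```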
- Jupyter integration: show a process graph visualization of `DataCube` objects instead of the generic `repr`. (#336)
- Add `Connection.vectorcube_from_paths()` to load a vector cube from files (on back-end) or URLs with the `load_uploaded_files` process.
- `execute_batch` also skips temporary `502 Bad Gateway` errors. (#352)
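A tolerant job-polling loop in the spirit of that change might look like the sketch below; the function, error type and status names are illustrative assumptions, not the client's actual code:

```python
import time

def poll_until_done(poll, max_502_skips=3, interval=0.0):
    """Keep polling `poll()` until the job reaches a final status,
    skipping a limited number of intermittent 502 errors."""
    skips = 0
    while True:
        try:
            status = poll()
        except RuntimeError as exc:  # stand-in for an HTTP 502 response
            if "502" in str(exc) and skips < max_502_skips:
                skips += 1
                time.sleep(interval)
                continue
            raise
        if status in ("finished", "error", "canceled"):
            return status
        time.sleep(interval)

# Simulated back-end: one 502 hiccup, then normal progress.
_responses = iter([RuntimeError("502 Bad Gateway"), "running", "finished"])

def fake_poll():
    item = next(_responses)
    if isinstance(item, Exception):
        raise item
    return item

result = poll_until_done(fake_poll)
print(result)  # finished
```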
- Support math operations on `DataCube`s in "apply" mode (non-"band math"), allowing expressions like `10 * cube.log10()` and `~(cube == 0)`. (#123)
- Fix `PrivateJsonFile` permissions on Windows, using the `oschmod` library. (#198)
- Add `max_cloud_cover` argument to `load_collection()` to simplify setting the maximum cloud cover (property `eo:cloud_cover`). (#328)
- `openeo.rest.datacube.DataCube.load_disk_collection` …
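The `max_cloud_cover` shortcut is presumably shorthand for a property filter of the form `value <= threshold` on `eo:cloud_cover`. The dict below follows the general openEO property-filter shape (a one-node child process graph); the helper name and exact layout are illustrative assumptions:

```python
def cloud_cover_filter(max_cloud_cover: float) -> dict:
    # Build a "properties" filter restricting eo:cloud_cover to <= max_cloud_cover,
    # expressed as an openEO-style child process graph with a "lte" node.
    return {
        "eo:cloud_cover": {
            "process_graph": {
                "lte1": {
                    "process_id": "lte",
                    "arguments": {"x": {"from_parameter": "value"}, "y": max_cloud_cover},
                    "result": True,
                }
            }
        }
    }

properties = cloud_cover_filter(75)
print(properties["eo:cloud_cover"]["process_graph"]["lte1"]["arguments"])
```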
- `DataCube.download()`: only automatically add a `save_result` node when there is none yet.
- Add `openeo.UDF` helper class for UDF usage (#312):
  - autodetection of `runtime` from file/URL suffix or source code
  - `data` argument (e.g. `data={"from_parameter": "x"}`)
  - legacy usage of `openeo.UDF` and `DataCube.apply_dimension()` still works but triggers deprecation warnings
- Allow `load_collection` property filters that are not defined in the collection metadata (summaries).
- Allow passing raw JSON strings, JSON file paths or URLs to `Connection.download()`, `Connection.execute()` and `Connection.create_job()`.
- Add support for reverse math operators on `DataCube` in `apply` mode. (#323)
- Add `DataCube.print_json()` to simplify exporting process graphs in Jupyter or other interactive environments. (#324)
- Raise `DimensionAlreadyExistsException` when trying to `add_dimension()` a dimension with an existing name. (Open-EO/openeo-geopyspark-driver#205)
- `DataCube.execute_batch()` now also guesses the output format from the filename, and allows using a `format` argument next to the current `out_format`, to align with the `DataCube.download()` method. (#240)
- `DataCube.merge_cubes()` …
- `aggregate_spatial`, `mask_polygon`, …
- Fix `is_valid` in `count` in `reduce_dimension`. (#317)
- Rename the `RESTJob` class to the less cryptic and more user-friendly `BatchJob`. The original `RESTJob` is still available as a deprecated alias. (#280)
- `DataCube.reduce_temporal_simple()` …
- Remove the `h5netcdf` engine from `XarrayIO.from_netcdf_file()` and `XarrayIO.to_netcdf_file()`. (#314)
- Rename the argument of `Connection.describe_collection()` from `name` to `collection_id`, to be more in line with other methods/functions.
- Fix `context`/`condition` confusion bug with the `count` callback in `DataCube.reduce_dimension()`. (#317)
- Add `context` parameter to `DataCube.aggregate_spatial()`, `DataCube.apply_dimension()`, `DataCube.apply_neighborhood()`, `DataCube.apply()` and `DataCube.merge_cubes()`. (#291)
- Add `DataCube.fit_regr_random_forest()`. (#293)
- Add `PGNode.update_arguments()`, which, combined with `DataCube.result_node()`, allows advanced process graph argument tweaking/updating without `._pg` hacks.
- `JobResults.download_files()`: also download (by default) the job result metadata as a STAC JSON file. (#184)
- `Connection`: try to automatically refresh the access token when expired. (#298)
- `Connection.create_job` raises an exception if the response does not contain a valid `job_id`.
- Add `openeo.udf.debug.inspect` for using the openEO `inspect` process in a UDF. (#302)
- Add `openeo.util.to_bbox_dict()` to simplify building an openEO-style bbox dictionary, e.g. from a list or a shapely geometry. (#304)
- Remove the `zonal_statistics` method from the old `ImageCollectionClient` API. (#144)
- Add support for comparison operators (`<`, `>`, `<=` and `>=`) in callback process building.
- Add `Connection.describe_process()` to retrieve and show a single process.
- Add `DataCube.flatten_dimensions()` and `DataCube.unflatten_dimension()`. (Open-EO/openeo-processes#308, Open-EO/openeo-processes#316)
- Add `VectorCube.run_udf` (to avoid non-standard `process_with_node(UDF(...))` usage).
- Add `DataCube.fit_class_random_forest()` and `Connection.load_ml_model()` to train and load Machine Learning models. (#279)
- Add `DataCube.predict_random_forest()` to easily use `reduce_dimension` with a `predict_random_forest` reducer using a `MlModel` (trained with `fit_class_random_forest`).
- Add `DataCube.resample_cube_temporal`. (#284)
- Add `target_dimension` argument to `DataCube.aggregate_spatial`. (#288)
- Add `context` argument to `DataCube.chunk_polygon()`.
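Several entries above concern callback process building, where comparison operators return process graph nodes instead of booleans. A toy illustration of that operator-overloading idea (class and node layout are assumptions, not the openeo client internals):

```python
class Value:
    """Toy operand whose comparison operators build openEO-style
    process nodes instead of evaluating to booleans (illustrative only)."""

    def __init__(self, ref):
        self.ref = ref

    def _node(self, process_id, other):
        # Return a dict resembling an openEO process graph node.
        return {"process_id": process_id, "arguments": {"x": self.ref, "y": other}}

    def __lt__(self, other): return self._node("lt", other)
    def __gt__(self, other): return self._node("gt", other)
    def __le__(self, other): return self._node("lte", other)
    def __ge__(self, other): return self._node("gte", other)

x = Value({"from_parameter": "x"})
node = x <= 0.75
print(node)
```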
- Add `Connection.version_info()` to list version information about the client, the API and the back-end.
- Use `Connection.default_timeout` (when set) also on the version discovery request.
- Remove `ImageCollection` from `DataCube`'s class hierarchy.
- Remove `date_range_filter` and `bbox_filter` from `DataCube`. (#100, #278)
- Deprecate `DataCube.send_job` in favor of `DataCube.create_job`, for better consistency (internally and with other libraries). (#276)
- Update the `openeo.processes` module to the 1.2.0 release (2021-12-13) of openeo-processes.
- Update the `openeo.processes` module to the draft version of 2022-03-16 (e4df8648) of openeo-processes.
- Update `openeo.extra.spectral_indices` to a post-0.0.6 version of Awesome Spectral Indices.
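The `openeo.util.to_bbox_dict()` entry above describes a small convenience helper; a hypothetical re-implementation conveys the idea (the real helper also accepts shapely geometries, which this sketch omits):

```python
from typing import Optional, Sequence

def to_bbox_dict(coords: Sequence[float], crs: Optional[str] = None) -> dict:
    # Build an openEO-style bounding box dict from a [west, south, east, north]
    # list (illustrative stand-in for openeo.util.to_bbox_dict).
    west, south, east, north = coords
    bbox = {"west": west, "south": south, "east": east, "north": north}
    if crs is not None:
        bbox["crs"] = crs
    return bbox

bbox = to_bbox_dict([3.2, 51.0, 3.5, 51.2])
print(bbox)
```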
- Deprecate `DataCube.polygonal_mean_timeseries()`, `DataCube.polygonal_histogram_timeseries()`, `DataCube.polygonal_median_timeseries()` and `DataCube.polygonal_standarddeviation_timeseries()`.
- Add support for the `chunk_polygon` process. (Open-EO/openeo-processes#287)
- Add support for `spatial_extent`, `temporal_extent` and `bands` to `Connection.load_result()`.
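With those filters, `Connection.load_result()` presumably emits a `load_result` node carrying the extra arguments; the sketch below builds such a node by hand (the helper and argument layout are illustrative assumptions, not the client's internals):

```python
def load_result_node(result_id, spatial_extent=None, temporal_extent=None, bands=None):
    # Assemble a load_result process node, including only the filters
    # that were actually provided.
    args = {"id": result_id}
    if spatial_extent is not None:
        args["spatial_extent"] = spatial_extent
    if temporal_extent is not None:
        args["temporal_extent"] = temporal_extent
    if bands is not None:
        args["bands"] = bands
    return {"process_id": "load_result", "arguments": args}

node = load_result_node(
    "job-123",
    spatial_extent={"west": 3.2, "south": 51.0, "east": 3.5, "north": 51.2},
    temporal_extent=["2021-01-01", "2021-02-01"],
    bands=["B02", "B03"],
)
print(node["arguments"]["bands"])
```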
- The environment variable `OPENEO_BASEMAP_URL` allows setting a templated URL to an XYZ basemap for the Vue Components library; `OPENEO_BASEMAP_ATTRIBUTION` allows setting the attribution for the basemap. (#260)
- Support `sum` or `all` functions in callbacks. (Forum #113)
- … (`execute_batch`/`run_synchronous`/`start_and_wait`).
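The `sum`/`all` callback entry above implies that a named reducer ultimately becomes a one-node child process graph; a rough sketch of that translation (helper name and node layout are assumptions, not the client's code):

```python
def callback_from_name(process_id: str, parameter: str = "data") -> dict:
    # Wrap a named reducer like "sum" or "all" into a single-node child
    # process graph, feeding it the callback parameter (illustrative only).
    return {
        "process_graph": {
            f"{process_id}1": {
                "process_id": process_id,
                "arguments": {"data": {"from_parameter": parameter}},
                "result": True,
            }
        }
    }

cb = callback_from_name("sum")
print(cb)
```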