API

High-level interface

The high-level interface tries to provide an opinionated, Pythonic API to interact with openEO backends. Its aim is to hide some of the details of using a web service, so the user can produce concise and readable code.

Users who want to interact with openEO at a lower level, and have more control, can use the lower-level classes.

openeo.connect(url, auth_type=None, auth_options={}, session=None, default_timeout=None)[source]

This method is the entry point to OpenEO. You typically create one connection object in your script or application and re-use it for all calls to that backend.

If the backend requires authentication, you can pass authentication data directly to this function, but it is often easier to authenticate as follows:

>>> # For basic authentication
>>> conn = connect(url).authenticate_basic(username="john", password="foo")
>>> # For OpenID Connect authentication
>>> conn = connect(url).authenticate_OIDC(client_id="myclient")
Parameters
  • url – The http url of an OpenEO endpoint.

  • auth_type (Optional[str]) – Which authentication to use: None, “basic” or “oidc” (for OpenID Connect)

  • auth_options (dict) – Options/arguments specific to the authentication type

  • default_timeout (Optional[int]) – default timeout (in seconds) for requests

Return type

openeo.rest.connection.Connection

openeo.rest.datacube

The main module for creating Earth observation processes. It aims to make it easy to build complex process chains that can be evaluated by an openEO backend.

openeo.rest.datacube.THIS

Symbolic reference to the current data cube, to be used as argument in DataCube.process() calls
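For illustration, a minimal sketch of how THIS can be passed to DataCube.process(); the process id "my_process" and its size argument are hypothetical, and cube is assumed to be an existing DataCube:

>>> from openeo.rest.datacube import THIS
>>> # THIS marks where the current cube is inserted in the process arguments
>>> cube = cube.process("my_process", arguments={"data": THIS, "size": 10})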

class openeo.rest.datacube.DataCube(graph, connection, metadata=None)[source]

Class representing an openEO Data Cube. Data loaded from the backend is returned as an object of this class. Various processing methods can be invoked to build a complete workflow.

Supports openEO API 1.0. In earlier versions this class was called ImageCollectionClient.

add_dimension(name, label, type=None)[source]

Adds a new named dimension to the data cube. Afterwards, the dimension can be referenced with the specified name. If a dimension with the specified name exists, the process fails with a DimensionExists error. The dimension label of the dimension is set to the specified label.

This call does not modify the datacube in place, but returns a new datacube with the additional dimension.

Parameters
  • name – The name of the dimension to add.

  • label – The dimension label.

  • type – Dimension type, allowed values: ‘spatial’, ‘temporal’, ‘bands’, ‘other’; the default value is ‘other’.

Returns

The data cube with a newly added dimension. The new dimension has exactly one dimension label. All other dimensions remain unchanged.
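A short sketch, assuming an existing DataCube cube; the dimension name and label are arbitrary examples:

>>> # add a new "model" dimension with the single label "baseline"
>>> cube2 = cube.add_dimension(name="model", label="baseline", type="other")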

apply(process=None)[source]

Applies a unary process (a local operation) to each value in the data cube.

Parameters
  • process (Union[str, PGNode, Callable, None]) – the name of a process, or a callback function that creates a process graph, see Processes with “callbacks”

Return type

DataCube

Returns

A data cube with the newly computed values. The resolution, cardinality and the number of dimensions are the same as for the original data cube.
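A minimal sketch, assuming the backend supports the standard openEO absolute process:

>>> # apply a unary process, referenced by name, to every pixel value
>>> cube_abs = cube.apply("absolute")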

apply_dimension(code=None, runtime=None, process=None, version='latest', dimension='t', target_dimension=None)[source]

Applies a process to all pixel values along a dimension of a raster data cube. For example, if the temporal dimension is specified the process will work on a time series of pixel values.

The process to apply is specified by either code and runtime in case of a UDF, or by providing a callback function in the process argument.

The process reduce_dimension also applies a process to pixel values along a dimension, but drops the dimension afterwards. The process apply applies a process to each pixel value in the data cube.

The target dimension is the source dimension if not specified otherwise in the target_dimension parameter. The pixel values in the target dimension get replaced by the computed pixel values. The name, type and reference system are preserved.

The dimension labels are preserved when the target dimension is the source dimension and the number of pixel values in the source dimension is equal to the number of values computed by the process. Otherwise, the dimension labels will be incrementing integers starting from zero, which can be changed using rename_labels afterwards. The number of labels will be equal to the number of values computed by the process.

Parameters
  • code (str) – UDF code or process identifier (optional)

  • runtime – UDF runtime to use (optional)

  • process – a callback function that creates a process graph, see Processes with “callbacks”

  • version (str) – Version of the UDF runtime to use

  • dimension (str) – The name of the source dimension to apply the process on. Fails with a DimensionNotAvailable error if the specified dimension does not exist.

  • target_dimension – The name of the target dimension or null (the default) to use the source dimension specified in the parameter dimension. By specifying a target dimension, the source dimension is removed. The target dimension with the specified name and the type other (see add_dimension) is created, if it doesn’t exist yet.

Returns

A datacube with the UDF applied to the given dimension.

Raises

DimensionNotAvailable
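A hedged sketch based on the parameter descriptions above, using code as a predefined process identifier (the standard mean process, assuming backend support) rather than UDF source code:

>>> # average the pixel values along the temporal dimension
>>> composite = cube.apply_dimension(code="mean", dimension="t")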

apply_kernel(kernel, factor=1.0, border=0, replace_invalid=0)[source]

Applies a focal operation based on a weighted kernel to each value of the specified dimensions in the data cube.

The border parameter determines how the data is extended when the kernel overlaps with the borders. The following options are available:

  • numeric value - fill with a user-defined constant number n: nnnnnn|abcdefgh|nnnnnn (default, with n = 0)

  • replicate - repeat the value from the pixel at the border: aaaaaa|abcdefgh|hhhhhh

  • reflect - mirror/reflect from the border: fedcba|abcdefgh|hgfedc

  • reflect_pixel - mirror/reflect from the center of the pixel at the border: gfedcb|abcdefgh|gfedcb

  • wrap - repeat/wrap the image: cdefgh|abcdefgh|abcdef

Parameters
  • kernel (Union[ndarray, List[List[float]]]) – The kernel to be applied on the data cube. The kernel must have as many dimensions as the data cube.

  • factor (float) – A factor by which each value computed by the focal operation is multiplied. This is basically a shortcut for explicitly multiplying each value by a factor afterwards, which is often required for some kernel-based algorithms such as the Gaussian blur.

  • border (int) – Determines how the data is extended when the kernel overlaps with the borders. Defaults to filling the border with zeroes.

  • replace_invalid (int) – The value to replace non-numerical or infinite numerical values with. By default, those values are replaced with zeroes.

Return type

DataCube

Returns

A data cube with the newly computed values. The resolution, cardinality and the number of dimensions are the same as for the original data cube.
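A minimal smoothing sketch, assuming a data cube with two spatial dimensions and numpy installed:

>>> import numpy as np
>>> # 3x3 box filter: each output pixel becomes the mean of its neighbourhood
>>> kernel = np.ones((3, 3)) / 9.0
>>> smoothed = cube.apply_kernel(kernel, factor=1.0, border=0)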

apply_neighborhood(process, size, overlap=None)[source]

Applies a focal process to a data cube.

A focal process is a process that works on a ‘neighbourhood’ of pixels. The neighbourhood can extend into multiple dimensions; this extent is specified by the size argument, which is not only (part of) the size of the input window, but also the size of the output for a given position of the sliding window. The sliding window moves in multiples of size.

An overlap can be specified so that neighbourhoods can have overlapping boundaries. This allows for continuity of the output. The values included in the data cube as overlap can’t be modified by the given process.

The neighbourhood size should be kept small enough to avoid exhausting computational resources, but a size that is too small will result in a larger number of process invocations, which may slow down processing. Window sizes for spatial dimensions are typically in the range of 64 to 512 pixels, while overlaps of 8 to 32 pixels are common.

The process must not add new dimensions, or remove entire dimensions, but the result can have different dimension labels.

For the special case of 2D convolution, it is recommended to use apply_kernel().

Parameters
  • process – a callback function that creates a process graph, see Processes with “callbacks”

  • size – the size of the neighbourhood window per dimension, as described above

  • overlap – the overlap between neighbourhoods per dimension, as described above

Return type

DataCube
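A sketch of the call shape only: the size and overlap structures follow the openEO process specification (lists of dimension/value/unit mappings), and my_callback stands for a hypothetical callback as described under Processes with “callbacks”:

>>> size = [{"dimension": "x", "value": 128, "unit": "px"},
...         {"dimension": "y", "value": 128, "unit": "px"}]
>>> overlap = [{"dimension": "x", "value": 16, "unit": "px"},
...            {"dimension": "y", "value": 16, "unit": "px"}]
>>> # my_callback: hypothetical callback that creates a process graph
>>> result = cube.apply_neighborhood(process=my_callback, size=size, overlap=overlap)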

band(band)[source]

Filter the data cube by the given band.

Parameters

band (Union[str, int]) – band name, band common name or band index

Return type

DataCube

Returns

a DataCube instance

count_time()[source]

Counts the number of images with a valid mask in a time series for all bands of the input dataset.

Return type

DataCube

Returns

a DataCube instance

download(outputfile=None, format='GTIFF', options=None)[source]

Download the image collection, e.g. as GeoTIFF. If outputfile is provided, the result is stored on disk locally; otherwise, a bytes object is returned. The bytes object can be passed on to a suitable decoder for decoding.

Parameters
  • outputfile (Union[str, Path, None]) – Optional, an output file if the result needs to be stored on disk.

  • format (str) – Optional, defaults to “GTIFF”, an output format supported by the backend.

  • options (Optional[dict]) – Optional, file format options

Returns

None if the result is stored to disk, or a bytes object returned by the backend.
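For example, given a fully built DataCube cube:

>>> cube.download("result.tiff", format="GTIFF")  # store the result on disk
>>> raw = cube.download()  # no outputfile: the raw bytes are returned instead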

execute()[source]

Executes the process graph of the imagery.

Return type

Dict

execute_batch(outputfile=None, out_format=None, print=<built-in function print>, max_poll_interval=60, connection_retry_interval=30, job_options=None, **format_options)[source]

Evaluate the process graph by creating a batch job, and retrieve the results when it is finished. This method is recommended when the batch job is expected to finish within a reasonable amount of time.

For very long running jobs, you probably do not want to keep the client running.

Return type

RESTJob

Parameters
  • job_options

  • outputfile (Union[str, Path, None]) – The path of a file to which a result can be written

  • out_format (Optional[str]) – (optional) Format of the job result.

  • format_options – String Parameters for the job result format
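A typical blocking usage sketch:

>>> # create a batch job, poll until it finishes, then download the result
>>> job = cube.execute_batch("result.tiff", out_format="GTIFF")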

static execute_local_udf(udf, datacube=None, fmt='netcdf')[source]

Locally executes a user-defined function on a previously downloaded data cube.

Parameters
  • udf (str) – the code of the user defined function

  • datacube – the path to the downloaded data on disk, or a DataCube

  • fmt (str) – format of the file if datacube is string

Returns

the resulting DataCube

filter_bands(bands)[source]

Filter the data cube by the given bands.

Parameters

bands (Union[List[Union[int, str]], str]) – list of band names, common names or band indices; a single band name can also be given as a string

Return type

DataCube

Returns

a DataCube instance

filter_bbox(west, east, north, south, crs=None, base=None, height=None)[source]

Limits the ImageCollection to a given spatial bounding box.

Parameters
  • west – west boundary (longitude / easting)

  • east – east boundary (longitude / easting)

  • north – north boundary (latitude / northing)

  • south – south boundary (latitude / northing)

  • crs – spatial reference system of boundaries as proj4 or EPSG:12345 like string

  • base – base (lower) bound of the vertical dimension (coordinate axis 3)

  • height – upper bound of the vertical dimension (coordinate axis 3)

Return type

DataCube

Returns

An image collection cropped to the specified bounding box.

https://open-eo.github.io/openeo-api/v/0.4.1/processreference/#filter_bbox

# TODO: allow passing some kind of bounding box object? e.g. a (xmin, ymin, xmax, ymax) tuple?
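For example (coordinates are illustrative):

>>> cube = cube.filter_bbox(
...     west=5.01, east=5.21, south=51.20, north=51.30, crs="EPSG:4326"
... )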

flatten()[source]

Get the process graph in flattened dict representation

Return type

dict

property graph

Get the process graph in flattened dict representation

Return type

dict

graph_add_node(process_id, arguments=None, metadata=None, **kwargs)

Generic helper to create a new DataCube by applying a process.

Parameters
  • process_id (str) – process id of the process.

  • arguments (Optional[dict]) – argument dictionary for the process.

Return type

DataCube

Returns

new DataCube instance

linear_scale_range(input_min, input_max, output_min, output_max)[source]

Color stretching: linearly rescales values from the input range to the output range.

Parameters
  • input_min – Minimum input value

  • input_max – Maximum input value

  • output_min – Minimum output value

  • output_max – Maximum output value

Return type

DataCube

Returns

a DataCube instance

classmethod load_collection(collection_id, connection=None, spatial_extent=None, temporal_extent=None, bands=None, fetch_metadata=True, properties=None)[source]

Create a new raster data cube.

Parameters
  • collection_id (str) – A collection id, should exist in the backend.

  • connection (Optional[Connection]) – The connection to use to connect with the backend.

  • spatial_extent (Optional[Dict[str, float]]) – limit data to specified bounding box or polygons

  • temporal_extent (Optional[List[Union[str, datetime, date]]]) – limit data to specified temporal interval

  • bands (Optional[List[str]]) – only add the specified bands

  • properties (Optional[Dict[str, Union[str, PGNode, Callable]]]) – limit data by metadata property predicates

Returns

a new DataCube
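A hedged sketch; the collection id and band names are backend-specific placeholders:

>>> cube = DataCube.load_collection(
...     collection_id="SENTINEL2_L2A",  # hypothetical collection id
...     connection=connection,
...     spatial_extent={"west": 5.0, "east": 5.2, "south": 51.2, "north": 51.3},
...     temporal_extent=["2020-01-01", "2020-03-31"],
...     bands=["B04", "B08"],  # hypothetical band names
... )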

classmethod load_disk_collection(connection, file_format, glob_pattern, **options)[source]

Loads image data from disk as a DataCube.

Parameters
  • connection (Connection) – The connection to use to connect with the backend.

  • file_format (str) – the file format, e.g. ‘GTiff’

  • glob_pattern (str) – a glob pattern that matches the files to load from disk

  • options – options specific to the file format

Return type

DataCube

Returns

the data as a DataCube

logical_and(other)[source]

Apply an element-wise logical and operation.

Parameters

other (DataCube) – the data cube to combine with

Return type

DataCube

Returns

logical_and(this, other)

logical_or(other)[source]

Apply an element-wise logical or operation.

Parameters

other (DataCube) – the data cube to combine with

Return type

DataCube

Returns

logical_or(this, other)

mask(mask=None, replacement=None)[source]

Applies a mask to a raster data cube. To apply a vector mask use mask_polygon.

A mask is a raster data cube that is compared with the data cube pixel by pixel: a pixel in data is replaced when the corresponding pixel in mask is non-zero (for numbers) or true (for boolean values). The pixel values are replaced with the value specified for replacement, which defaults to null (no data).

Parameters
  • mask (Optional[DataCube]) – the raster mask

  • replacement – the value to replace the masked pixels with

Return type

DataCube

mask_polygon(mask=None, srs='EPSG:4326', replacement=None, inside=None)[source]

Applies a polygon mask to a raster data cube. To apply a raster mask use mask.

All pixels for which the point at the pixel center does not intersect with any polygon (as defined in the Simple Features standard by the OGC) are replaced. This behaviour can be inverted by setting the parameter inside to true.

The pixel values are replaced with the value specified for replacement, which defaults to no data.

Parameters
  • mask (Union[Polygon, MultiPolygon, str, Path, None]) – A polygon, provided as a shapely.geometry.Polygon or shapely.geometry.MultiPolygon, or a filename pointing to a valid vector file

  • srs (str) – The reference system of the provided polygon, by default this is Lat Lon (EPSG:4326).

  • replacement – the value to replace the masked pixels with

Return type

DataCube
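For example, masking with a shapely polygon (coordinates are illustrative):

>>> from shapely.geometry import Polygon
>>> aoi = Polygon([(5.0, 51.2), (5.2, 51.2), (5.2, 51.3), (5.0, 51.3)])
>>> masked = cube.mask_polygon(mask=aoi, replacement=0)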

max_time()[source]

Finds the maximum value of a time series for all bands of the input dataset.

Return type

DataCube

Returns

a DataCube instance

mean_time()[source]

Finds the mean value of a time series for all bands of the input dataset.

Return type

DataCube

Returns

a DataCube instance

median_time()[source]

Finds the median value of a time series for all bands of the input dataset.

Return type

DataCube

Returns

a DataCube instance

merge(other, overlap_resolver=None)

Merge two data cubes. This method behaves the same as merge_cubes(); see that method for the full description, examples and parameter documentation.

Return type

DataCube

merge_cubes(other, overlap_resolver=None)[source]

Merging two data cubes

The data cubes have to be compatible. A merge operation without overlap should be reversible with (a set of) filter operations for each of the two cubes. The process performs the join on overlapping dimensions with the same name and type. An overlapping dimension has the same name, type, reference system and resolution in both data cubes, but can have different labels. One of the dimensions can have different labels; for all other dimensions the labels must be equal. If data overlaps, the parameter overlap_resolver must be specified to resolve the overlap.

Examples for merging two data cubes:

  1. Data cubes with the dimensions x, y, t and bands have the same dimension labels in x, y and t, but the labels for the dimension bands are B1 and B2 for the first data cube and B3 and B4 for the second. An overlap resolver is not needed. The merged data cube has the dimensions x, y, t and bands, and the dimension bands has four dimension labels: B1, B2, B3, B4.

  2. Data cubes with the dimensions x, y, t and bands have the same dimension labels in x,y and t, but the labels for the dimension bands are B1 and B2 for the first data cube and B2 and B3 for the second. An overlap resolver is required to resolve overlap in band B2. The merged data cube has the dimensions x, y, t and bands and the dimension bands has three dimension labels: B1, B2, B3.

  3. Data cubes with the dimensions x, y and t have the same dimension labels in x,y and t. There are two options:
    • Keep the overlapping values separately in the merged data cube: An overlap resolver is not needed, but for each data cube you need to add a new dimension using add_dimension. The new dimensions must be equal, except that the labels for the new dimensions must differ by name. The merged data cube has the same dimensions and labels as the original data cubes, plus the dimension added with add_dimension, which has the two dimension labels after the merge.

    • Combine the overlapping values into a single value: An overlap resolver is required to resolve the overlap for all pixels. The merged data cube has the same dimensions and labels as the original data cubes, but all pixel values have been processed by the overlap resolver.

  4. Merging a data cube with dimensions x, y, t with another cube with dimensions x, y will join on the x, y dimension, so the lower dimension cube is merged with each time step in the higher dimensional cube. This can for instance be used to apply a digital elevation model to a spatiotemporal data cube.

Parameters
  • other (DataCube) – The data cube to merge with.

  • overlap_resolver – A reduction operator that resolves the conflict if the data overlaps. The reducer must return a value of the same data type as the input values. The reduction operator may be a single process such as multiply, or consist of multiple sub-processes. null (the default) can be specified if no overlap resolver is required.

Return type

DataCube

Returns

The merged data cube.
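A sketch of the two main cases; the cube variables and band layout are placeholders, and "max" assumes the backend provides the standard max process:

>>> merged = cube_b1_b2.merge_cubes(cube_b3_b4)  # disjoint bands: no resolver needed
>>> merged = cube_a.merge_cubes(cube_b, overlap_resolver="max")  # overlapping pixels resolved with max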

min_time()[source]

Finds the minimum value of a time series for all bands of the input dataset.

Return type

DataCube

Returns

a DataCube instance

ndvi(nir=None, red=None, target_band=None)[source]

Normalized Difference Vegetation Index (NDVI)

Parameters
  • nir (Optional[str]) – (optional) name of NIR band

  • red (Optional[str]) – (optional) name of red band

  • target_band (Optional[str]) – (optional) name of the newly created band

Return type

DataCube

Returns

a DataCube instance
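For example (band names are backend-specific placeholders):

>>> ndvi_cube = cube.ndvi(nir="B08", red="B04", target_band="NDVI")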

polygonal_histogram_timeseries(polygon)[source]

Extract a histogram time series for the given (multi)polygon. Its points are expected to be in the EPSG:4326 coordinate reference system.

Parameters

polygon (Union[Polygon, MultiPolygon, str]) – The (multi)polygon; or a file path or HTTP URL to a GeoJSON file or shape file

Return type

DataCube

Returns

DataCube

polygonal_mean_timeseries(polygon)[source]

Extract a mean time series for the given (multi)polygon. Its points are expected to be in the EPSG:4326 coordinate reference system.

Parameters

polygon (Union[Polygon, MultiPolygon, str]) – The (multi)polygon; or a file path or HTTP URL to a GeoJSON file or shape file

Return type

DataCube

Returns

DataCube

polygonal_median_timeseries(polygon)[source]

Extract a median time series for the given (multi)polygon. Its points are expected to be in the EPSG:4326 coordinate reference system.

Parameters

polygon (Union[Polygon, MultiPolygon, str]) – The (multi)polygon; or a file path or HTTP URL to a GeoJSON file or shape file

Return type

DataCube

Returns

DataCube

polygonal_standarddeviation_timeseries(polygon)[source]

Extract a time series of standard deviations for the given (multi)polygon. Its points are expected to be in the EPSG:4326 coordinate reference system.

Parameters

polygon (Union[Polygon, MultiPolygon, str]) – The (multi)polygon; or a file path or HTTP URL to a GeoJSON file or shape file

Return type

DataCube

Returns

DataCube

process(process_id, arguments=None, metadata=None, **kwargs)[source]

Generic helper to create a new DataCube by applying a process.

Parameters
  • process_id (str) – process id of the process.

  • arguments (Optional[dict]) – argument dictionary for the process.

Return type

DataCube

Returns

new DataCube instance

process_with_node(pg, metadata=None)[source]

Generic helper to create a new DataCube by applying a process (given as process graph node)

Parameters
  • pg (PGNode) – process graph node (containing process id and arguments)

  • metadata (Optional[CollectionMetadata]) – (optional) metadata to override original cube metadata (e.g. when reducing dimensions)

Return type

DataCube

Returns

new DataCube instance

raster_to_vector()[source]

EXPERIMENTAL: not generally supported, API subject to change.

Converts this raster data cube into a vector data cube. The bounding polygon of homogeneous areas of pixels is constructed.

Return type

VectorCube

Returns

a VectorCube

reduce_bands_udf(code, runtime='Python', version='latest')[source]

Apply reduce (reduce_dimension) process with given UDF along band/spectral dimension.

Return type

DataCube

reduce_dimension(dimension, reducer, process_id='reduce_dimension', band_math_mode=False)[source]

Add a reduce process with given reducer callback along given dimension

Parameters
  • dimension (str) – the label of the dimension to reduce

  • reducer (Union[str, PGNode, Callable]) – a callback function that creates a process graph, see Processes with “callbacks”

Return type

DataCube
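For example, a temporal maximum composite, assuming the backend supports the standard max process:

>>> composite = cube.reduce_dimension(dimension="t", reducer="max")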

reduce_temporal_simple(process_id='max')[source]

Do a temporal reduce with a simple predefined process (specified by its process id) as the reducer.

Return type

DataCube

reduce_temporal_udf(code, runtime='Python', version='latest')[source]

Apply reduce (reduce_dimension) process with given UDF along temporal dimension.

reduce_tiles_over_time(code, runtime='Python', version='latest')[source]

Applies a user-defined function to a time series of tiles. The size of the tile is backend specific, and can be limited to one pixel. The function should reduce the given time series into a single (multiband) tile.

Parameters
  • code (str) – The UDF code, compatible with the given runtime and version

  • runtime (str) – The UDF runtime

  • version (str) – The UDF runtime version

Returns

a DataCube with the time series reduced to a single (multiband) tile

rename_dimension(source, target)[source]

Renames a dimension in the data cube while preserving all other properties.

Parameters
  • source (str) – The current name of the dimension. Fails with a DimensionNotAvailable error if the specified dimension does not exist.

  • target (str) – A new name for the dimension. Fails with a DimensionExists error if a dimension with the specified name already exists.

Returns

A new datacube with the dimension renamed.

rename_labels(dimension, target, source=None)[source]

Renames the labels of the specified dimension in the data cube from source to target.

Parameters
  • dimension (str) – Dimension name

  • target (list) – The new names for the labels.

  • source (Optional[list]) – The names of the labels as they are currently in the data cube.

Return type

DataCube

Returns

a DataCube instance
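For example (the label names are placeholders):

>>> renamed = cube.rename_labels(
...     dimension="bands", target=["red", "nir"], source=["B04", "B08"]
... )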

resample_cube_spatial(target, method='near')[source]

Resamples the spatial dimensions (x, y) of this data cube to a target data cube and returns the result as a new data cube.

https://processes.openeo.org/#resample_cube_spatial

Parameters
  • target (DataCube) – the data cube that specifies the target spatial dimensions

  • method (str) – The resampling method.

Returns

A raster data cube with values warped onto the new projection.

resample_spatial(resolution, projection=None, method='near', align='upper-left')[source]

Resamples the spatial dimensions (x,y) of the data cube to a specified resolution and/or warps the data cube to the target projection. At least resolution or projection must be specified.

Use filter_bbox to set the target spatial extent.

https://processes.openeo.org/#resample_spatial

Parameters
  • resolution (Union[float, Tuple[float, float]]) – Either a single number or an array with separate resolutions for each spatial dimension. Resamples the data cube to the target resolution, which can be specified either as separate values for x and y or as a single value for both axes. Specified in the units of the target projection. Doesn’t change the resolution by default (0).

  • projection (Union[int, str, None]) – Either an epsg code, as an integer, or a proj-definition string. Warps the data cube to the target projection. Target projection specified as EPSG code or PROJ definition. Doesn’t change the projection by default (null).

  • method (str) – Resampling method. Methods are inspired by GDAL, see gdalwarp for more information. Possible values: near, bilinear, cubic, cubicspline, lanczos, average, mode, max, min, med, q1, q3

  • align (str) – Specifies to which corner of the spatial extent the new resampled data is aligned to. Possible values: lower-left, upper-left, lower-right, upper-right

Returns

A raster data cube with values warped onto the new projection.
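For example, warping to UTM zone 31N (EPSG:32631) at 10 m resolution; the values are illustrative:

>>> resampled = cube.resample_spatial(resolution=10, projection=32631, method="bilinear")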

save_user_defined_process(user_defined_process_id, public=False)[source]

Saves this process graph in the backend as a user-defined process for the authenticated user.

Return type

RESTUserDefinedProcess

Parameters
  • user_defined_process_id (str) – unique identifier for the process

  • public (bool) – visible to other users?

Returns

a RESTUserDefinedProcess instance

send_job(out_format=None, job_options=None, **format_options)[source]

Sends a job to the backend and returns a RESTJob instance. The job will still need to be started and managed explicitly. The execute_batch() method allows you to run batch jobs without managing them yourself.

Return type

RESTJob

Parameters
  • out_format – String Format of the job result.

  • job_options – A dictionary containing (custom) job options

  • format_options – String Parameters for the job result format

Returns

the resulting job, as a RESTJob instance

stretch_colors(min, max)[source]

Color stretching. Deprecated: use linear_scale_range() instead.

Parameters
  • min – Minimum value

  • max – Maximum value

Return type

DataCube

Returns

a DataCube instance

tiled_viewing_service(type, **kwargs)[source]

Returns metadata for a tiled viewing service that visualizes this layer.

Parameters

type (str) – The type of viewing service to create, for instance: ‘WMTS’

Return type

Dict

Returns

A dictionary object containing the viewing service metadata, such as the connection ‘url’.

to_graphviz()[source]

Build a graphviz DiGraph from the process graph.

zonal_statistics(regions, func, scale=1000, interval='day')[source]

Calculates statistics for each zone specified in a file.

Parameters
  • regions – GeoJSON or a path to a GeoJSON file containing the regions. For paths you must specify the path to a user-uploaded file without the user id in the path.

  • func – Statistical function to calculate for the specified zones. example values: min, max, mean, median, mode

  • scale (int) – A nominal scale in meters of the projection to work in. Defaults to 1000.

  • interval (str) – Interval to group the time series. Allowed values: day, week, month, year. Defaults to day.

Return type

DataCube

Returns

a DataCube instance

openeo.api

class openeo.api.process.Parameter(name, description, schema, default=<object object>)[source]

Wrapper for a process parameter, as used in predefined and user-defined processes.

classmethod raster_cube(name='data', description='A data cube.')[source]

Helper to easily create a ‘raster-cube’ parameter.

Parameters
  • name (str) – name of the parameter.

  • description (str) – description of the parameter

Returns

Parameter

to_dict()[source]

Convert to dictionary for JSON-serialization.

Return type

dict
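A small usage sketch:

>>> from openeo.api.process import Parameter
>>> param = Parameter.raster_cube(name="data")
>>> d = param.to_dict()  # JSON-serializable dict with name, description and schema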

openeo.rest.connection

This module provides a Connection object to manage and persist settings when interacting with the OpenEO API.

class openeo.rest.connection.Connection(url, auth=None, session=None, default_timeout=None, auth_config=None, refresh_token_store=None)[source]

Connection to an openEO backend.

authenticate_OIDC(client_id, provider_id=None, webbrowser_open=None, timeout=120, server_address=None)[source]

Authenticates a user to the backend using OpenID Connect.

Parameters
  • client_id (str) – Client id to use for OpenID Connect authentication

  • webbrowser_open – optional handler for the initial OAuth authentication request (opens a webbrowser by default)

  • timeout (int) – number of seconds after which to abort the authentication procedure

  • server_address (Optional[Tuple[str, int]]) – optional tuple (hostname, port_number) to serve the OAuth redirect callback on


Return type

Connection

authenticate_basic(username=None, password=None)[source]

Authenticate a user to the backend using basic username and password.

Parameters
  • username (Optional[str]) – User name

  • password (Optional[str]) – User passphrase

Return type

Connection

authenticate_oidc_authorization_code(client_id=None, client_secret=None, provider_id=None, timeout=None, server_address=None, webbrowser_open=None, store_refresh_token=False)[source]

OpenID Connect Authorization Code Flow (with PKCE).

WARNING: this API is in experimental phase

Return type

Connection

authenticate_oidc_client_credentials(client_id=None, client_secret=None, provider_id=None, store_refresh_token=False)[source]

OpenID Connect Client Credentials flow.

WARNING: this API is in experimental phase

Return type

Connection

authenticate_oidc_device(client_id=None, client_secret=None, provider_id=None, store_refresh_token=False, **kwargs)[source]

Authenticate with OAuth Device Authorization grant/flow

WARNING: this API is in experimental phase

Return type

Connection

authenticate_oidc_refresh_token(client_id=None, refresh_token=None, client_secret=None, provider_id=None)[source]

OpenID Connect Refresh Token flow.

WARNING: this API is in experimental phase

Return type

Connection

authenticate_oidc_resource_owner_password_credentials(username, password, client_id=None, client_secret=None, provider_id=None, store_refresh_token=False)[source]

OpenID Connect Resource Owner Password Credentials flow.

WARNING: this API is in experimental phase

Return type

Connection

capabilities()[source]

Loads all available capabilities.

Return type

RESTCapabilities

Returns

the capabilities of the backend, as a RESTCapabilities object

create_file(path)[source]

Creates a virtual file.

Returns

file object.

create_job(process_graph, title=None, description=None, plan=None, budget=None, additional=None)[source]

Posts a job to the back end.

Return type

RESTJob

Parameters
  • process_graph (dict) – (flat) dict representing process graph

  • title (Optional[str]) – String title of the job

  • description (Optional[str]) – String description of the job

  • plan (Optional[str]) – billing plan

  • budget – Budget

  • additional (Optional[Dict]) – additional job options to pass to the backend

Returns

the newly created job, as a RESTJob instance

datacube_from_process(process_id, **kwargs)[source]

Load a raster data cube from a custom process.

Parameters
  • process_id (str) – The process id of the custom process.

  • kwargs – The arguments of the custom process.

Return type

DataCube

Returns

A DataCube, without valid metadata, as the client is not aware of this custom process.
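For example (the process id and its argument are hypothetical):

>>> cube = connection.datacube_from_process("my_custom_process", size=10)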

describe_account()[source]

Describes the currently authenticated user account.

Return type

str

describe_collection(name)[source]

Loads detailed information of a specific image collection.

Return type

dict

Parameters

name – String Id of the collection

Returns

a dict with detailed information about the collection

download(graph, outputfile=None)[source]

Downloads the result of a process graph synchronously and saves it to the given file, or returns a bytes object if no outputfile is specified. This method is useful to export binary content such as images. For JSON content, the execute method is recommended.

Parameters
  • graph (dict) – (flat) dict representing a process graph

  • outputfile (Union[str, Path, None]) – output file

execute(process_graph)[source]

Execute a process graph synchronously.

Parameters

process_graph (dict) – (flat) dict representing a process graph

imagecollection(collection_id, **kwargs)

Load an image collection by collection id

see openeo.rest.imagecollectionclient.ImageCollectionClient.load_collection() for available arguments.

Parameters

collection_id (str) – image collection identifier (string)

Return type

Union[ImageCollectionClient, DataCube]

Returns

ImageCollectionClient

job(job_id)[source]

Get the job based on the id. The job with the given id should already exist.

Use openeo.rest.connection.Connection.create_job() to create new jobs

Parameters

job_id (str) – the job id of an existing job

Returns

A job object.

list_collection_ids()[source]

Get list of all collection ids

Return type

List[str]

Returns

list of collection ids

list_collections()[source]

Loads all available image collections.

Return type

List[dict]

Returns

list of collection meta data dictionaries

list_file_formats()[source]

Get available input and output formats

Return type

dict

list_files()[source]

Lists all files that the logged-in user uploaded.

Returns

a list of the files uploaded by the user

list_jobs()[source]

Lists all jobs of the authenticated user.

Return type

dict

Returns

a dict of all jobs of the user

list_processes()[source]

Loads all available processes of the back end.

Return type

List[dict]

Returns

a list of dicts describing all available processes of the back end

list_service_types()[source]

Loads all available service types.

Return type

dict

Returns

a dict with all available service types

list_services()[source]

Loads all available services of the authenticated user.

Return type

dict

Returns

a dict with all services of the authenticated user

list_user_defined_processes()[source]

Lists all user-defined processes of the authenticated user.

Return type

List[dict]

load_collection(collection_id, **kwargs)[source]

Load an image collection by collection id

see openeo.rest.imagecollectionclient.ImageCollectionClient.load_collection() for available arguments.

Parameters

collection_id (str) – image collection identifier (string)

Return type

Union[ImageCollectionClient, DataCube]

Returns

ImageCollectionClient

load_disk_collection(format, glob_pattern, options={})[source]

Loads image data from disk as an ImageCollection.

Return type

ImageCollectionClient

Parameters
  • format (str) – the file format, e.g. ‘GTiff’

  • glob_pattern (str) – a glob pattern that matches the files to load from disk

  • options (dict) – options specific to the file format

Returns

the data as an ImageCollection

remove_service(service_id)[source]

Stop and remove a secondary web service.

Parameters

service_id (str) – service identifier


save_user_defined_process(user_defined_process_id, process_graph, parameters=None, public=False)[source]

Saves a process graph and its metadata in the backend as a user-defined process for the authenticated user.

Return type

RESTUserDefinedProcess

Parameters
  • user_defined_process_id (str) – unique identifier for the user-defined process

  • process_graph (dict) – a process graph

  • parameters (Optional[List[Union[Parameter, dict]]]) – a list of parameters

  • public (bool) – visible to other users?

Returns

a RESTUserDefinedProcess instance
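A hedged sketch, assuming an authenticated connection and a flat process graph dict pg:

>>> from openeo.api.process import Parameter
>>> udp = connection.save_user_defined_process(
...     user_defined_process_id="my_process",  # hypothetical id
...     process_graph=pg,
...     parameters=[Parameter.raster_cube(name="data")],
... )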

user_defined_process(user_defined_process_id)[source]

Get the user-defined process based on its id. The process with the given id should already exist.

Return type

RESTUserDefinedProcess

Parameters

user_defined_process_id (str) – the id of the user-defined process

Returns

a RESTUserDefinedProcess instance

user_jobs()[source]

Loads all jobs of the current user.

Return type

dict

Returns

a dict of all jobs of the user

classmethod version_discovery(url, session=None)[source]

Do automatic openEO API version discovery from the given url, using a “well-known URI” strategy.

Return type

str

Parameters

url (str) – initial backend url (not including “/.well-known/openeo”)

Returns

the root url of the highest supported backend version

exception openeo.rest.connection.OpenEoApiError(http_status_code=None, code='unknown', message='unknown error', id=None, url=None)[source]

Error returned by OpenEO API according to https://open-eo.github.io/openeo-api/errors/