The Geohazards Thematic Exploitation Platform. A Terradue presentation given at the Living Planet Symposium in Prague on May 9th.


The Geohazards Thematic Exploitation Platform (GEP) is an ESA-funded activity on technological innovation that demonstrates the benefits of new techniques for large-scale, on-demand and systematic processing of EO data in the geohazards domain. It supports the geohazards community by creating an exploitation platform with new models of collaboration, in which data providers, users and processors produce scientifically and commercially exploitable results in the cloud.

GEP is a technology R&D activity, but it follows a user-driven model of partnership and community building. The current prototype was designed around requirements from users of the geohazards community, in the context of the Geohazards Supersites initiative (GSNL) and the CEOS Disaster Risk Management initiative and its Seismic Pilot, with a range of expert users including the University of Leeds/COMET, ISTerre of the French ForM@Ter network, the Italian Istituto Nazionale di Geofisica e Vulcanologia, and, in the USA, UNAVCO, NASA/JPL and the University of Miami.

Twenty-two European early adopters were taken on board in a validation phase initiated in March 2015, and since October 2015 the GEP has included six new partnerships that bring new applications and new end-users. Twenty-five additional users are being identified during project execution, and two ESA GSP projects on innovation in the area of Disaster Risk Reduction will bring seven more, making some 60 users by the end of 2017.

After about 18 months of prototyping, the GEP provides an ecosystem addressing a variety of EO-related scientific challenges: it demonstrates the value of scientific exploitation of EO data and of community building, as well as the rapid development and benchmarking of algorithms. It explores innovative funding and business models to ensure sustainability, and elaborates a vision for a wider "smart" exploitation of EO data that maximizes the scientific return of ESA missions.

The thematic part of the GEP is arguably the most important, as it drives the rest of the platform and focuses on specific fields such as continuous monitoring, rapid response, and the study of the consequences of natural disasters or global crises.

The exploitation part deals with:

  • high performance computing resources (e.g. hybrid cloud, commercial cloud, federated resources, third party systems such as G-POD),
  • very fast access to large volumes of data from many EO missions (ERS, Envisat, Sentinels, COSMO-SkyMed, TerraSAR-X, ALOS, Landsat…) as well as non-space data,
  • data processing software (e.g. toolboxes, RTMs, retrieval baselines, visualisation routines), including well-known and widely used processors such as ROI_PAC, DIAPASON, DORIS, StaMPS, SBAS, NSBAS, …

Lastly, the platform gathers many skills in the Earth Science application domain:

  • scientists process data using pre-integrated services or provide their algorithms to be integrated as services,
  • software engineers provide support and design and develop toolboxes for those scientists,
  • data providers feed the indexed databases with datasets of various types and natures, such as satellite acquisitions,
  • resource providers provision the processing platform with high processing capacity infrastructures,
  • even people outside the field (such as journalists) may be interested in reading the resulting publications and sharing them on social networks.

The GEP provides a complete working environment with capabilities that enable users to effectively perform data-intensive research by running dedicated processing software, thereby avoiding spending non-research time on developing ICT tools, sourcing data, etc.

The GEP supports 3 main usage scenarios:
Scenario #1 – EO Data Exploitation allows a user to discover and select data and a pre-existing processing service, process the data, and visualize/analyse the result or apply data manipulation tools to it.
Scenario #2 – New EO Service Development allows a user to discover and select a data sample and software components, engineer (or upload) and validate an application (such as a new processor), and deploy the application on the platform for use by other users as well.
Scenario #3 – New EO Product Development allows a user to deploy a new processor, discover and select data, process the data, and eventually publish a resulting new information layer. Note that this scenario implies effective, scalable, and cost-optimized strategies for infrastructure provisioning when processing very large datasets.
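As an illustration of Scenario #1, platform processing services of this kind are commonly exposed through OGC WPS-style interfaces that a user can script against. The sketch below builds a WPS 1.0 key-value-pair Execute request; the endpoint URL, process identifier and input names are hypothetical placeholders, not the actual GEP API.

```python
from urllib.parse import urlencode

# Hypothetical platform endpoint -- for illustration only, not the real GEP URL.
GEP_WPS = "https://geohazards-tep.example/wps"

def build_execute_request(process_id, inputs):
    """Build a WPS 1.0.0 KVP Execute request URL (GET encoding).

    `inputs` maps input identifiers to values; the pairs are joined into
    the DataInputs parameter as 'key=value' entries separated by ';'.
    """
    data_inputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": process_id,
        "DataInputs": data_inputs,
    }
    return f"{GEP_WPS}?{urlencode(params)}"

# Example: ask a (hypothetical) InSAR service to process two scene references.
url = build_execute_request(
    "insar-processor",  # hypothetical process identifier
    {"master": "SCENE_REF_A", "slave": "SCENE_REF_B"},
)
print(url)
```

In practice the user would submit such a request, poll the returned status document until the job completes, and then retrieve the result for visualization or further manipulation on the platform.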

If you are interested in accessing the platform's processing services or in submitting an application under one of these scenarios, please contact us at