A Centralized Management System for Raster Data

Keywords

raster data, Python, HDF5, data fusion, data management, remote sensing

Location

Session A2: Interoperability, Reusability, and Integrated Systems

Start Date

July 11, 2016, 3:50 PM

End Date

July 11, 2016, 4:10 PM

Abstract

We present a concept for a system to manage and distribute geo-referenced raster data from multiple possible sources at the institute level. The core idea is to centralize all import routines for data from different sources and formats, store the information in a common data format, and distribute the required information for research purposes upon request. By using HDF5 files, it is ensured that geospatial data is always kept in close connection with the corresponding metadata. Furthermore, this approach builds a technical basis for enhancing data fusion methods by allowing the use of common analysis routines, regardless of the original data format. The flow of information is organized such that a data scientist is responsible for the import, conversion, and storage of raster data. The sources of such data may be space- or airborne sensors (both optical and radar), ground-based precipitation measurements by radar, or digital elevation models; in general, any data that can be mapped to a geo-referenced raster can be included. Upon request, an HDF5 file containing data for the region of interest, the defined temporal boundaries, and the desired thematic content is created, and a copy is given to the scientist, who no longer has to worry about data conversion. The presentation includes an introduction to the concept above and examples of the implementation, which consists of reusable Python scripts that can be run on the cluster computer at the Leibniz Centre for Agricultural Landscape Research (ZALF) or, for special purposes, form the core of a program with a graphical user interface built around them.
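To make the storage concept concrete, the following is a minimal sketch of how a single raster layer could be kept together with its geo-referencing and provenance metadata in one HDF5 file using h5py. The group layout, dataset name, and attribute names are illustrative assumptions, not the actual ZALF implementation.

# Minimal sketch: store a geo-referenced raster together with its metadata
# in a central HDF5 file. Group/dataset/attribute names are assumptions.
import numpy as np
import h5py

def store_raster(h5_path, theme, timestamp, array, transform, crs, source):
    """Append one raster layer plus its metadata to the central HDF5 store."""
    with h5py.File(h5_path, "a") as f:
        grp = f.require_group(theme + "/" + timestamp)
        dset = grp.create_dataset("raster", data=array,
                                  compression="gzip", chunks=True)
        # Keep the geo-referencing and provenance right next to the pixels,
        # so the metadata cannot be separated from the data.
        dset.attrs["geotransform"] = transform   # e.g. GDAL-style 6 coefficients
        dset.attrs["crs"] = crs                  # e.g. "EPSG:25833"
        dset.attrs["source"] = source            # original sensor or product
        dset.attrs["acquisition_time"] = timestamp

if __name__ == "__main__":
    demo = np.random.rand(100, 100).astype("float32")
    store_raster("central_store.h5", "precipitation", "2016-07-11T15:50",
                 demo, (350000.0, 100.0, 0.0, 5800000.0, 0.0, -100.0),
                 "EPSG:25833", "ground-based radar")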

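The request side could then read such a store and assemble a delivery file for a given set of themes, time range, and region of interest, as in the sketch below. It assumes the hypothetical layout from the previous example; the function and parameter names are made up for illustration.

# Minimal sketch of a data request: copy matching layers, cropped to a
# pixel window, into a new HDF5 delivery file for the scientist.
import h5py

def extract_subset(store_path, out_path, themes, t_start, t_end, window):
    """Copy requested layers (cropped to `window`) into a delivery file."""
    row0, row1, col0, col1 = window
    with h5py.File(store_path, "r") as src, h5py.File(out_path, "w") as dst:
        for theme in themes:
            if theme not in src:
                continue
            for timestamp, grp in src[theme].items():
                if not (t_start <= timestamp <= t_end):
                    continue
                out_grp = dst.require_group(theme + "/" + timestamp)
                cropped = grp["raster"][row0:row1, col0:col1]
                dset = out_grp.create_dataset("raster", data=cropped,
                                              compression="gzip")
                # Carry the metadata along with the cropped data.
                for key, value in grp["raster"].attrs.items():
                    dset.attrs[key] = value

# Example request: precipitation and elevation for one week and one tile.
extract_subset("central_store.h5", "delivery.h5",
               themes=["precipitation", "elevation"],
               t_start="2016-07-11", t_end="2016-07-18",
               window=(0, 50, 0, 50))

In a production system the geotransform would additionally be shifted to account for the pixel offset of the crop; the sketch simply copies the metadata unchanged.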