
Computer Optics, 2024 Volume 48, Issue 1, Pages 123–138 (Mi co1220)

IMAGE PROCESSING, PATTERN RECOGNITION

An approach to dynamic visualization of heterogeneous geospatial vector images

A. V. Vorob'evabc, G. R. Vorobevab

a Ufa State Petroleum Technological University
b Ufa University of Science and Technology
c Geophysical center of RAS, Moscow

Abstract: A well-known shortcoming of geoinformation software libraries when visualizing geospatial data is their low efficiency and limited support for frame-by-frame switching of a group of time-referenced spatial layers. Among the most significant challenges is that synchronous switching of a group of spatial layers cannot be performed, which degrades estimates of the temporal anisotropy of the corresponding spatial data. The problem is further exacerbated by the heterogeneity of the spatial information, expressed in the differing sampling steps and formats of the geospatial primitives used. This significantly complicates the analysis of spatiotemporal information in many research and application areas. An illustrative example is the analysis of the spatiotemporal anisotropy of geophysical information, where visualization requires the dynamic evaluation of retrospective data over a given time interval. The paper proposes an approach that enables heterogeneous vector geospatial data to be integrated before subsequent processing, analysis and visualization. The effectiveness of the developed approach is confirmed by the example of a web application for visualizing a geospatial image as an array of spatial polylines of arbitrary data, as well as in the problem of analyzing spatiotemporal variations of geophysical data.
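The integration step described in the abstract can be illustrated by a minimal sketch: layers with different native sampling steps are resampled onto a common time grid (here by nearest-previous-frame sampling), after which all layers can be switched frame-by-frame in lockstep. All names, the data structure, and the sampling rule below are hypothetical illustrations, not the authors' implementation.

```python
from bisect import bisect_right
from dataclasses import dataclass

@dataclass
class VectorLayer:
    """A time-referenced spatial layer: frames keyed by timestamp (seconds).
    The frame payload would normally be vector geometry; strings stand in here."""
    name: str
    frames: dict

def resample_to_grid(layers, t0, t1, step):
    """Map every layer onto one common time grid: for each grid tick, take the
    most recent frame at or before that tick. Layers with a coarser native
    sampling step simply repeat frames, so all layers switch synchronously."""
    grid = [t0 + i * step for i in range(int((t1 - t0) / step) + 1)]
    synced = {}
    for layer in layers:
        times = sorted(layer.frames)
        row = []
        for t in grid:
            i = bisect_right(times, t) - 1          # index of nearest-previous frame
            row.append(layer.frames[times[i]] if i >= 0 else None)
        synced[layer.name] = row
    return grid, synced

# Two hypothetical layers sampled with different steps (10 s vs 15 s).
isolines = VectorLayer("isolines", {0: "A0", 10: "A1", 20: "A2"})
stations = VectorLayer("stations", {0: "B0", 15: "B1"})
grid, synced = resample_to_grid([isolines, stations], t0=0, t1=20, step=5)
# grid -> [0, 5, 10, 15, 20]; both layers now have one frame per tick
```

After this alignment, a viewer can advance a single time index and update every layer at once, which is the synchronous frame-by-frame switching the abstract calls for.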

Keywords: spatial data, geoinformation technologies, geospatial image, geospatial primitives

Received: 16.01.2023
Accepted: 04.08.2023

DOI: 10.18287/2412-6179-CO-1279
