Time series - image transformation-based new approaches in detecting underwater objects with machine learning methods


Demirezen M. U., Civrizoglu A., YAVANOĞLU U.

JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, vol.36, no.3, pp.1400-1415, 2021 (SCI-Expanded)

Abstract

Sonar, which uses sound waves to determine the size, distance, direction and other features of an object, is widely used in underwater mining and oil exploration, seabed mapping, the tracking of fish shoals and mine detection. Feature extraction, feature selection, the choice of appropriate algorithms and the optimization of hyperparameters for the identification and classification of sonar signals are scientific problems that have been studied for many years. In this study, as an innovative approach, a numerical representation of the data in a different format is proposed using three mathematical transformations, and the performance of deep learning methods on this particular problem is compared with that of classical machine learning and statistical pattern recognition algorithms. The Markov Transition Field, Gramian Angular Field and Recurrence Plot transformations are used to convert the data from time series form into image format. Deep learning algorithms were trained on the transformed data, and the performance of the resulting models was compared, with the help of metrics, against the results obtained with classical algorithms. It was determined that the proposed time series-to-image transformation approaches eliminate the need for feature extraction and give the best results reported in the literature so far. It is considered that the proposed approach can be applied not only to time series-based classification problems but also in different research areas, and that the proposed mathematical transformations for the representation of data will make a significant contribution to improving the performance of machine learning algorithms.
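To illustrate the kind of time series-to-image encoding the abstract describes, the sketch below implements one of the three transformations, the Gramian Angular Field (in its summation form), using only NumPy. This is a minimal illustration of the standard GASF construction, not the authors' exact pipeline: the series is rescaled to [-1, 1], each value is mapped to an angle via arccos, and the image is the cosine of the pairwise angle sums. The synthetic `signal` stands in for a sonar return and is an assumption for demonstration.

```python
import numpy as np

def gramian_angular_field(series):
    """Encode a 1-D time series as a 2-D Gramian Angular Summation Field.

    Steps: min-max rescale the series to [-1, 1], map each value to an
    angle phi = arccos(x), then build GASF[i, j] = cos(phi_i + phi_j).
    """
    x = np.asarray(series, dtype=float)
    # Rescale to [-1, 1] so arccos is defined for every value.
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    # Outer sum of angles, then cosine: a symmetric n x n "image".
    return np.cos(phi[:, None] + phi[None, :])

# Synthetic stand-in for a sonar time series (not real data).
signal = np.sin(np.linspace(0, 4 * np.pi, 64))
image = gramian_angular_field(signal)
print(image.shape)  # (64, 64)
```

The resulting square matrix can be fed to an image-based deep learning model directly, which is how such transformations remove the need for hand-crafted feature extraction.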