Video Frame Denoising via CNN and GAN Methods



Yapici A., AKCAYOL M. A.

KSII Transactions on Internet and Information Systems, vol. 19, no. 3, pp. 704-729, 2025 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 19 Issue: 3
  • Publication Date: 2025
  • DOI: 10.3837/tiis.2025.03.001
  • Journal Name: KSII Transactions on Internet and Information Systems
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Applied Science & Technology Source, Compendex, Computer & Applied Sciences
  • Page Numbers: pp. 704-729
  • Keywords: Convolutional neural network, deep neural network, image denoising, video denoising
  • Open Archive Collection: AVESİS Open Access Collection
  • Gazi University Affiliated: Yes

Abstract

Video and image denoising techniques aim to eliminate noise while preserving image detail. However, during noise reduction some image texture may be lost, and residual noise artifacts can persist. So far, no single model has achieved universal success across all types and levels of noise. In this study, we present a noise reduction model that combines deep learning methods under limited hardware resources. Specifically, we combine a Convolutional Neural Network (CNN) with a Generative Adversarial Network (GAN) architecture, which has demonstrated effectiveness in preserving image structure as the depth of the model increases. Additionally, we enhance visual quality by incorporating a GAN model after the CNN network. Our evaluation of the proposed approach reveals superior performance at low noise levels compared to previous neural network-based methods. The findings of this study offer insight into the more efficient use of deep learning and traditional noise reduction methods, their behavior at different noise levels, and a comparison of results obtained from diverse datasets. The proposed method outperforms other CNN-based methods at certain noise levels, thereby providing valuable prior knowledge to researchers in this field.
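
The abstract does not specify architectural details, so the following is a minimal PyTorch sketch of the general idea it describes: a CNN denoiser whose output is then refined adversarially by a GAN discriminator. The DnCNN-style residual generator, the PatchGAN-style discriminator, the layer sizes, and the loss weighting are all illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn

class DenoisingCNN(nn.Module):
    """Illustrative DnCNN-style residual denoiser: predicts the noise and subtracts it."""
    def __init__(self, channels=3, features=64, depth=8):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1),
                       nn.BatchNorm2d(features),
                       nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, noisy):
        # Residual learning: output = noisy frame minus the predicted noise map.
        return noisy - self.body(noisy)

class PatchDiscriminator(nn.Module):
    """Illustrative PatchGAN-style discriminator scoring local realism of denoised frames."""
    def __init__(self, channels=3, features=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features, features * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(features * 2, 1, 4, padding=1),  # per-patch real/fake logits
        )

    def forward(self, x):
        return self.net(x)

def train_step(denoiser, disc, opt_g, opt_d, noisy, clean, adv_weight=1e-3):
    """One training step: pixel loss on the CNN output plus an adversarial
    loss from the discriminator applied after the CNN (assumed weighting)."""
    bce = nn.BCEWithLogitsLoss()
    l1 = nn.L1Loss()

    # Discriminator update: real frames vs. current denoiser outputs.
    with torch.no_grad():
        fake = denoiser(noisy)
    d_real, d_fake = disc(clean), disc(fake)
    loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator (denoiser) update: reconstruction plus adversarial term.
    fake = denoiser(noisy)
    d_fake = disc(fake)
    loss_g = l1(fake, clean) + adv_weight * bce(d_fake, torch.ones_like(d_fake))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_g.item(), loss_d.item()

if __name__ == "__main__":
    # Usage example with random tensors standing in for noisy/clean frame pairs.
    denoiser, disc = DenoisingCNN(), PatchDiscriminator()
    opt_g = torch.optim.Adam(denoiser.parameters(), lr=1e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
    noisy, clean = torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)
    print(train_step(denoiser, disc, opt_g, opt_d, noisy, clean))
```

In this kind of two-stage setup, the CNN handles the bulk of the noise removal while the adversarial term discourages the over-smoothing that pure pixel losses tend to produce; the small adversarial weight shown here is a common heuristic, not a value reported in the paper.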