INGC-GAN: An Implicit Neural-Guided Cycle Generative Approach for Perceptual-Friendly Underwater Image Enhancement.


Authors: Shuaishuai Fan, Glyn Gowing, Weiming Li, Songjie Wei, Xuelong Wu

Language: English

Classification:

Publication information: United States: IEEE Transactions on Neural Networks and Learning Systems, 2025

Physical description:

Collection: NCBI

ID: 707166

Abstract: The key requirement for underwater image enhancement (UIE) is to overcome the unpredictable color degradation caused by the underwater environment and light attenuation, while addressing issues such as color distortion, reduced contrast, and blurring. However, most existing unsupervised methods fail to solve these problems effectively, leaving a visual disparity between metric-optimal qualitative results and undegraded images. In this work, we propose an implicit neural-guided cyclic generative model for UIE; its bidirectional mapping structure addresses this ill-posed problem by bridging the gap between metric-favorable and perceptual-friendly results. A multiband-aware implicit neural normalization alleviates the effects of the degradation distribution. The U-shaped generator simulates human visual attention mechanisms, aggregating global coarse-grained and local fine-grained features and enhancing texture and edge features under the guidance of shallow semantics. The discriminator ensures perception-friendly visual results through a dual-branch structure that evaluates appearance and color. Extensive experiments and ablation analyses on full-reference and non-reference underwater benchmarks demonstrate the superiority of the proposed method: it restores degraded images in most underwater scenes with good generalization and robustness. The code is available at https://github.com/SUIEDDM/INGC-GAN.
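The abstract outlines a cycle-consistent (bidirectional) generative design: a U-shaped generator that maps degraded underwater images toward enhanced ones, a second generator that maps back for cycle consistency, and a dual-branch discriminator that judges appearance and color. The following is a minimal PyTorch sketch of that general structure, not the authors' implementation; the class names, layer sizes, color-branch heuristic, and loss weights are illustrative assumptions, and the official code is at https://github.com/SUIEDDM/INGC-GAN.

```python
# Minimal sketch of a cycle-consistent UIE setup, assuming a standard
# CycleGAN-style layout. All names and hyperparameters are illustrative;
# the authors' implementation is at https://github.com/SUIEDDM/INGC-GAN.
import torch
import torch.nn as nn

class UNetGenerator(nn.Module):
    """Toy U-shaped generator: one encode/decode level with a skip connection."""
    def __init__(self, ch=32):
        super().__init__()
        self.down = nn.Sequential(nn.Conv2d(3, ch, 4, 2, 1), nn.ReLU(inplace=True))
        self.up = nn.Sequential(nn.ConvTranspose2d(ch, ch, 4, 2, 1), nn.ReLU(inplace=True))
        self.out = nn.Conv2d(ch + 3, 3, 3, 1, 1)  # fuse decoded features with the input skip

    def forward(self, x):
        feat = self.up(self.down(x))
        return torch.tanh(self.out(torch.cat([feat, x], dim=1)))

class DualBranchDiscriminator(nn.Module):
    """Two small PatchGAN-style branches: raw appearance and a crude color view."""
    def __init__(self, ch=32):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, ch, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
                nn.Conv2d(ch, 1, 4, 2, 1))
        self.appearance = branch()
        self.color = branch()

    def forward(self, x):
        color_view = x - x.mean(dim=1, keepdim=True)  # subtract luminance, keep chroma offsets
        return self.appearance(x), self.color(color_view)

# Bidirectional mapping: G_fwd enhances, G_bwd maps back to the degraded domain.
G_fwd, G_bwd = UNetGenerator(), UNetGenerator()
D = DualBranchDiscriminator()
l1 = nn.L1Loss()

degraded = torch.rand(1, 3, 64, 64)            # stand-in underwater image
enhanced = G_fwd(degraded)
reconstructed = G_bwd(enhanced)                # cycle back to the degraded domain
cycle_loss = l1(reconstructed, degraded)       # cycle-consistency term
adv_app, adv_col = D(enhanced)
adv_loss = ((adv_app - 1) ** 2).mean() + ((adv_col - 1) ** 2).mean()  # LSGAN-style generator loss
total = cycle_loss + 0.1 * adv_loss
print(f"cycle={cycle_loss.item():.4f}  adv={adv_loss.item():.4f}  total={total.item():.4f}")
```

In this sketch the cycle-consistency term penalizes the round trip degraded → enhanced → degraded, while the two discriminator branches push the enhanced output toward realistic appearance and plausible color; the paper's multiband-aware implicit neural normalization and shallow-semantic guidance are omitted here.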