ZHENG Haojun, WANG Zhen, ZHANG Jiapeng, LIU Shengnan, QIAN Cheng, TU Xueying, LIU Shijing. Enhancement of underwater images on improved StarGAN by mixed attention module[J]. South China Fisheries Science, 2025, 21(1): 185-196. DOI: 10.12131/20240191

Enhancement of underwater images on improved StarGAN by mixed attention module

More Information
  • Received Date: August 11, 2024
  • Revised Date: November 06, 2024
  • Accepted Date: November 21, 2024
  • Available Online: November 29, 2024
  • To address the color cast and blur characteristic of underwater images, and the significant differences among images at different turbidity levels, we propose an improved StarGAN (Star Generative Adversarial Network) incorporating CBAM (Convolutional Block Attention Module) for multi-turbidity underwater image enhancement. First, we collected two underwater turbidity image datasets with an underwater camera, one in a laboratory environment and one on an aquaculture platform. Second, we improved StarGAN by inserting a CBAM, which consists of a channel attention module and a spatial attention module in series, after each ResidualBlock. Finally, we conducted ablation experiments and comparisons with other methods, using UIQM (Underwater Image Quality Measure), UCIQE (Underwater Color Image Quality Evaluation) and image entropy as image quality evaluation indicators. On the enhanced laboratory dataset, UIQM reached 1.18, UCIQE reached 30.13 and image entropy reached 12.83; on the enhanced aquaculture-platform dataset, UIQM reached 0.52, UCIQE reached 10.35 and image entropy reached 9.94. In both the ablation experiments and the comparisons with other methods, the proposed method achieved the highest scores, showing that it enhances multi-turbidity images effectively.
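The generator modification described above can be sketched as follows. This is a minimal PyTorch illustration, not the authors' released code: the layer widths, the reduction ratio of 16, and the InstanceNorm/ReLU layout of the residual block are assumptions based on the standard StarGAN and CBAM designs; only the placement of CBAM (channel attention followed by spatial attention, in series, after each ResidualBlock) follows the abstract.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """CBAM channel attention: pooled descriptors -> shared MLP -> sigmoid gate."""
    def __init__(self, channels, reduction=16):  # reduction=16 is an assumed default
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling branch
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w                         # reweight each channel

class SpatialAttention(nn.Module):
    """CBAM spatial attention: channel-wise avg/max maps -> 7x7 conv -> sigmoid gate."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                         # reweight each spatial location

class ResidualBlockCBAM(nn.Module):
    """StarGAN-style residual block with CBAM appended, as the abstract describes."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels, affine=True),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.InstanceNorm2d(channels, affine=True),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        y = self.body(x)
        y = self.sa(self.ca(y))  # channel attention, then spatial, in series
        return x + y             # residual connection preserves the input

# Shape check: the block is drop-in, preserving (B, C, H, W).
block = ResidualBlockCBAM(64)
out = block(torch.randn(2, 64, 16, 16))
assert out.shape == (2, 64, 16, 16)
```

Because the attention gates only rescale features and the residual path is untouched, such a block can replace each plain ResidualBlock in the StarGAN generator without changing any surrounding layer shapes.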

