
Automated Species Classification and Counting by Deep-Sea Mobile Crawler Platforms Using YOLO

Marini, S. (Methodology)
2024

Abstract

Edge computing on mobile marine platforms is paramount for automated ecological monitoring. To demonstrate the computational feasibility of an Artificial Intelligence (AI)-powered camera for fully automated, real-time species classification on deep-sea crawler platforms, we ran a You-Only-Look-Once (YOLO) model on an edge computing device (NVIDIA Jetson Nano) and evaluated the achievable animal detection performance, execution time, and power consumption while using all available cores. We processed a total of 337 rotating video scans (∼180°), acquired over approximately four months in 2022 at the methane-hydrates site of Barkley Canyon (Vancouver Island, BC, Canada), focusing on three abundant species (i.e., Sablefish Anoplopoma fimbria, Hagfish Eptatretus stoutii, and Rockfish Sebastes spp.). The model was trained on 1926 manually annotated video frames and showed high detection performance on the test set in terms of accuracy (0.98), precision (0.98), and recall (0.99). The trained model was then applied to the 337 videos. In 288 of them we detected a total of 133 Sablefish, 31 Hagfish, and 321 Rockfish nearly in real time (about 0.31 s/image) with very low power consumption (0.34 J/image). Our results have broad implications for intelligent ecological monitoring: the YOLO model can meet operational-autonomy criteria for fast image processing with limited computational and energy loads.
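The sketch below illustrates the kind of edge-inference loop described in the abstract: running a trained YOLO detector frame by frame, timing inference per image, and deriving an approximate energy-per-image figure from an assumed average board power. It is a minimal illustration, not the authors' code: it assumes the Ultralytics YOLO Python API, and the weights file name, video folder, and average power value are hypothetical placeholders (the paper reports ~0.31 s/image and ~0.34 J/image, which is consistent with roughly 1.1 W average draw, but the exact measurement procedure is not reproduced here).

```python
# Hedged sketch of per-image inference timing and energy estimation for a
# YOLO detector on an edge device. Weights, folder, and power are assumptions.
import time
from pathlib import Path

import cv2
from ultralytics import YOLO  # assumes the Ultralytics YOLO package is installed

WEIGHTS = "crawler_yolo.pt"        # hypothetical trained weights file
VIDEO_DIR = Path("video_scans")    # hypothetical folder of rotating video scans
ASSUMED_AVG_POWER_W = 1.1          # assumed average board draw (W), not measured here

model = YOLO(WEIGHTS)

n_frames, total_s, counts = 0, 0.0, {}
for video in sorted(VIDEO_DIR.glob("*.mp4")):
    cap = cv2.VideoCapture(str(video))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        t0 = time.perf_counter()
        result = model.predict(frame, verbose=False)[0]  # single-image inference
        total_s += time.perf_counter() - t0
        n_frames += 1
        for cls_id in result.boxes.cls.tolist():         # accumulate detections per class
            name = result.names[int(cls_id)]
            counts[name] = counts.get(name, 0) + 1
    cap.release()

if n_frames:
    s_per_image = total_s / n_frames
    print(f"{s_per_image:.2f} s/image, "
          f"~{ASSUMED_AVG_POWER_W * s_per_image:.2f} J/image (with assumed power)")
    print("detections per class:", counts)
```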
Istituto di Scienze Marine - ISMAR - Sede Secondaria Lerici
Artificial intelligence; Benthic fish; Cold-seep; Ecological monitoring; Edge-computing; Machine learning; Robotic platforms
Files in this record:
  • File: EcologicalInformatics_Crawler-YOLO_2024.pdf
  • Access: open access
  • Type: published version (PDF)
  • License: Creative Commons
  • Size: 7.15 MB
  • Format: Adobe PDF


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/504581
Citations
  • PMC: not available
  • Scopus: 12
  • Web of Science: 7