Multi-Sensor Inferred Trajectories (MUSIT) for vessel mobility
Renso C.; Carlini E.
2025
Abstract
The abundance of tracking sensors in recent years has led to the generation of high-frequency, high-volume streams of data, including vessel locations and marine observations captured by many sensors (living resources, sea state, weather conditions, etc.). However, there are cases where the trajectory of a moving object has gaps, errors, or is unavailable. Thus, while a vast pool of tracking data is available, these data remain unexplored or underutilized, even though they have the potential to reveal important information. The MUlti-Sensor Inferred Trajectories (MUSIT) project aims to explore and fuse data from all heterogeneous sources to provide detailed information about the location and behavior of a moving object, reduce gaps, and produce a refined, inferred trajectory with minimal errors. The fusion of multi-sensor data is required to fill in the trajectory gaps of moving objects and to attach useful semantics to the trajectory. Artificial intelligence algorithms and spatiotemporal methodologies that can fuse information and infer missing knowledge are also crucial. Furthermore, different representation models from multiple sensors will be explored. Multi-sensor datasets will be designed and made available to experiment with models, fusion and trajectory inference algorithms, and to deduce new knowledge. The MUSIT project will therefore tackle these issues in a three-step process: i) data collection and creation, ii) exploitation and utilization of cross-domain representation models for trajectories, and iii) analysis and processing of outcomes to produce information-rich results related to vessel monitoring.
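To make the notion of trajectory-gap filling concrete, the sketch below is a minimal, purely illustrative Python example that detects reporting gaps in a sequence of vessel position fixes and bridges them by linear interpolation. It is not the MUSIT fusion approach: the `Fix` record, the `fill_gaps` function, and the 10-minute gap threshold are assumptions made only for illustration; the project instead foresees AI and spatiotemporal methods that fuse heterogeneous sensor data to infer the missing positions.

```python
# Minimal sketch, not the MUSIT method: a toy illustration of what
# "filling trajectory gaps" means for a stream of vessel position fixes.
# The Fix fields, gap threshold, and sampling step are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class Fix:
    timestamp: datetime
    lat: float
    lon: float


def fill_gaps(track: List[Fix],
              max_gap: timedelta = timedelta(minutes=10),
              step: timedelta = timedelta(minutes=1)) -> List[Fix]:
    """Bridge gaps longer than `max_gap` with linearly interpolated fixes.

    A multi-sensor approach would instead constrain the inferred positions
    with auxiliary observations (sea state, weather, other sensors).
    """
    if not track:
        return []
    filled: List[Fix] = []
    for prev, curr in zip(track, track[1:]):
        filled.append(prev)
        gap = curr.timestamp - prev.timestamp
        if gap > max_gap:
            n = int(gap / step)          # number of interpolation steps
            for i in range(1, n):
                f = i / n                # fraction of the gap covered so far
                filled.append(Fix(
                    timestamp=prev.timestamp + i * step,
                    lat=prev.lat + f * (curr.lat - prev.lat),
                    lon=prev.lon + f * (curr.lon - prev.lon),
                ))
    filled.append(track[-1])
    return filled
```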
| File | Description | Type | License | Access | Size | Format |
|---|---|---|---|---|---|---|
| MUSIT_project_paper___IEEE_Oceans_2025.pdf | Multi-Sensor Inferred Trajectories (MUSIT) for Vessel Mobility PDF | Post-print | Other type of license | Open access | 914.24 kB | Adobe PDF |
| Carlini-Renso et al_IEEE OCEANS-2025.pdf | Multi-Sensor Inferred Trajectories (MUSIT) for Vessel Mobility | Editorial version (PDF) | NOT PUBLIC - Private/restricted access | Authorized users only | 1.53 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


