
Semantic segmentation of multimodal point clouds from the railway context

Nitti M; Maglietta R; Reno V
2021

Abstract

In this study, we analyzed deep learning methods for point cloud semantic segmentation. We compared PointNet and PointNet++ on data with different characteristics, coming from distinct domains, in order to understand their behavior. We then exploited the knowledge thus gained to improve the performance of the models on railway data. In particular, we updated the training protocol and modified the PointNet++ architecture to perform transfer learning, leveraging the models trained in the first experiments. Results on both state-of-the-art datasets and on a custom dataset acquired specifically for this purpose demonstrate that transfer learning can effectively boost model performance in the railway context, in terms of both prediction accuracy and convergence rate.
Keywords: computer vision
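
The abstract describes adapting a PointNet++ segmentation model, pre-trained on data from other domains, to railway point clouds by updating the training protocol and the network architecture for transfer learning. Below is a minimal, hypothetical PyTorch sketch of such a transfer-learning step, not the authors' implementation: the PointNet2SemSeg class, the checkpoint path, and the class counts are illustrative assumptions. The idea sketched is to reuse the pre-trained backbone weights, replace the per-point classification head to match the railway label set, and fine-tune.

    # Hypothetical sketch: transfer learning for point cloud semantic segmentation.
    # `PointNet2SemSeg`, the checkpoint path, and class counts are placeholders.
    import torch
    import torch.nn as nn

    class PointNet2SemSeg(nn.Module):
        """Placeholder wrapper: `encoder` stands in for the set-abstraction /
        feature-propagation backbone of PointNet++, `head` for the per-point classifier."""
        def __init__(self, num_classes: int, feat_dim: int = 128):
            super().__init__()
            self.encoder = nn.Sequential(            # stand-in for the real backbone
                nn.Conv1d(3, feat_dim, 1), nn.ReLU(),
                nn.Conv1d(feat_dim, feat_dim, 1), nn.ReLU(),
            )
            self.head = nn.Conv1d(feat_dim, num_classes, 1)  # per-point logits

        def forward(self, xyz):                      # xyz: (B, 3, N) point coordinates
            return self.head(self.encoder(xyz))      # (B, num_classes, N)

    # 1. Build the model with the class count used in the source-domain experiments.
    model = PointNet2SemSeg(num_classes=13)

    # 2. Load weights from a previously trained model (path is illustrative);
    #    strict=False tolerates layers that do not match the new setup.
    state = torch.load("pointnet2_pretrained.pth", map_location="cpu")
    model.load_state_dict(state, strict=False)

    # 3. Replace the segmentation head so it predicts the railway label set.
    num_railway_classes = 5                          # assumed value, for illustration
    model.head = nn.Conv1d(128, num_railway_classes, 1)

    # 4. Freeze the backbone and fine-tune only the new head (alternatively,
    #    fine-tune the whole network with a smaller learning rate).
    for p in model.encoder.parameters():
        p.requires_grad = False
    optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)

In practice, the choice between freezing the backbone and fine-tuning it end to end depends on how similar the source and railway domains are and on how much labeled railway data is available.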
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/448316
Citations
  • PubMed Central: ND
  • Scopus: 2
  • Web of Science (ISI): ND