Detection of geometric temporal changes in point clouds

Palma G.; Cignoni P.; Scopigno R.
2016

Abstract

Detecting geometric changes between two 3D captures of the same location performed at different moments is a critical operation for all systems requiring a precise segmentation between change and no-change regions. Such application scenarios include 3D surface reconstruction, environment monitoring, natural event management and forensic science. Unfortunately, typical 3D scanning setups cannot provide any one-to-one mapping between measured samples in static regions: in particular, both extrinsic and intrinsic sensor parameters may vary over time, while sensor noise and outliers additionally corrupt the data. In this paper, we adopt a multi-scale approach to robustly tackle these issues. Starting from two point clouds, we first remove outliers using a probabilistic operator. Then, we detect the actual change using the implicit surface defined by the point clouds under a Growing Least Squares reconstruction that, compared to the classical proximity measure, offers a more robust change/no-change characterization near the temporal intersection of the scans and in areas exhibiting different sampling density and direction. The resulting classification is enhanced with a spatial reasoning step to solve critical geometric configurations that are common in man-made environments. We validate our approach on a synthetic test case and on a collection of real data sets acquired using commodity hardware. Finally, we show how 3D reconstruction benefits from the resulting precise change/no-change segmentation.
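To make the pipeline concrete, the sketch below illustrates the classical proximity-based baseline that the paper improves upon: points of the second scan are labelled as changed when they lie farther than a threshold from the first scan, after a simple statistical outlier filter. This is only an illustration under assumed parameters, not the authors' method; the probabilistic outlier operator and the Growing Least Squares implicit-surface comparison described in the abstract replace the naive filter and the point-to-point distance used here, and the function names and thresholds are arbitrary.

```python
# Minimal sketch (not the authors' implementation) of proximity-based
# change detection between two point clouds. Thresholds are illustrative.
import numpy as np
from scipy.spatial import cKDTree

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the cloud average."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # column 0 is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def classify_change(before, after, threshold=0.05):
    """Label each point of `after` as changed (True) or unchanged (False)
    by its distance to the closest point of `before`."""
    tree = cKDTree(before)
    dists, _ = tree.query(after, k=1)
    return dists > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    before = rng.uniform(0, 1, size=(2000, 3))                  # synthetic "before" scan
    noisy_resample = before + rng.normal(0, 0.003, before.shape)  # unchanged geometry, new noise
    new_object = rng.uniform(0, 1, (200, 3)) + [2, 0, 0]          # displaced, newly appeared object
    after = remove_outliers(np.vstack([noisy_resample, new_object]))
    changed = classify_change(before, after, threshold=0.05)
    print(f"{changed.sum()} of {len(after)} points classified as changed")
```

In this toy setup the noisy resampling of the static geometry falls well below the distance threshold, while the displaced object is flagged as changed; the paper's contribution addresses the cases where such a plain distance fails, e.g. near the temporal boundary between scans or under differing sampling densities.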
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
Keywords: Digital geometry processing; Object scanning/acquisition; Point-based methods; Computer Graphics; Shape modelling; Point-based models
Files in this record:

File: prod_359538-doc_117981.pdf
Description: Detection of Geometric Temporal Changes in Point Clouds
Type: Published version (PDF)
Size: 17.77 MB, Adobe PDF
Access: authorized users only (copy available on request)

File: prod_359538-doc_126694.pdf
Description: Postprint - Detection of Geometric Temporal Changes in Point Clouds
Type: Published version (PDF)
Size: 3.16 MB, Adobe PDF
Access: open access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/20.500.14243/316540
Citations
  • Scopus: 11