Field-aligned online surface reconstruction

Tarini M;
2017

Abstract

Today's 3D scanning pipelines can be classified into two overarching categories: offline, high-accuracy methods that rely on global optimization to reconstruct complex scenes with hundreds of millions of samples, and online methods that produce real-time but low-quality output, usually from structure-from-motion or depth sensors. The method proposed in this paper is the first to combine the benefits of both approaches, supporting online reconstruction of scenes with hundreds of millions of samples from high-resolution sensing modalities such as structured light or laser scanners. The key property of our algorithm is that it sidesteps the signed-distance computation of classical reconstruction techniques in favor of direct filtering, parametrization, and mesh and texture extraction. All of these steps can be realized using only weak notions of spatial neighborhoods, which allows for an implementation that scales approximately linearly with the size of each dataset that is integrated into a partial reconstruction. Combined, these algorithmic differences enable a drastically more efficient output-driven interactive scanning and reconstruction workflow, where the user is able to see the final-quality field-aligned textured mesh during the entirety of the scanning procedure. Holes or parts with registration problems are displayed in real time to the user and can be easily resolved by adding further localized scans, or by adjusting the input point cloud using our interactive editing tools with immediate visual feedback on the output mesh. We demonstrate the effectiveness of our algorithm in conjunction with a state-of-the-art structured light scanner and optical tracking system and test it on a large variety of challenging models.
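The abstract's claim of approximately linear scaling rests on restricting every step to weak, local spatial neighborhoods. The sketch below is not the paper's implementation; it is a hypothetical Python illustration of that idea using a uniform hash grid, where each incoming sample is inserted into, and queried against, a constant number of cells, so integrating one scan costs roughly linear time in its number of samples.

    import numpy as np
    from collections import defaultdict

    # Illustrative sketch only (not the authors' code): a uniform hash grid
    # provides fixed-radius "weak" neighborhoods, so each incoming sample
    # touches only a bounded number of cells during integration.
    class HashGrid:
        def __init__(self, cell_size):
            self.cell_size = cell_size
            self.cells = defaultdict(list)   # cell index -> list of point ids
            self.points = []                 # all integrated samples

        def _cell(self, p):
            return tuple(np.floor(p / self.cell_size).astype(int))

        def insert(self, p):
            idx = len(self.points)
            self.points.append(p)
            self.cells[self._cell(p)].append(idx)

        def neighbors(self, p, radius):
            # Only the 27 cells around p are examined: constant work per
            # query as long as the sampling density is bounded.
            c = self._cell(p)
            result = []
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    for dz in (-1, 0, 1):
                        key = (c[0] + dx, c[1] + dy, c[2] + dz)
                        for idx in self.cells.get(key, []):
                            if np.linalg.norm(self.points[idx] - p) <= radius:
                                result.append(idx)
            return result

    def integrate_scan(grid, scan_points, radius):
        # Integrating one scan costs O(n) in the number of its samples.
        for p in scan_points:
            nearby = grid.neighbors(p, radius)   # local filtering/merging would use these
            grid.insert(p)
        return grid

    # Example usage: integrate two small synthetic scans.
    grid = HashGrid(cell_size=0.05)
    for scan in (np.random.rand(1000, 3), np.random.rand(1000, 3)):
        integrate_scan(grid, scan, radius=0.05)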
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
Surface Reconstruction
Parameterization
Files in this record:
File: prod_380092-doc_128813.pdf
Access: authorized users only
Description: Field-aligned online surface reconstruction
Type: publisher's version (PDF)
Size: 20.09 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/334670
Citations
  • PMC: ND
  • Scopus: 23
  • Web of Science: ND