
An efficient method to map a regular mesh into a 3D neural network

Salvetti O
2001

Abstract

In 3D computer vision, a relevant problem is matching a "source" image dataset with a "target" image dataset. The matching problem can be approached with a neural network, where the nodes are associated with the image voxels and the synapses with the voxel information. This paper presents an improvement of the "Volume-Matcher" project, an approach to the data-driven matching and registration of three-dimensional images based on 3D neural networks. The approach has been improved by introducing a method for efficiently mapping a regular mesh into a 3D neural network, in order to reduce the computational complexity. The algorithms developed have been tested on real cases of interest in the field of medical imaging.
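The paper itself details the mapping scheme; as a rough illustration only, the idea of associating the nodes of a 3D neural network with the voxels of a regular mesh can be sketched as below. The row-major index formula and the 6-connected neighborhood are generic assumptions for this sketch, not the method described in the paper.

```python
def node_index(x, y, z, shape):
    """Flatten a 3D voxel coordinate into a linear node index
    (row-major order). A generic choice, not the paper's scheme."""
    W, H, D = shape
    return (x * H + y) * D + z

def neighbor_synapses(shape):
    """Enumerate synapses between each node and its 6-connected
    voxel neighbors in a regular W x H x D mesh."""
    W, H, D = shape
    synapses = []
    for x in range(W):
        for y in range(H):
            for z in range(D):
                i = node_index(x, y, z, shape)
                # connect only to positive-direction neighbors,
                # so each undirected synapse is listed once
                for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
                    nx, ny, nz = x + dx, y + dy, z + dz
                    if nx < W and ny < H and nz < D:
                        synapses.append((i, node_index(nx, ny, nz, shape)))
    return synapses

# a 3 x 3 x 3 mesh has 27 nodes and 54 axis-aligned synapses
edges = neighbor_synapses((3, 3, 3))
print(len(edges))  # 54
```

Enumerating only axis-aligned, positive-direction links keeps the synapse list linear in the number of voxels, which is the kind of complexity reduction a regular-mesh mapping makes possible.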
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
3D neural network
3D computer vision
Image representation: volumetric
Files in this product:

File: prod_91419-doc_141113.pdf (authorized users only)
Description: An efficient method to map a regular mesh into a 3D neural network
Type: Editorial Version (PDF)
Size: 1.08 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/113174