Edge-Based Algorithm for Shadows and Ghosts Removing

M. Nitti
2009

Abstract

Visual surveillance is often based on background subtraction, which usually detects moving regions only roughly, with shadows, ghosts and reflections still present. In this work we propose an approach based on edge matching to improve the quality of segmented objects by removing these artifacts. The basic idea is that edges extracted in shadow (or ghost) regions of the current image exactly match the edges extracted in the same regions of the background image, whereas edges extracted on foreground objects have no corresponding edges in the background image. A preliminary segmentation procedure based on the uniformity of photometric gain between adjacent points is carried out to allow better shadow removal. The algorithm has been tested in many different real contexts, both indoor and outdoor.
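
To make the matching criterion concrete, below is a minimal sketch in Python with OpenCV. It is an illustrative assumption of how the idea described in the abstract could be realized, not the authors' implementation: the function names, the Canny and difference thresholds, the dilation of the background edges, and the match-ratio threshold are all hypothetical choices.

```python
import cv2
import numpy as np

def photometric_gain(current_gray, background_gray, eps=1e-3):
    """Per-pixel gain between the current frame and the background.

    In shadowed regions the gain between adjacent points is expected to be
    roughly uniform, which the paper exploits for a preliminary
    segmentation (the exact uniformity criterion is not given here).
    """
    return current_gray.astype(np.float32) / (background_gray.astype(np.float32) + eps)

def classify_blobs(current, background, diff_thresh=30, match_thresh=0.7):
    """Label each changed blob as a real object or a shadow/ghost.

    A blob whose edge pixels mostly coincide with (dilated) background
    edges is taken to be a shadow or ghost; a blob with mostly unmatched
    edges is kept as a foreground object.
    """
    gray_cur = cv2.cvtColor(current, cv2.COLOR_BGR2GRAY)
    gray_bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)

    # Rough change mask from plain background subtraction.
    _, mask = cv2.threshold(cv2.absdiff(gray_cur, gray_bg),
                            diff_thresh, 255, cv2.THRESH_BINARY)

    # Edge maps; background edges are dilated to tolerate small misalignments.
    edges_cur = cv2.Canny(gray_cur, 50, 150)
    edges_bg = cv2.dilate(cv2.Canny(gray_bg, 50, 150), np.ones((3, 3), np.uint8))

    labels = []
    n_labels, comp = cv2.connectedComponents(mask)
    for i in range(1, n_labels):
        blob = comp == i
        total = int((edges_cur[blob] > 0).sum())
        if total == 0:
            continue  # no edge evidence inside this blob
        matched = int(((edges_cur > 0) & (edges_bg > 0))[blob].sum())
        label = "shadow/ghost" if matched / total > match_thresh else "object"
        labels.append((i, label, matched / total))
    return labels
```

Blobs whose edge pixels almost all find a counterpart in the background are discarded as shadows or ghosts, while blobs with mostly unmatched edges survive as foreground; in practice all the thresholds above would need tuning per scene.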
Istituto di Studi sui Sistemi Intelligenti per l'Automazione - ISSIA - Sede Bari


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/29532