A Fully Automatic Approach for the Accurate Localization of the Pupils

2013

Abstract

This paper presents a new method for automatically locating pupils in images containing human faces, even at low resolution. Pupils are localized by a two-step procedure: first, self-similarity information is extracted by considering the appearance variability of local regions; this information is then combined with an estimator of circular shapes based on a modified version of the Circular Hough Transform. Experimental evidence of the effectiveness of the method was obtained on challenging databases containing facial images acquired under different lighting conditions and with different scales and poses.
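To illustrate the circular-shape estimation step named in the abstract, the sketch below implements a plain Circular Hough Transform in NumPy: edge points vote for candidate circle centres over a set of radii, and the accumulator maximum gives the detected circle. This is a generic sketch, not the paper's modified variant, and it does not include the self-similarity stage; the synthetic test points and parameters are illustrative assumptions.

```python
import numpy as np

def circular_hough(edge_points, shape, radii):
    """Vote for circle centres over candidate radii.

    Plain Circular Hough Transform sketch; the paper combines a modified
    variant with self-similarity maps, which is not reproduced here.
    """
    h, w = shape
    acc = np.zeros((len(radii), h, w), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    for (y, x) in edge_points:
        for ri, r in enumerate(radii):
            # Each edge point votes for all centres at distance r from it.
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
            np.add.at(acc, (ri, cy[ok], cx[ok]), 1)
    # The accumulator peak is the best (radius, centre) hypothesis.
    ri, cy, cx = np.unravel_index(acc.argmax(), acc.shape)
    return cx, cy, radii[ri]

# Synthetic edge points on a circle of radius 8 centred at (x=25, y=20).
angles = np.linspace(0, 2 * np.pi, 40, endpoint=False)
pts = [(int(round(20 + 8 * np.sin(a))), int(round(25 + 8 * np.cos(a))))
       for a in angles]
cx, cy, r = circular_hough(pts, (40, 50), [6, 7, 8, 9, 10])
print(cx, cy, r)  # should recover approximately (25, 20, 8)
```

In a pupil-localization setting, the accumulator would be computed on the edge map of a detected eye region, with the radius range chosen from the expected pupil size at the working image resolution.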
Istituto di Scienze Applicate e Sistemi Intelligenti "Eduardo Caianiello" - ISASI
Istituto Nazionale di Ottica - INO
English
Petrosino, A
IMAGE ANALYSIS AND PROCESSING (ICIAP 2013), PT 1
17th International Conference on Image Analysis and Processing (ICIAP)
503
512
SEP 09-13, 2013
Naples, ITALY
Self-similarity; Saliency; Circularity Analysis; Pupil Localization
4
none
Leo, Marco; Cazzato, Dario; De Marco, Tommaso; Distante, Cosimo
273
info:eu-repo/semantics/conferenceObject
04 Contributo in convegno::04.01 Contributo in Atti di convegno
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/263304
Citations
  • PMC: ND
  • Scopus: ND
  • ISI (Web of Science): 0