
Robust Estimation of Object Dimensions and External Defect Detection with a Low-Cost Sensor

Marco Leo; Cosimo Distante
2017

Abstract

The measurement of object dimensions and the detection and localization of external defects are of great importance for many industrial sectors, including agriculture, transportation and production. In this paper we investigate the feasibility of using commercial depth-sensing devices based on time-of-flight technology, such as the Kinect v2 camera, for the measurement and inspection of cuboidal objects (boxes). The paper presents a simplified system using only one Kinect sensor. First, object dimensions are roughly estimated by finding the best-fit planes for a point cloud using a modified version of RANSAC (RANdom SAmple Consensus). The precise geometry and morphology of the objects are then obtained by transforming the points estimated as belonging to the object from the depth representation to the RGB representation. The RGB representation is finally processed (using scanlines on the RGB plane perpendicular to the initial edge estimate) to best approximate the contour of the bounding box. In addition, the paper proposes a method to automatically highlight defects on the objects' surfaces: this inspection task is performed through the analysis of both the 2D object contours and the histogram of the normalized depth values. The proposed methodology takes a few seconds to deliver results for the monitored object and achieved encouraging accuracy: the system measured the dimensions of a set of cuboidal objects with an average error of 5 mm, and it was able to identify and locate defects and holes on lateral and topmost surfaces. The experimental outcomes indicate that the system could be effectively exploited in industrial inspection applications, even more so when the low cost of the system is taken into account.
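The plane-fitting step described in the abstract — discovering the best-fit planes of the box in a point cloud via RANSAC — can be sketched as follows. This is a minimal illustrative sketch of standard RANSAC plane fitting, not the authors' modified variant: the function names, the inlier tolerance `tol`, and the iteration count are assumptions for illustration.

```python
import random

def fit_plane(p1, p2, p3):
    """Plane through three 3D points, as (a, b, c, d) with ax + by + cz + d = 0."""
    # Two edge vectors spanning the candidate plane
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # Unit normal = cross product u x v, normalized
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:
        return None  # degenerate sample: the three points are collinear
    n = [c / norm for c in n]
    d = -sum(n[i] * p1[i] for i in range(3))
    return (n[0], n[1], n[2], d)

def ransac_plane(points, n_iter=200, tol=0.005, seed=0):
    """Return (plane, inliers): the sampled plane with the most points
    within `tol` (here metres) of it, after `n_iter` random trials."""
    rng = random.Random(seed)
    best_plane, best_inliers = None, []
    for _ in range(n_iter):
        plane = fit_plane(*rng.sample(points, 3))
        if plane is None:
            continue
        a, b, c, d = plane
        # Distance of each point to the plane; |ax+by+cz+d| since (a,b,c) is unit
        inliers = [p for p in points
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) < tol]
        if len(inliers) > len(best_inliers):
            best_plane, best_inliers = plane, inliers
    return best_plane, best_inliers
```

In a box-measurement setting one would run this repeatedly, removing each plane's inliers before the next run, so that the dominant surfaces (floor, box faces) are extracted one by one; the face dimensions then follow from the extents of each inlier set.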
Istituto di Scienze Applicate e Sistemi Intelligenti "Eduardo Caianiello" - ISASI
Keywords: Volume measurement; 3D reconstruction; Point cloud; Defect detection; Depth camera
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14243/327746