All-terrain estimation for mobile robots in precision agriculture

Milella A
2018-01-01

Abstract

This paper presents a novel multi-sensor terrain classification approach that uses visual and proprioceptive data to support autonomous operations by an agricultural vehicle. The novelty of the proposed method lies in identifying the terrain type based not only on classical appearance-based features, such as color and geometric properties, but also on contact-based features, which measure the dynamic effects of the vehicle-terrain interaction and directly affect the vehicle's mobility. Using machine learning methods, it is shown not only that various kinds of terrain can be classified with either sensor modality, but also that the two modalities are complementary and can therefore be combined to improve classification results.
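
To make the sensor-combination idea concrete, the sketch below shows one common way to fuse two modality-specific classifiers (late fusion by posterior averaging). It is an illustration only: the record does not specify the features, the classifier, or the fusion rule, so the per-modality SVMs, the averaging step, and all variable names and dimensions are assumptions.

# Minimal late-fusion sketch, not the paper's implementation: the classifier
# (SVM), the fusion rule (posterior averaging), and all data shapes and names
# below are assumptions made for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 200

# Stand-ins for the two modalities described in the abstract:
X_visual = rng.normal(size=(n_samples, 8))   # appearance features (e.g. color, geometry)
X_contact = rng.normal(size=(n_samples, 4))  # proprioceptive features (vehicle-terrain interaction)
y = rng.integers(0, 3, size=n_samples)       # hypothetical terrain class labels

# One classifier per sensor modality.
clf_visual = SVC(probability=True).fit(X_visual, y)
clf_contact = SVC(probability=True).fit(X_contact, y)

# Late fusion: average the per-class posterior probabilities of the two
# modality-specific classifiers, then pick the most likely terrain class.
proba = 0.5 * clf_visual.predict_proba(X_visual) + 0.5 * clf_contact.predict_proba(X_contact)
y_fused = clf_visual.classes_[np.argmax(proba, axis=1)]

Posterior averaging is only one reading of "combined"; early fusion, i.e. concatenating the two feature vectors before training a single classifier, would be an equally plausible sketch.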
Istituto di Sistemi e Tecnologie Industriali Intelligenti per il Manifatturiero Avanzato - STIIMA (ex ITIA)
Keywords

All-terrain estimation; mobile robots; precision agriculture

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/346383