Vision devices and intelligent systems for monitoring the well-being of humans in healthcare and manufacturing / Romeo, Laura. - (2024 Jan 11).
Vision devices and intelligent systems for monitoring the well-being of humans in healthcare and manufacturing
Romeo, Laura
2024
Abstract
The present PhD research explores the integration of vision devices and intelligent systems to monitor and enhance human well-being in healthcare and manufacturing contexts, starting from the standards proposed in Industry 4.0 and aiming to follow the principles of the novel Industry 5.0. Depth sensors and deep learning technologies have been exploited to address the critical aspects of human mobility assessment and action segmentation in real, non-simulated scenarios. The Microsoft Azure Kinect, a state-of-the-art depth sensor, has been selected as a key instrument for data collection, and innovative camera calibration methods have been developed to ensure the accuracy and reliability of the gathered data.

Within the realm of healthcare, the research activity addresses the substantial challenges posed by neurodegenerative diseases to the well-being of older individuals. This part of the study focuses on monitoring and assessing the mobility of elderly patients, aiming to support remote diagnosis and improve their quality of life. Traditional mobility tests, administered by healthcare professionals, are essential for evaluating movement skills. Nevertheless, such techniques often suffer from human subjectivity, which can lead to errors in the assessments. To address such issues, video-based systems have been studied, aiming to remotely monitor and objectively evaluate mobility while reducing the burden on elderly patients.

In the context of manufacturing, human actions are pivotal in enhancing operational efficiency, productivity, and safety. Such challenges have led to the increasing use of industrial robotic solutions, mainly collaborative robots, which can share a common workspace with humans, carrying out their respective tasks simultaneously. This part of the research delves into the segmentation of human tasks for intelligent manufacturing systems, exploring the integration of vision devices and deep learning technologies to improve the efficiency and accuracy of manufacturing processes. In general, the study of such systems aims to create comfortable work environments, adaptable to the needs and abilities of individual people, increasing the well-being of operators in a human-centered factory concept.

The main goal of the present study is to evaluate the effectiveness of machine learning and deep learning models for mobility assessment and action segmentation, in order to determine their suitability for human monitoring. However, a notable gap in the literature is identified: the absence of datasets representing human actions in realistic environments. To bridge this gap, the research includes the creation and validation of datasets capturing human actions in healthcare and manufacturing scenarios, emphasizing the importance of generalization across different locations. By addressing the unique challenges in both healthcare and manufacturing, this study contributes to the development of intelligent systems that promote human well-being and enhance operational efficiency, aiming to align with the paradigms of Industry 5.0.


