DTA

Digital Archive of Theses and electronic final works


Thesis etd-04272017-173356

Thesis type
Perfezionamento
Author
MOSCHETTI, ALESSANDRA
URN
etd-04272017-173356
Title
Wearable sensors for gesture recognition in assisted living applications
Scientific disciplinary sector
ING-IND/34
Course of study
INGEGNERIA - Biorobotics
Committee
Supervisor: Dott. CAVALLO, FILIPPO
Keywords
  • Gesture recognition
  • Sensor fusion
  • Wearable sensors
Defence session start date
22/06/2017
Availability
full
Abstract
Several technological solutions have been proposed to face the future demographic challenge and to guarantee good-quality, sustainable health care services. Among the different devices, wearable sensors and robots are attracting considerable attention. Thanks to their low cost and miniaturisation, the former have been widely investigated in Ambient Assisted Living scenarios to measure physiological and movement parameters, and several wearable sensors are already in daily use. The latter have been proposed in recent years to assist elderly people at home, as they can perform different types of tasks and interact both physically and socially with humans. Robots should be able to assist people at home, helping them with physical activities as well as entertaining and monitoring them.

To increase the abilities of robots, it is important to make them aware of the environment that surrounds them, especially when they work with people. Robots must have strong perception capabilities that allow them to interpret what users are doing, so that they can monitor elderly persons and interact with them appropriately. Wearable sensors can be used to extend the perception abilities of robots, enhancing their monitoring and interaction capabilities.

In this context, this dissertation describes the use of wearable sensors to recognise human movements and gestures that a robot can exploit to monitor people and interact with them.

Regarding monitoring tasks, wearable sensors on the hand were used to recognise daily gestures. Initially, the performance of the wearable sensors alone was investigated by means of extensive experimentation in a realistic environment, to assess whether these sensors can provide useful information. In particular, a mix of hand-oriented gestures and eating modes was chosen, all involving the movement of the hand towards the head or mouth. Despite the similarity of the gestures, adding a sensor worn on the index finger to the commonly used wrist sensor made it possible to recognise the different gestures, with both supervised and unsupervised machine learning algorithms, while keeping obtrusiveness low.
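
As an illustration of this kind of pipeline, the sketch below classifies gesture windows from simple statistical features of wrist- and finger-worn inertial sensors with a supervised classifier. The windowing, the mean/standard-deviation features, the SVM choice and the synthetic data are assumptions made for the example, not the exact features, classifiers or data used in the dissertation.

    # Minimal sketch: supervised recognition of hand-to-mouth gestures from
    # wrist + index-finger inertial sensors (illustrative, not the thesis pipeline).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def window_features(acc, gyr):
        # Mean and standard deviation per axis for one window of accelerometer and gyroscope data.
        return np.concatenate([acc.mean(axis=0), acc.std(axis=0),
                               gyr.mean(axis=0), gyr.std(axis=0)])

    # Synthetic stand-in data: 200 windows of 128 samples, 3-axis accelerometer and gyroscope
    # for each of the two sensors (wrist and index finger), with 4 hypothetical gesture classes.
    rng = np.random.default_rng(0)
    X = np.stack([
        np.concatenate([
            window_features(rng.normal(size=(128, 3)), rng.normal(size=(128, 3))),  # wrist sensor
            window_features(rng.normal(size=(128, 3)), rng.normal(size=(128, 3))),  # finger sensor
        ])
        for _ in range(200)
    ])
    y = rng.integers(0, 4, size=200)  # e.g. drinking, eating, answering the phone, brushing teeth

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # supervised classifier on the fused features
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))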

The use of the same sensors was then evaluated in a system composed of the wearable sensors and a depth camera mounted on a mobile robot, showing how sensors placed on the user can help the robot improve its perception in more realistic conditions. Experiments were performed with the moving robot placed in front of and to the side of the user, which added noise to the data and occluded the user's dominant arm. The wearable sensors on the wrist and index finger provided useful information that increased the accuracy in distinguishing among ten different activities, also overcoming the occlusion problem that can affect vision sensors.
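
A minimal sketch of one possible feature-level fusion is given below: features from the depth camera's skeleton tracking are concatenated with the wearable-sensor features, and the skeleton part is simply zero-filled when the dominant arm is occluded. The feature dimensions, the zero-filling strategy and the random-forest classifier are illustrative assumptions, not the fusion method actually used in the dissertation.

    # Minimal sketch: fusing depth-camera skeleton features with wearable-sensor features,
    # tolerating occlusion of the tracked arm (illustrative assumptions throughout).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    SKELETON_DIM = 20 * 3   # hypothetical: 20 skeleton joints, 3-D positions from the depth camera
    IMU_DIM = 2 * 12        # hypothetical: wrist + finger sensors, 12 statistical features each

    def fuse(skeleton_feat, imu_feat):
        # Concatenate camera and wearable features; zero-fill the skeleton part when the arm is occluded.
        if skeleton_feat is None:
            skeleton_feat = np.zeros(SKELETON_DIM)
        return np.concatenate([skeleton_feat, imu_feat])

    # Synthetic stand-in data: 300 windows, 10 hypothetical activity classes,
    # with roughly a third of the windows affected by occlusion.
    rng = np.random.default_rng(1)
    X, y = [], []
    for _ in range(300):
        skeleton = None if rng.random() < 0.3 else rng.normal(size=SKELETON_DIM)
        X.append(fuse(skeleton, rng.normal(size=IMU_DIM)))
        y.append(int(rng.integers(0, 10)))
    X, y = np.array(X), np.array(y)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[:240], y[:240])
    print("held-out accuracy:", clf.score(X[240:], y[240:]))

The key point of the sketch is that the wearable features remain informative even when the camera loses the arm, which is why the fused classifier can keep distinguishing the activities under occlusion.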

Finally, the feasibility and perceived usability of a human-robot interaction task carried out by means of wearable sensors were investigated. In particular, wearable sensors on the feet were used to estimate human gait parameters in real time, which were then used to control and modulate the robot's motion in two different tasks. Tests were carried out with users, who gave a positive evaluation of the system's performance. In this case, the wearable sensors allowed the robot to move according to the user's movements, without any physical link between the robot and the person.
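
The sketch below illustrates one way such a loop could be closed: a cadence estimate from a foot-worn accelerometer is mapped to a forward-speed command for the robot. The peak-based step detection, the fixed step length and the linear cadence-to-speed mapping are assumptions made for the example, not the gait-analysis or control method used in the dissertation.

    # Minimal sketch: turning a gait parameter (cadence) from a foot-worn sensor into a
    # robot speed command (illustrative parameters and mapping).
    import numpy as np

    FS = 100.0              # sampling rate of the foot sensor in Hz (assumed)
    STEP_LENGTH = 0.65      # assumed average step length in metres
    MAX_ROBOT_SPEED = 1.0   # hypothetical speed limit of the robot in m/s

    def estimate_cadence(acc_mag, fs=FS, threshold=12.0, min_gap_s=0.4):
        # Count steps as local maxima of the acceleration magnitude above a threshold,
        # at least min_gap_s apart, and convert to steps per minute.
        steps, last = 0, -np.inf
        for i in range(1, len(acc_mag) - 1):
            if acc_mag[i] > threshold and acc_mag[i - 1] <= acc_mag[i] >= acc_mag[i + 1]:
                if (i - last) / fs >= min_gap_s:
                    steps += 1
                    last = i
        duration_min = len(acc_mag) / fs / 60.0
        return steps / duration_min

    def speed_command(cadence_spm):
        # Map cadence (steps/min) to a forward-speed command, clipped to the robot's limit.
        walking_speed = cadence_spm / 60.0 * STEP_LENGTH
        return float(np.clip(walking_speed, 0.0, MAX_ROBOT_SPEED))

    # Synthetic 10-second acceleration-magnitude trace with sharp peaks at about 1.8 steps/s.
    t = np.arange(0.0, 10.0, 1.0 / FS)
    acc_mag = (9.81 + 4.0 * np.maximum(0.0, np.sin(2 * np.pi * 1.8 * t)) ** 8
               + np.random.default_rng(2).normal(0.0, 0.2, t.size))
    cadence = estimate_cadence(acc_mag)
    print(f"cadence ~ {cadence:.0f} steps/min -> speed command {speed_command(cadence):.2f} m/s")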

The implementation and evaluation of these monitoring and interaction tasks show how wearable sensors can increase the amount of information that a robot can perceive about its users. This dissertation is, therefore, a first step towards a system composed of wearable sensors and a robot that can help people in daily life.