Multi-Sensor Mobile Platform for the Recognition of Activities of Daily Living and their Environments based on Artificial Neural Networks

Ivan Miguel Pires, Nuno Pombo, Nuno M. Garcia, Francisco Flórez-Revuelta

Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence

The recognition of Activities of Daily Living (ADL) and their environments based on sensors available in off-the-shelf mobile devices is an emerging topic. These devices are capable of acquiring and processing sensor data for the correct recognition of ADL and their environments, providing fast and reliable feedback to the user. However, the methods implemented in a mobile application for this purpose must be adapted to the limited resources of these devices. This paper demonstrates a mobile application that implements a framework organised into several modules, including data acquisition, data processing, data fusion and classification methods, based on the sensors' data acquired from the accelerometer, gyroscope, magnetometer, microphone and Global Positioning System (GPS) receiver. The framework adapts to the number of sensors available in the mobile device and performs classification with Deep Neural Networks (DNN), reporting an accuracy between 58.02% and 89.15%.
Keywords:
Multidisciplinary Topics and Applications: Computational Biology and e-Health
Multidisciplinary Topics and Applications: Human-Computer Interaction
Planning and Scheduling: Activity and Plan Recognition
Multidisciplinary Topics and Applications: AI and Ubiquitous Computing Systems
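The modular pipeline described in the abstract (per-sensor acquisition and processing, fusion of the resulting features, and DNN-based classification) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the feature set, the fusion-by-concatenation step, the window length, the class count, and all names (`extract_features`, `fuse`, `TinyDNN`) are assumptions, and the network is untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(signal):
    # Data processing: simple statistical features per sensor window
    # (mean, standard deviation, minimum, maximum).
    return np.array([signal.mean(), signal.std(), signal.min(), signal.max()])

def fuse(feature_vectors):
    # Data fusion: concatenate per-sensor feature vectors into one input.
    return np.concatenate(feature_vectors)

class TinyDNN:
    # Stand-in for the DNN classifier: one hidden layer, forward pass only.
    # Weights are random here; a real system would train them on labelled ADL data.
    def __init__(self, n_in, n_hidden, n_classes):
        self.w1 = rng.normal(size=(n_in, n_hidden))
        self.w2 = rng.normal(size=(n_hidden, n_classes))

    def predict(self, x):
        h = np.maximum(0, x @ self.w1)  # ReLU activation
        logits = h @ self.w2
        return int(np.argmax(logits))   # index of the predicted ADL class

# Hypothetical fixed-length windows from three inertial sensors
# (accelerometer, gyroscope, magnetometer), simulated with noise.
acc = rng.normal(size=250)
gyro = rng.normal(size=250)
mag = rng.normal(size=250)

x = fuse([extract_features(s) for s in (acc, gyro, mag)])
model = TinyDNN(n_in=x.size, n_hidden=16, n_classes=5)
prediction = model.predict(x)
```

Fusion by feature concatenation keeps the design modular: if a device lacks a sensor, its feature vector is simply omitted and the input dimension shrinks, which matches the paper's point that the framework is a function of the sensors available on the device.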