Gao, Shan, Tan, Ah-Hwee and Setchi, Rossi (ORCID: https://orcid.org/0000-0002-7207-6544) 2021. Learning ADL daily routines with spatiotemporal neural networks. IEEE Transactions on Knowledge and Data Engineering 33 (1), pp. 143-153. DOI: 10.1109/TKDE.2019.2924623
PDF - Accepted Post-Print Version (393 kB)
Abstract
The activities of daily living (ADLs) are the activities performed by individuals on a daily basis and serve as indicators of a person's habits, lifestyle, and wellbeing. Learning an individual's ADL daily routines has significant value in the healthcare domain. Specifically, ADL recognition and inter-ADL pattern learning problems have been studied extensively over the past couple of decades. However, discovering the ADL patterns performed in a day and clustering them into ADL daily routines has remained a relatively unexplored research area. In this paper, a self-organizing neural network model, called the Spatiotemporal ADL Adaptive Resonance Theory (STADLART) network, is proposed for learning ADL daily routines. STADLART integrates multimodal contextual information covering the time and space in which each ADL is performed. By encoding spatiotemporal information explicitly as input features, STADLART enables the learning of time-sensitive knowledge. Moreover, a STADLART variation named STADLART-NC is proposed to normalize and customize ADL weighting for daily routine learning. Empirical experiments using both synthetic and real-world public data sets validate the performance of STADLART and STADLART-NC against alternative pattern discovery methods. The results show that STADLART clusters ADL daily routines more accurately than the baseline algorithms.
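For readers unfamiliar with Adaptive Resonance Theory (ART), the family of self-organizing networks on which STADLART builds, the sketch below illustrates the generic category-learning loop over spatiotemporal ADL feature vectors. This is a minimal Fuzzy ART sketch, not the authors' STADLART implementation: the feature encoding, parameter values, and all names (`FuzzyART`, `complement_code`, the example event tuples) are illustrative assumptions.

```python
import numpy as np

# Minimal Fuzzy ART sketch (generic ART clustering, NOT the paper's STADLART code).
# Assumption: each ADL event is encoded as a spatiotemporal feature vector in
# [0, 1], e.g. (time-of-day, location id, activity id), all normalized.

def complement_code(x):
    # Complement coding doubles the vector so categories learn hyper-rectangles.
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    def __init__(self, vigilance=0.8, alpha=0.001, beta=1.0):
        self.rho = vigilance   # match threshold: higher -> more, tighter clusters
        self.alpha = alpha     # choice parameter
        self.beta = beta       # learning rate (1.0 = fast learning)
        self.weights = []      # one template vector per learned category

    def learn(self, x):
        x = complement_code(np.asarray(x, dtype=float))
        # Rank categories by the choice function T_j = |x ^ w_j| / (alpha + |w_j|),
        # where ^ is the element-wise minimum (fuzzy AND).
        scores = [np.minimum(x, w).sum() / (self.alpha + w.sum())
                  for w in self.weights]
        for j in np.argsort(scores)[::-1]:
            w = self.weights[j]
            # Vigilance test: the winning template must cover the input well enough.
            if np.minimum(x, w).sum() / x.sum() >= self.rho:
                # Resonance: update the template toward the input.
                self.weights[j] = self.beta * np.minimum(x, w) + (1 - self.beta) * w
                return j
        # No category resonates: commit a new one initialized to the input.
        self.weights.append(x.copy())
        return len(self.weights) - 1

# Usage: cluster one day's ADL events, encoded here as
# (hour/24, room/#rooms, activity/#activities) -- a hypothetical encoding.
art = FuzzyART(vigilance=0.8)
day = [(7/24, 0/5, 1/10), (8/24, 1/5, 2/10), (19/24, 1/5, 2/10)]
labels = [art.learn(event) for event in day]
print(labels)  # cluster assignments; higher vigilance yields more, finer clusters
```

Because time of day is an explicit input dimension, the same activity performed in the morning and in the evening can fall into different categories, which is the intuition behind encoding spatiotemporal context as input features.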
| Item Type: | Article |
|---|---|
| Date Type: | Publication |
| Status: | Published |
| Schools: | Engineering |
| Publisher: | IEEE |
| ISSN: | 1041-4347 |
| Date of First Compliant Deposit: | 14 June 2019 |
| Date of Acceptance: | 7 June 2019 |
| Last Modified: | 02 Dec 2024 08:15 |
| URI: | https://orca.cardiff.ac.uk/id/eprint/123460 |
Citation Data
Cited 4 times in Scopus.