In this paper, we introduce an Asynchronous Multiview Learning (AML) approach that enables accurate transfer of activity classification models across asynchronous sensor views. Our study is motivated by the highly dynamic nature of health monitoring with wearable sensors. Such dynamics include changes in the sensing platform (e.g., a sensor upgrade) and in platform settings (e.g., sampling frequency, on-body sensor location), which cause machine learning algorithms to fail unless they are retrained in the new setting. Our approach allows machine learning algorithms to reconfigure automatically, without any labeled training data in the new setting. Our evaluation on real data collected with wearable motion sensors demonstrates that the average classification accuracy using our automatically labeled training data is 85.2%. This accuracy is only 3.4% to 4.5% below the experimental upper bound, in which ground-truth labeled training data are used to develop a new activity recognition classifier.
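The core idea of transferring a model across sensor views without new labels can be illustrated with a minimal pseudo-labeling sketch. This is a hypothetical simplification, not the paper's AML algorithm: a classifier trained on the source view labels unlabeled target-view data, and a fresh classifier is then trained on those automatic labels. All names, the toy features, and the nearest-centroid model are assumptions for illustration.

```python
import random

def centroid_classifier(X, y):
    """Train a nearest-centroid classifier; return a predict function."""
    by_label = {}
    for features, label in zip(X, y):
        by_label.setdefault(label, []).append(features)
    centroids = {
        label: [sum(col) / len(rows) for col in zip(*rows)]
        for label, rows in by_label.items()
    }
    def predict(x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(centroids, key=lambda lab: sq_dist(centroids[lab]))
    return predict

# Toy data: two well-separated "activities" in a 2-D feature space (assumed).
random.seed(0)
source_X = [[random.gauss(m, 0.3), random.gauss(m, 0.3)]
            for m in (0.0, 2.0) for _ in range(20)]
source_y = ["walk"] * 20 + ["run"] * 20

# Unlabeled target-view data (e.g., after a sensor upgrade), assumed here to
# lie in a roughly aligned feature space for the sake of the toy example.
target_X = [[random.gauss(m, 0.3), random.gauss(m, 0.3)]
            for m in (0.0, 2.0) for _ in range(20)]

# Step 1: source-view model produces automatic (pseudo) labels for the target view.
source_model = centroid_classifier(source_X, source_y)
pseudo_labels = [source_model(x) for x in target_X]

# Step 2: retrain in the target view using only the automatically generated labels.
target_model = centroid_classifier(target_X, pseudo_labels)
print(target_model([2.0, 2.1]))
```

In this toy setup the target classifier never sees ground-truth labels, mirroring the paper's premise that labeled training data are unavailable in the new setting; the accuracy gap relative to training on true labels corresponds to the 3.4%–4.5% difference reported in the abstract.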
|Title of host publication
|2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2016
|Institute of Electrical and Electronics Engineers Inc.
|Published - Oct 13 2016
|38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2016 - Orlando, United States
Duration: Aug 16 2016 → Aug 20 2016
|Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
ASJC Scopus subject areas
- Signal Processing
- Biomedical Engineering
- Computer Vision and Pattern Recognition
- Health Informatics