TY - GEN
T1 - Synchronous dynamic view learning
T2 - 16th ACM/IEEE International Conference on Information Processing in Sensor Networks, IPSN 2017
AU - Rokni, Seyed Ali
AU - Ghasemzadeh, Hassan
N1 - Funding Information:
The authors would like to thank the shepherd Dr. Shahriar Nirjon and anonymous reviewers for providing valuable feedback. This work was supported in part by the National Science Foundation, under grant CNS-1566359. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding organizations.
Publisher Copyright:
© 2017 ACM.
PY - 2017/4/18
Y1 - 2017/4/18
N2 - Wearable technologies play a central role in human-centered Internet-of-Things applications. Wearables leverage machine learning algorithms to detect events of interest, such as physical activities and medical complications. These algorithms, however, need to be retrained upon any change in the system's configuration, such as the addition or removal of a sensor, or the displacement, misplacement, or mis-orientation of physical sensors on the body. We challenge this retraining model by pursuing the vision of autonomous learning, with the goal of eliminating the labor-intensive, time-consuming, and highly expensive process of collecting labeled training data in dynamic environments. We propose an approach for autonomous retraining of machine learning algorithms in real-time, without the need for any new labeled training data. We focus on a dynamic setting in which new sensors are added to the system and worn on various body locations. We capture the inherent correlation between observations made by a static sensor view, for which trained algorithms exist, and the new dynamic sensor views, for which an algorithm needs to be developed. By applying our real-time dynamic-view autonomous learning approach, we achieve an average accuracy of 81.1% in activity recognition across three experimental datasets. This represents an improvement of more than 13.8% in accuracy, attributable to the automatic labeling of data from the newly added sensor, and is only 11.2% lower than the experimental upper bound in which labeled training data are collected with the new sensor.
KW - IoT
KW - Machine learning
KW - Signal processing
KW - Wearable sensors
UR - http://www.scopus.com/inward/record.url?scp=85019047946&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019047946&partnerID=8YFLogxK
U2 - 10.1145/3055031.3055087
DO - 10.1145/3055031.3055087
M3 - Conference contribution
AN - SCOPUS:85019047946
T3 - Proceedings - 2017 16th ACM/IEEE International Conference on Information Processing in Sensor Networks, IPSN 2017
SP - 79
EP - 90
BT - Proceedings - 2017 16th ACM/IEEE International Conference on Information Processing in Sensor Networks, IPSN 2017
PB - Association for Computing Machinery, Inc
Y2 - 18 April 2017 through 20 April 2017
ER -