TY - GEN
T1 - mmLock
T2 - 32nd International Conference on Computer Communications and Networks, ICCCN 2023
AU - Xu, Jiawei
AU - Bi, Ziqian
AU - Singha, Amit
AU - Li, Tao
AU - Chen, Yimin
AU - Zhang, Yanchao
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - The use of smart devices such as smartphones, tablets, and laptops has skyrocketed in the last decade. These devices enable ubiquitous applications for entertainment, communication, productivity, and healthcare, but also introduce serious concerns about user privacy and data security. In addition to various authentication techniques, automatic and immediate device locking based on user-leaving detection is an indispensable way to secure these devices. Current user-leaving detection techniques mainly rely on acoustic ranging and do not work well in environments with multiple moving objects. In this paper, we present mmLock, a system that enables faster and more accurate user-leaving detection in dynamic environments. mmLock uses a mmWave FMCW radar to capture the user's 3D mesh and detects the leaving gesture from the 3D human mesh data with a hybrid PointNet-LSTM model. Based on explainable user point clouds, mmLock is more robust than existing gesture recognition systems, which can only identify raw signal patterns. We implement and evaluate mmLock with a commercial off-the-shelf (COTS) TI mmWave radar in multiple environments and scenarios. We train the PointNet-LSTM model on over 1 TB of mmWave signal data and achieve a 100% true-positive rate in most scenarios.
AB - The use of smart devices such as smartphones, tablets, and laptops has skyrocketed in the last decade. These devices enable ubiquitous applications for entertainment, communication, productivity, and healthcare, but also introduce serious concerns about user privacy and data security. In addition to various authentication techniques, automatic and immediate device locking based on user-leaving detection is an indispensable way to secure these devices. Current user-leaving detection techniques mainly rely on acoustic ranging and do not work well in environments with multiple moving objects. In this paper, we present mmLock, a system that enables faster and more accurate user-leaving detection in dynamic environments. mmLock uses a mmWave FMCW radar to capture the user's 3D mesh and detects the leaving gesture from the 3D human mesh data with a hybrid PointNet-LSTM model. Based on explainable user point clouds, mmLock is more robust than existing gesture recognition systems, which can only identify raw signal patterns. We implement and evaluate mmLock with a commercial off-the-shelf (COTS) TI mmWave radar in multiple environments and scenarios. We train the PointNet-LSTM model on over 1 TB of mmWave signal data and achieve a 100% true-positive rate in most scenarios.
UR - http://www.scopus.com/inward/record.url?scp=85173579399&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85173579399&partnerID=8YFLogxK
U2 - 10.1109/ICCCN58024.2023.10230151
DO - 10.1109/ICCCN58024.2023.10230151
M3 - Conference contribution
AN - SCOPUS:85173579399
T3 - Proceedings - International Conference on Computer Communications and Networks, ICCCN
BT - ICCCN 2023 - 2023 32nd International Conference on Computer Communications and Networks
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 24 July 2023 through 27 July 2023
ER -