TY - GEN
T1 - Optimal Use of Verbal Instructions for Multi-robot Human Navigation Guidance
AU - Yedidsion, Harel
AU - Deans, Jacqueline
AU - Sheehan, Connor
AU - Chillara, Mahathi
AU - Hart, Justin
AU - Stone, Peter
AU - Mooney, Raymond J.
N1 - Funding Information:
This work has taken place in the Learning Agents Research Group (LARG) at UT Austin. LARG research is supported in part by NSF (IIS-1637736, CPS-1739964, IIS-1724157), ONR (N00014-18-2243), FLI (RFP2-000), ARL, DARPA, and Lockheed Martin. Peter Stone serves on the Board of Directors of Cogitai, Inc. The terms of this arrangement have been reviewed and approved by the University of Texas at Austin in accordance with its policy on objectivity in research.
PY - 2019
Y1 - 2019
N2 - Efficiently guiding humans in indoor environments is a challenging open problem. Due to recent advances in mobile robotics and natural language processing, it has become possible to consider doing so with the help of mobile, verbally communicating robots. In the past, stationary verbal robots have been used for this purpose at Microsoft Research, and mobile non-verbal robots have been used at UT Austin in their multi-robot human guidance system. This paper extends that mobile multi-robot human guidance research by adding the element of natural language instructions, which are dynamically generated based on the robots’ path planner, and by implementing and testing the system on real robots. Generating natural language instructions from the robots’ plan opens up a variety of optimization opportunities, such as deciding where to place the robots, where to lead humans, and where to verbally instruct them. We present experimental results of the full multi-robot human guidance system and show that it is more effective than two baseline systems: one which only provides humans with verbal instructions, and another which only uses a single robot to lead users to their destinations.
KW - Human robot interaction
KW - Indoor navigation
KW - Multi robot coordination
KW - Natural language
UR - http://www.scopus.com/inward/record.url?scp=85076545068&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85076545068&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-35888-4_13
DO - 10.1007/978-3-030-35888-4_13
M3 - Conference contribution
AN - SCOPUS:85076545068
SN - 9783030358877
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 133
EP - 143
BT - Social Robotics - 11th International Conference, ICSR 2019, Proceedings
A2 - Salichs, Miguel A.
A2 - Ge, Shuzhi Sam
A2 - Barakova, Emilia Ivanova
A2 - Cabibihan, John-John
A2 - Wagner, Alan R.
A2 - Castro-González, Álvaro
A2 - He, Hongsheng
PB - Springer
T2 - 11th International Conference on Social Robotics, ICSR 2019
Y2 - 26 November 2019 through 29 November 2019
ER -