Robot learning of manipulation activities with overall planning through precedence graph

Xin Ye, Zhe Lin, Yezhou Yang

Research output: Contribution to journal › Article › peer-review



One critical aspect of robotic visual learning is capturing the precedence relations among primitive actions by observing humans performing manipulation activities. Current state-of-the-art spatial–temporal representations do not fully capture these precedence relations. In this paper, we present a novel activity representation, the Manipulation Precedence Graph (MPG), and its associated overall planning module, with the goal of enabling robots to learn manipulation activities from human demonstrations with overall planning. Experiments conducted on three publicly available manipulation activity video corpora, as well as on a simulation platform, validate that (1) the MPG generated by our system is robust to noisy detections from the perception modules; (2) the overall planning module is able to generate parallel action sequences that the robot can execute concurrently; and (3) the overall system improves robots' manipulation execution efficiency.
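The abstract describes planning parallel action sequences from a precedence graph. The paper's own MPG construction and planner are not detailed here, but the core idea of layering a precedence DAG into batches of mutually unordered actions can be sketched as follows; the action names and the `parallel_levels` helper are hypothetical illustrations, not the authors' implementation.

```python
from collections import defaultdict

def parallel_levels(actions, edges):
    """Group actions into levels; actions within a level share no
    precedence constraints and can be executed in parallel."""
    indeg = {a: 0 for a in actions}
    succ = defaultdict(list)
    for before, after in edges:  # edge (u, v): u must precede v
        succ[before].append(after)
        indeg[after] += 1
    levels = []
    ready = [a for a in actions if indeg[a] == 0]
    while ready:
        levels.append(sorted(ready))
        nxt = []
        for a in ready:  # releasing a finished action unblocks successors
            for b in succ[a]:
                indeg[b] -= 1
                if indeg[b] == 0:
                    nxt.append(b)
        ready = nxt
    return levels

# Hypothetical manipulation activity: the three "get" actions have no
# precedence constraints among them, so they form one parallel batch.
actions = ["get_bread", "get_knife", "get_jam", "spread_jam", "close_sandwich"]
edges = [("get_bread", "spread_jam"), ("get_knife", "spread_jam"),
         ("get_jam", "spread_jam"), ("spread_jam", "close_sandwich")]
print(parallel_levels(actions, edges))
# → [['get_bread', 'get_jam', 'get_knife'], ['spread_jam'], ['close_sandwich']]
```

This level-by-level scheduling is why a precedence graph can improve execution efficiency over a single serialized action sequence: independent primitives are dispatched together rather than one after another.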

Original language: English (US)
Pages (from-to): 126-135
Number of pages: 10
Journal: Robotics and Autonomous Systems
State: Published - Jun 2019


Keywords

  • AI and robotics
  • Intelligent systems
  • Manipulation precedence graph
  • Understanding human activities

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • General Mathematics
  • Computer Science Applications


