Knowledge-Driven Robot Program Synthesis from Human VR Demonstrations
by Benjamin Alt, Franklin Kenghagho Kenfack, Andrei Haidu, Darko Katic, Rainer Jäkel and Michael Beetz
Abstract:
Aging societies, labor shortages and increasing wage costs call for assistance robots capable of autonomously performing a wide array of real-world tasks. Such open-ended robotic manipulation requires not only powerful knowledge representation and reasoning (KR&R) algorithms, but also methods for humans to instruct robots which tasks to perform and how to perform them. In this paper, we present a system for automatically generating executable robot control programs from human task demonstrations in virtual reality (VR). We combine common-sense knowledge and game engine-based physics for the semantic interpretation of human VR demonstrations with an expressive, general task representation, automatic path planning, and code generation, all embedded in a state-of-the-art cognitive architecture. We demonstrate our approach in the context of force-sensitive fetch-and-place for a robotic shopping assistant. The source code is available at https://github.com/ease-crc/vr-program-synthesis.
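Illustrative Sketch:
The abstract outlines a pipeline from VR demonstrations to executable robot programs: semantic interpretation of the demonstration, a symbolic task representation, and code generation. The following minimal Python sketch illustrates that flow under strong simplifying assumptions; every name in it (SymbolicAction, interpret_demonstration, generate_program, the pseudo-commands it emits) is hypothetical and does not reflect the authors' actual API or the implementation in the linked repository.

"""Hypothetical sketch of a demonstration-to-program pipeline.
All identifiers are illustrative, not the authors' implementation."""

from dataclasses import dataclass
from typing import Optional


@dataclass
class SymbolicAction:
    # One semantically interpreted step of the demonstration,
    # e.g. grasping an object or placing it at a target location.
    verb: str               # "grasp" or "place"
    obj: str                # object identifier, e.g. "MilkCarton_1"
    target: Optional[str]   # destination for "place", else None


def interpret_demonstration(raw_events: list[dict]) -> list[SymbolicAction]:
    """Map low-level VR events (here: pre-labeled dicts) to symbolic actions.
    The paper performs this step with common-sense knowledge and game
    engine-based physics; this stand-in simply reads event fields."""
    return [
        SymbolicAction(e["verb"], e["object"], e.get("target"))
        for e in raw_events
        if e["verb"] in ("grasp", "place")
    ]


def generate_program(actions: list[SymbolicAction]) -> str:
    """Emit an executable control 'program' (here: textual pseudo-commands)
    from the symbolic task representation."""
    lines = []
    for a in actions:
        if a.verb == "grasp":
            lines.append(f"move_to(object='{a.obj}'); close_gripper()")
        else:  # place
            lines.append(f"move_to(pose='{a.target}'); open_gripper()")
    return "\n".join(lines)


if __name__ == "__main__":
    # A two-event fetch-and-place demonstration, in the spirit of the
    # robotic shopping assistant scenario from the paper.
    demo = [
        {"verb": "grasp", "object": "MilkCarton_1"},
        {"verb": "place", "object": "MilkCarton_1", "target": "ShoppingBasket"},
    ]
    print(generate_program(interpret_demonstration(demo)))

Running the sketch prints two pseudo-commands, one per demonstrated action; in the actual system, this stage additionally involves automatic path planning and produces robot control code rather than text.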
Reference:
Benjamin Alt, Franklin Kenghagho Kenfack, Andrei Haidu, Darko Katic, Rainer Jäkel and Michael Beetz, "Knowledge-Driven Robot Program Synthesis from Human VR Demonstrations", In Proceedings of the 20th International Conference on Principles of Knowledge Representation and Reasoning, IJCAI, Rhodes, Greece, pp. 34–43, September 2023.
Bibtex Entry:
@inproceedings{alt_knowledge-driven_2023,
  title = {Knowledge-{{Driven Robot Program Synthesis}} from {{Human VR Demonstrations}}},
  booktitle = {Proceedings of the 20th {{International Conference}} on {{Principles}} of {{Knowledge Representation}} and {{Reasoning}}},
  author = {Alt, Benjamin and Kenfack, Franklin Kenghagho and Haidu, Andrei and Katic, Darko and J{\"a}kel, Rainer and Beetz, Michael},
  year = {2023},
  month = sep,
  pages = {34--43},
  publisher = {IJCAI},
  address = {Rhodes, Greece},
  doi = {10.24963/kr.2023/4},
  abstract = {Aging societies, labor shortages and increasing wage costs call for assistance robots capable of autonomously performing a wide array of real-world tasks. Such open-ended robotic manipulation requires not only powerful knowledge representations and reasoning (KR\&R) algorithms, but also methods for humans to instruct robots what tasks to perform and how to perform them. In this paper, we present a system for automatically generating executable robot control programs from human task demonstrations in virtual reality (VR). We leverage common-sense knowledge and game engine-based physics to semantically interpret human VR demonstrations, as well as an expressive and general task representation and automatic path planning and code generation, embedded into a state-of-the-art cognitive architecture. We demonstrate our approach in the context of force-sensitive fetch-and-place for a robotic shopping assistant. The source code is available at https://github.com/ease-crc/vr-program-synthesis.},
  copyright = {All rights reserved},
  isbn = {978-1-956792-02-7}
}