====== Google Summer of Code 2018 ======
~~NOTOC~~
In the following we briefly present the [[#

For the **proposed topics** see [[#

For **Q/A** check out our [[https://


===== Software =====
===== pracmln =====
and tutorials that facilitate getting started with MLNs. It is provided as a pip
package in the Python package index ([[https://
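
The weighted-formula semantics behind Markov logic networks can be illustrated with a tiny hand-computed example. This is only a sketch of the underlying model, not the pracmln API; the predicates and the weight are made up for illustration:

```python
import math
from itertools import product

# A tiny MLN over two ground atoms, Smokes(A) and Cancer(A),
# with one weighted formula:
#   w = 1.5 :  Smokes(A) => Cancer(A)
# A world's probability is P(x) = exp(sum_i w_i * n_i(x)) / Z,
# where n_i counts the true groundings of formula i in world x.
w = 1.5

def n_satisfied(smokes, cancer):
    # Number of true groundings of the implication (0 or 1 here).
    return int((not smokes) or cancer)

worlds = list(product([False, True], repeat=2))          # (smokes, cancer)
scores = {x: math.exp(w * n_satisfied(*x)) for x in worlds}
Z = sum(scores.values())                                 # partition function
probs = {x: s / Z for x, s in scores.items()}

# The only world violating the implication, (Smokes=True, Cancer=False),
# receives the lowest probability mass.
assert probs[(True, False)] == min(probs.values())
```

Learning in an MLN toolbox then amounts to fitting the weights `w_i` from data, and inference to computing marginals of such distributions without enumerating all worlds.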


===== RoboSherlock -- Framework for Cognitive Perception =====

RoboSherlock is a common framework for cognitive perception, based on the principle of unstructured information management (UIM). UIM has proven to be a powerful paradigm for scaling intelligent information and question-answering systems towards real-world complexity (e.g. the Watson system from IBM). Complexity in UIM is handled by identifying (or hypothesizing) pieces of structured information in unstructured documents, by applying ensembles of experts for annotating information pieces, and by testing and integrating these isolated annotations into a comprehensive interpretation of the document.

RoboSherlock builds on top of the ROS ecosystem and is able to wrap almost any existing perception algorithm/
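
The UIM "ensemble of experts" idea can be sketched as a chain of annotators operating on a shared analysis container. All names here are hypothetical; RoboSherlock's actual implementation builds on UIMA in C++ and ROS:

```python
# Sketch of the UIM annotator-ensemble pattern (hypothetical names;
# RoboSherlock's real implementation builds on UIMA/C++ and ROS).

class CAS:
    """Common Analysis Structure: raw data plus accumulated annotations."""
    def __init__(self, data):
        self.data = data
        self.annotations = []

def color_annotator(cas):
    # Expert 1: hypothesize a color from the raw data.
    if "red" in cas.data:
        cas.annotations.append(("color", "red"))

def shape_annotator(cas):
    # Expert 2: hypothesize a shape, independently of the other experts.
    if "round" in cas.data:
        cas.annotations.append(("shape", "sphere"))

def run_pipeline(cas, annotators):
    # Each expert contributes isolated annotations; a later step would
    # test and merge them into one comprehensive interpretation.
    for annotate in annotators:
        annotate(cas)
    return cas

cas = run_pipeline(CAS("a red round object"), [color_annotator, shape_annotator])
assert ("color", "red") in cas.annotations
```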

===== openEASE -- Web-based Robot Knowledge Service =====

openEASE is a generic knowledge database for collecting and analyzing experiment data. Its foundation is the KnowRob knowledge processing system and ROS, enhanced by reasoning mechanisms and a web interface developed for inspecting comprehensive experiment logs. These logs can be recorded, for example, from complex CRAM plan executions, virtual reality experiments,

The openEASE web interface, as well as further information and publication material, can be accessed through its publicly available [[http://

===== RobCoG -- Robot Commonsense Games =====

[[http://

The games are split into two categories: (1) VR/full-body tracking with physics-based interactions,

===== CRAM -- Cognition-enabled Robot Executive =====

CRAM is a software toolbox for the design, implementation and deployment of cognition-enabled plan execution on autonomous robots. CRAM equips autonomous robots with lightweight reasoning mechanisms that can infer control decisions rather than requiring those decisions to be preprogrammed. In this way, CRAM-programmed autonomous robots are more flexible and general than control programs that lack such cognitive capabilities. CRAM does not require the whole reasoning domain to be stated explicitly in an abstract knowledge base; rather, it grounds symbolic expressions into the perception and actuation routines and into the essential data structures of the control plans. CRAM includes a domain-specific language that makes writing reactive concurrent robot behavior easier for the programmer. It extensively uses the ROS middleware infrastructure.

CRAM is an open-source project hosted on [[https://
[[http://
and tutorials that help to get started.

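The grounding of symbolic descriptions into concrete data that CRAM performs can be sketched as follows. All names are hypothetical and the real CRAM DSL is embedded in Common Lisp; this is only an illustration of the idea:

```python
# Sketch of CRAM-style "designators": symbolic descriptions that are
# resolved to concrete values only at execution time.
# (Hypothetical names; the real CRAM DSL is embedded in Common Lisp.)

# A tiny "perception result" standing in for the robot's sensor data.
PERCEIVED_OBJECTS = [
    {"type": "cup", "color": "blue", "pose": (0.4, 0.1, 0.8)},
    {"type": "bowl", "color": "white", "pose": (0.6, -0.2, 0.8)},
]

def resolve(designator):
    """Ground a symbolic description in the current perception results."""
    for obj in PERCEIVED_OBJECTS:
        if all(obj.get(key) == value for key, value in designator.items()):
            return obj
    raise LookupError(f"no object matching {designator}")

# The plan only talks about "an object of type cup"; the concrete pose
# is inferred at run time rather than preprogrammed.
cup = resolve({"type": "cup"})
assert cup["pose"] == (0.4, 0.1, 0.8)
```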
===== Proposed Topics =====
**Requirements:
language (CPython/
(ideally SRL techniques and logic). Knowledge of C/C++ will be very helpful.

**Expected Results:** The core components of pracmln, i.e. the learning
**Contact:

**Remarks:

==== Topic 2: Flexible perception pipeline manipulation for RoboSherlock ====

{{ :

**Main Objective:

**Task Difficulty:

**Requirements:

**Expected Results:** an extension to RoboSherlock that allows splitting and joining pipelines, executing them in parallel, merging results from multiple types of cameras, etc.

**Assignment:

----

e-mail: [[team/

chat:

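The kind of pipeline splitting and joining this topic asks for can be sketched as running independent branches in parallel and merging their annotations. The names are hypothetical; the actual extension would operate on RoboSherlock's UIMA-based C++ pipelines:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of splitting a perception pipeline into parallel branches and
# merging their results (hypothetical names; the real extension would
# operate on RoboSherlock's UIMA-based C++ pipelines).

def rgb_branch(frame):
    # e.g. color-based annotators running on the RGB camera
    return {"color": "red" if "red" in frame else "unknown"}

def depth_branch(frame):
    # e.g. geometry-based annotators running on the depth camera
    return {"size": "large" if "big" in frame else "small"}

def run_split(frame, branches):
    # Execute the branches in parallel, then join their annotations
    # into a single merged result.
    with ThreadPoolExecutor(max_workers=len(branches)) as pool:
        results = list(pool.map(lambda branch: branch(frame), branches))
    merged = {}
    for result in results:
        merged.update(result)
    return merged

out = run_split("a big red box", [rgb_branch, depth_branch])
assert out == {"color": "red", "size": "large"}
```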
==== Topic 3: Unreal - ROS 2 Integration ====

{{ :

Since [[https://

**Task Difficulty:

**Requirements:

**Expected Results:** We expect to have an integrated communication layer between ROS 2 and Unreal Engine on both Windows and Linux.

Contact: [[team/

Chat: [[https://

==== Topic 4: Unreal Editor User Interface Development ====

{{ :

For this topic we would like to extend the modules from RobCoG with intuitive Unreal Engine Editor panels. This would allow easier and faster manipulation/

**Task Difficulty:

**Requirements:

**Expected Results:** We expect to have intuitive Unreal Engine UI panels for editing and visualizing the data and features of the various RobCoG plugins.

Contact: [[team/

Chat: [[https://

==== Topic 5: Unreal - openEASE Live Connection ====

{{ :

For this topic we would like to create a live connection between openEASE and RobCoG. A user should be able to connect to openEASE from the Unreal Engine Editor and perform various queries, for example to verify that the items from the Unreal Engine world are present in the ontology of the robot. It should also be possible to upload new data directly from the editor.

**Task Difficulty:

**Requirements:

**Expected Results:** We expect to have a live connection between openEASE and the Unreal Engine editor.

Contact: [[team/

Chat: [[https://

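A query round-trip of this kind could look like the following editor-side sketch. The message format is purely illustrative; the actual openEASE/KnowRob interface defines its own query language and protocol:

```python
import json

# Illustrative sketch of an editor-side openEASE query round-trip
# (hypothetical message format; the real openEASE/KnowRob interface
# defines its own query language and protocol).

def make_query(entity, query_id=1):
    # Ask whether an Unreal world item is known to the robot's ontology.
    return json.dumps({"query": f"is_known('{entity}')", "id": query_id})

def handle_response(raw):
    # Extract the solution flag from a reply message.
    msg = json.loads(raw)
    return msg.get("solution", False)

request = make_query("Cup_01")
# A mocked reply for the item being present in the ontology:
reply = json.dumps({"id": 1, "solution": True})
assert handle_response(reply) is True
```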
==== Topic 6: CRAM -- Visualizing Robot'

{{ :

**Main Objective:

**Task Difficulty:


{{ :

**Requirements:
  * Familiarity with functional programming paradigms: some functional programming experience is a requirement (preferred language is Lisp, but Haskell, Scheme, OCaml, Clojure, Scala or similar will do);
  * Experience with ROS (Robot Operating System).

**Expected Results:** We expect operational and robust contributions to the source code of the existing robot control system, including documentation.

Contact: [[team/

==== Topic 7: Robot simulation in Unreal Engine with PhysX ====

{{ :

**Main Objective:

**Task Difficulty:
level, as it requires programming skills in various frameworks (Unreal Engine,
PhysX) and expertise in robotic simulation and physics engines.

**Requirements:
of the Unreal Engine and PhysX API. Experience in robotics and robotic simulation is a plus.

**Expected Results:** We expect to be able to simulate robots in Unreal, with support for and the ability to control standard joints.

Contact: [[team/