package in the Python package index ([[https://pypi.python.org/pypi/pracmln|PyPI]]).
  

===== RoboSherlock -- Framework for Cognitive Perception =====

RoboSherlock is a common framework for cognitive perception based on the principle of unstructured information management (UIM). UIM has proven to be a powerful paradigm for scaling intelligent information and question-answering systems towards real-world complexity (e.g. IBM's Watson system). Complexity in UIM is handled by identifying (or hypothesizing) pieces of structured information in unstructured documents, by applying ensembles of experts that annotate these information pieces, and by testing and integrating these isolated annotations into a comprehensive interpretation of the document.
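
To make the hypothesize-annotate-integrate idea more tangible, here is a small, self-contained C++ sketch. The ''Hypothesis'', ''Annotation'' and ''Expert'' types and the confidence-based integration are purely illustrative assumptions, not the actual UIMA or RoboSherlock API; the point is only to show several experts contributing opinions about the same hypothesis before one interpretation is selected.

<code cpp>
#include <algorithm>
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// One expert's opinion about a hypothesis.
struct Annotation {
  std::string expert;
  std::string label;
  double confidence;
};

// A piece of structured information hypothesized in the raw sensor data.
struct Hypothesis {
  int region_id;
  std::vector<Annotation> annotations;
};

using Expert = std::function<Annotation(const Hypothesis &)>;

// Let every expert annotate every hypothesis, then keep the annotation the
// ensemble is most confident about as the integrated interpretation.
std::vector<Annotation> integrate(std::vector<Hypothesis> &hypotheses,
                                  const std::vector<Expert> &experts) {
  std::vector<Annotation> result;
  for (auto &h : hypotheses) {
    for (const auto &expert : experts)
      h.annotations.push_back(expert(h));
    if (h.annotations.empty())
      continue;
    result.push_back(*std::max_element(
        h.annotations.begin(), h.annotations.end(),
        [](const Annotation &a, const Annotation &b) {
          return a.confidence < b.confidence;
        }));
  }
  return result;
}

int main() {
  std::vector<Hypothesis> hypotheses{{0, {}}, {1, {}}};
  std::vector<Expert> experts{
      [](const Hypothesis &) { return Annotation{"color", "red object", 0.7}; },
      [](const Hypothesis &) { return Annotation{"shape", "cylinder", 0.9}; }};
  for (const auto &a : integrate(hypotheses, experts))
    std::cout << a.expert << " -> " << a.label << " (" << a.confidence << ")\n";
}
</code>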

RoboSherlock builds on top of the ROS ecosystem; it can wrap almost any existing perception algorithm or framework and allows the results of these components to be combined easily and coherently. The framework is closely integrated with two of the most popular libraries used in robotic perception, OpenCV and PCL. More details about RoboSherlock can be found on the project [[http://robosherlock.org/|webpage]].
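
As an illustration of what wrapping an existing perception algorithm can look like, the following hypothetical expert turns a few plain OpenCV calls into region annotations. The class name, the ''RegionAnnotation'' type and the fixed Canny thresholds are assumptions made for this example; real RoboSherlock annotators are implemented within the UIMA framework mentioned above and use the framework's own result types.

<code cpp>
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

#include <iostream>
#include <vector>

// What this hypothetical expert reports back to the shared result store.
struct RegionAnnotation {
  cv::Rect roi;        // image region the hypothesis refers to
  double confidence;   // how sure this expert is about it
};

// Wraps plain OpenCV calls (edge detection + contours) as one "expert".
class ContourExpert {
public:
  std::vector<RegionAnnotation> process(const cv::Mat &bgr) const {
    cv::Mat gray, edges;
    cv::cvtColor(bgr, gray, cv::COLOR_BGR2GRAY);
    cv::Canny(gray, edges, 50.0, 150.0);   // fixed thresholds, for illustration only

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(edges, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<RegionAnnotation> annotations;
    for (const auto &contour : contours)
      annotations.push_back({cv::boundingRect(contour), 0.5});
    return annotations;
  }
};

int main() {
  // A synthetic image with one bright rectangle, just to exercise the expert.
  cv::Mat image(480, 640, CV_8UC3, cv::Scalar(30, 30, 30));
  cv::rectangle(image, {200, 150}, {400, 330}, cv::Scalar(255, 255, 255), cv::FILLED);

  ContourExpert expert;
  std::cout << expert.process(image).size() << " region(s) hypothesized\n";
}
</code>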

===== openEASE -- Web-based Robot Knowledge Service =====

openEASE is a generic knowledge database for collecting and analyzing experiment data. Its foundation is the KnowRob knowledge processing system and ROS, enhanced by reasoning mechanisms and a web interface developed for inspecting comprehensive experiment logs. Such logs can be recorded, for example, from complex CRAM plan executions, virtual-reality experiments, or human tracking systems. openEASE offers interfaces both for human researchers, who want to visually inspect what happened during a robot experiment, and for robots, which want to reason about previous task executions in order to improve their behavior.

The openEASE web interface, as well as further information and publication material, can be accessed through its publicly available [[http://www.open-ease.org/|website]]. It is meant to make complex experiment data available to research fields adjacent to robotics and to foster an intuition about robot experience data.
===== Proposed Topics =====
  
  
**Contact:** [[team/daniel_nyga|Daniel Nyga]]


==== Topic 2: Flexible perception pipeline manipulation for RoboSherlock ====

{{  :teaching:gsoc:topic1_rs.png?nolink&200|}}

**Main Objective:** RoboSherlock is based on the unstructured information management paradigm and uses the UIMA library at its core. The C++ implementation of this library is limited in multiple ways. In this topic you will develop a module for flexibly managing perception pipelines, extending the current implementation to enable new modalities and to run pipelines in parallel. This involves implementing an API for pipeline and data handling that is rooted in the domain of UIMA.

**Task Difficulty:** The task is considered to be of medium difficulty.

**Requirements:** Good programming skills in C++ and basic knowledge of CMake and ROS. Experience with PCL and OpenCV is preferred.

**Expected Results:** An extension to RoboSherlock that allows splitting and joining pipelines, executing them in parallel, merging results from multiple types of cameras, etc.
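
A minimal sketch of what splitting a pipeline into parallel branches and merging their results could look like is given below. The ''Pipeline'' and ''runParallel'' names, the string-based stand-in for annotation data, and the RGB/depth branch split are assumptions made for illustration, not the interface that is expected to be developed.

<code cpp>
#include <future>
#include <iostream>
#include <string>
#include <vector>

// Stand-in for the real annotation data a pipeline would produce (e.g. a CAS).
using Annotations = std::vector<std::string>;

struct Pipeline {
  std::vector<std::string> annotators;   // executed in the given order

  Annotations run() const {
    Annotations out;
    for (const auto &name : annotators)
      out.push_back(name + ": done");    // placeholder for actually running the annotator
    return out;
  }
};

// Split the work into two branches, run them concurrently and join the results.
Annotations runParallel(const Pipeline &rgb_branch, const Pipeline &depth_branch) {
  auto rgb_future = std::async(std::launch::async, [&] { return rgb_branch.run(); });
  auto depth_future = std::async(std::launch::async, [&] { return depth_branch.run(); });

  Annotations merged = rgb_future.get();
  Annotations depth_results = depth_future.get();
  merged.insert(merged.end(), depth_results.begin(), depth_results.end());
  return merged;
}

int main() {
  Pipeline rgb{{"ImagePreprocessor", "ColorClusterer"}};
  Pipeline depth{{"CloudFilter", "PlaneSegmenter"}};
  for (const auto &annotation : runParallel(rgb, depth))
    std::cout << annotation << "\n";     // merged results from both camera branches
}
</code>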

**Contact:** [[team/ferenc_balint-benczedi|Ferenc Bálint-Benczédi]]

  



