If you are looking for a bachelor/master thesis or a job as a student research assistant, you may find some interesting opportunities on this page.
  
  
  
  
== Kitchen Activity Games in a Realistic Robotic Simulator (BA/MA/HiWi) ==

{{ :research:gz_env1.png?200|}}

The project consists of developing new activities and improving the current simulation framework built on top of the [[http://gazebosim.org/|Gazebo]] robotic simulator, as well as creating a custom GUI for the game in order to launch new scenarios, save logs, etc.

Requirements:
  * Good programming skills in C/C++
  * Basic physics/rendering engine knowledge
  * Gazebo simulator basic tutorials

Contact: [[team:andrei_haidu|Andrei Haidu]]
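The concrete structure of the game framework is not described here; purely as an illustration of the kind of C/C++ development involved, the following is a minimal Gazebo world plugin skeleton that hooks a callback into every simulation step, where game logic, scenario handling or logging could be attached. Class and file names are placeholders, not part of the existing framework.

<code cpp>
// kitchen_game_plugin.cc -- illustrative skeleton only; names are placeholders.
#include <functional>
#include <gazebo/gazebo.hh>
#include <gazebo/physics/physics.hh>
#include <gazebo/common/common.hh>

namespace gazebo
{
  // A world plugin is loaded once per world and can react to every simulation step.
  class KitchenGamePlugin : public WorldPlugin
  {
    public: void Load(physics::WorldPtr _world, sdf::ElementPtr /*_sdf*/) override
    {
      this->world = _world;
      // Run OnUpdate() before every physics update step.
      this->updateConn = event::Events::ConnectWorldUpdateBegin(
          std::bind(&KitchenGamePlugin::OnUpdate, this, std::placeholders::_1));
      gzmsg << "KitchenGamePlugin loaded\n";
    }

    private: void OnUpdate(const common::UpdateInfo &/*_info*/)
    {
      // Game logic would go here, e.g. checking object poses or logging events.
    }

    private: physics::WorldPtr world;
    private: event::ConnectionPtr updateConn;
  };

  GZ_REGISTER_WORLD_PLUGIN(KitchenGamePlugin)
}
</code>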

== Integrating Eye Tracking in the Kitchen Activity Games (BA/MA) ==

{{ :research:eye_tracker.png?200|}}

The task is to integrate an eye tracker into the [[http://gazebosim.org/|Gazebo]]-based Kitchen Activity Games framework and to log the gaze of the user during gameplay. From this information, typical activities should be inferred.

Requirements:
  * Good programming skills in C/C++
  * Gazebo simulator basic tutorials

Contact: [[team:andrei_haidu|Andrei Haidu]]
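The choice of eye-tracker hardware and its API is left open in the posting; purely as a sketch of the logging side, time-stamped gaze samples could be recorded next to the game log for the later activity inference, for example as follows (all type and field names are assumptions):

<code cpp>
// gaze_log.h -- illustrative only; the actual eye-tracker API is not specified here.
#include <fstream>
#include <string>

// Hypothetical gaze sample as it might be delivered by an eye-tracker SDK.
struct GazeSample
{
  double simTime;          // simulation time [s] of the sample
  double screenX, screenY; // normalized gaze position on screen, 0..1
  std::string gazedObject; // simulated object hit by the gaze ray, if any
};

// Append one sample to a CSV log that can later be correlated with the game events.
inline void LogGazeSample(std::ofstream &out, const GazeSample &s)
{
  out << s.simTime << ',' << s.screenX << ',' << s.screenY << ',' << s.gazedObject << '\n';
}
</code>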

== Hand Skeleton Tracking Using Two Leap Motion Devices (BA/MA) ==

{{ :research:leap_motion.jpg?200|}}

The goal is to improve the skeleton tracking offered by the [[https://developer.leapmotion.com/|Leap Motion SDK]] by using two devices (one tracking vertically, the other horizontally) and switching between them to whichever currently has the best view of the hand.

The tracked hand can then be used as input for the Kitchen Activity Games framework.

Requirements:
  * Good programming skills in C/C++

Contact: [[team:andrei_haidu|Andrei Haidu]]
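How the two devices are accessed is SDK-specific and not covered here; as a sketch of the switching logic itself, the snippet below picks between two per-device hand estimates based on a tracking-quality score, with a small hysteresis margin so the selection does not flicker. The types and the confidence field are assumptions, not the Leap Motion SDK API.

<code cpp>
// hand_source_selection.h -- sketch of switching between two tracking sources;
// types and scores are placeholders, not the Leap Motion SDK API.

// Hypothetical per-device hand estimate: pose data plus a tracking-quality score.
struct HandEstimate
{
  bool   valid;       // device currently sees a hand
  double confidence;  // 0..1 tracking quality for this device's view
  // joint positions, palm pose, etc. would go here
};

// Pick the estimate from whichever device currently has the better view of the hand.
// The hysteresis margin avoids rapid switching when both scores are similar.
inline const HandEstimate &SelectHand(const HandEstimate &vertical,
                                      const HandEstimate &horizontal,
                                      bool &useVertical,
                                      double hysteresis = 0.1)
{
  if (!vertical.valid)
    useVertical = false;
  else if (!horizontal.valid)
    useVertical = true;
  else if (useVertical && horizontal.confidence > vertical.confidence + hysteresis)
    useVertical = false;
  else if (!useVertical && vertical.confidence > horizontal.confidence + hysteresis)
    useVertical = true;
  return useVertical ? vertical : horizontal;
}
</code>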

== Fluid Simulation in Gazebo (BA/MA) ==

{{ :research:fluid.png?200|}}

[[http://gazebosim.org/|Gazebo]] currently only supports rigid-body physics engines (ODE, Bullet, etc.); in some cases, however, fluids are needed in order to simulate the given environment as realistically as possible.

There is currently an [[http://gazebosim.org/tutorials?tut=fluids&cat=physics|experimental version]] of fluids in Gazebo, which uses the [[http://onezero.ca/fluidix/|Fluidix]] library to run the fluid computation on the GPU.

The computational method used for the fluid simulation is SPH (Smoothed Particle Hydrodynamics); newer and better methods based on SPH exist (PCISPH/IISPH) and should be implemented.

The interaction between the fluid and rigid objects is currently naive: forces and torques are applied only from particle collisions, without taking pressure and other forces into account.

Another topic is the visualization of the fluid, which is currently done by rendering every particle; [[http://www.ogre3d.org/|OGRE]] is used as the rendering engine.

Here is a [[https://vimeo.com/104629835|video]] showing the current state of the fluids in Gazebo.

Requirements:
  * Good programming skills in C/C++
  * Interest in fluid simulation
  * Basic physics/rendering engine knowledge
  * Gazebo simulator and Fluidix basic tutorials

Contact: [[team:andrei_haidu|Andrei Haidu]]
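As background on the SPH family of methods mentioned above: all of them start from a particle density summation over a smoothing kernel, on top of which pressure forces (and, for PCISPH/IISPH, the iterative pressure solve) are built. The following is a generic, CPU-side sketch of that density step using the common poly6 kernel; it is not the Fluidix-based GPU implementation used in the Gazebo fluids plugin.

<code cpp>
// sph_density.cc -- generic SPH density summation with the poly6 kernel;
// not the Fluidix/GPU code used by the Gazebo fluids plugin.
#include <cmath>
#include <vector>

struct Particle
{
  double x, y, z;   // position [m]
  double mass;      // particle mass [kg]
  double density;   // computed density [kg/m^3]
};

// rho_i = sum_j m_j * W(|r_i - r_j|, h), with the poly6 smoothing kernel
// W(r, h) = 315 / (64 * pi * h^9) * (h^2 - r^2)^3 for 0 <= r <= h.
void ComputeDensities(std::vector<Particle> &particles, double h)
{
  const double h2 = h * h;
  const double poly6 = 315.0 / (64.0 * M_PI * std::pow(h, 9));

  for (auto &pi : particles)
  {
    pi.density = 0.0;
    for (const auto &pj : particles)   // O(n^2); real solvers use a neighbor grid
    {
      const double dx = pi.x - pj.x, dy = pi.y - pj.y, dz = pi.z - pj.z;
      const double r2 = dx * dx + dy * dy + dz * dz;
      if (r2 < h2)
      {
        const double diff = h2 - r2;
        pi.density += pj.mass * poly6 * diff * diff * diff;
      }
    }
  }
}
</code>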

== Automated sensor calibration toolkit (MA) ==

Computer vision is an important part of autonomous robots. For robots, the image sensors are the main source of information about the surrounding world. Each camera is different, even if they come from the same production line. For computer vision, especially for robots manipulating their environment, it is important that the parameters of the cameras in use are well known. Calibrating a camera is a time-consuming task, and the result depends strongly on the chosen setup and the accuracy of the operator.

The topic of this master's thesis is to develop an automated system for calibrating cameras, especially RGB-D cameras like the Kinect v2.

The system should:
  * be independent of the camera type
  * estimate intrinsic and extrinsic parameters
  * perform depth calibration (for RGB-D cameras)
  * integrate capabilities from Halcon [1]

Requirements:
  * Good programming skills in Python and C/C++
  * ROS, OpenCV

[1] http://www.halcon.de/

Contact: [[team:thiemo_wiedemeyer|Thiemo Wiedemeyer]]
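As a rough illustration of the intrinsic-calibration step such a toolkit would automate, the sketch below detects chessboard corners in a set of images and estimates the camera matrix and distortion coefficients with OpenCV. The board geometry and the image path are placeholders; depth calibration for RGB-D sensors and the Halcon integration are not covered here.

<code cpp>
// intrinsic_calibration.cc -- minimal OpenCV intrinsic calibration sketch;
// board size, square size and image path are placeholders.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <string>
#include <vector>

int main()
{
  const cv::Size boardSize(9, 6);    // inner chessboard corners (placeholder)
  const float squareSize = 0.025f;   // square edge length in meters (placeholder)

  // Nominal 3D corner positions of the board in its own coordinate frame.
  std::vector<cv::Point3f> boardCorners;
  for (int y = 0; y < boardSize.height; ++y)
    for (int x = 0; x < boardSize.width; ++x)
      boardCorners.emplace_back(x * squareSize, y * squareSize, 0.0f);

  std::vector<std::vector<cv::Point3f>> objectPoints;
  std::vector<std::vector<cv::Point2f>> imagePoints;
  cv::Size imageSize;

  // Collect chessboard detections from a set of calibration images.
  std::vector<cv::String> files;
  cv::glob("calib_images/*.png", files);   // placeholder path
  for (const auto &f : files)
  {
    cv::Mat img = cv::imread(f, cv::IMREAD_GRAYSCALE);
    if (img.empty()) continue;
    imageSize = img.size();

    std::vector<cv::Point2f> corners;
    if (cv::findChessboardCorners(img, boardSize, corners))
    {
      cv::cornerSubPix(img, corners, cv::Size(11, 11), cv::Size(-1, -1),
          cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 30, 0.001));
      imagePoints.push_back(corners);
      objectPoints.push_back(boardCorners);
    }
  }

  // Estimate intrinsics (camera matrix, distortion coefficients) from all detections.
  cv::Mat cameraMatrix, distCoeffs;
  std::vector<cv::Mat> rvecs, tvecs;
  double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                   cameraMatrix, distCoeffs, rvecs, tvecs);

  std::cout << "RMS reprojection error: " << rms << "\n"
            << "Camera matrix:\n" << cameraMatrix << "\n"
            << "Distortion coefficients:\n" << distCoeffs << std::endl;
  return 0;
}
</code>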



