~~NOTOC~~

=====Open researcher positions=====

=====Theses and Student Jobs=====
If you are looking for a bachelor/master thesis or a job as a student research assistant, you may find some interesting opportunities on this page.
  
  
  
== Visualization support assistant (HiWi) ==

Implementing a visualization web page for a project, using a custom Python/JavaScript framework. Start date: October 2019.
  
Requirements:
  * Good Python programming skills
  * Familiar with JavaScript
  * Experience in web development is recommended
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision
  
Contact: [[team:mareike_picklum|Mareike Picklum]]
  
== Knowledge-enabled PID Controller for 3D Hand Movements in Virtual Environments (BA/MA Thesis) ==

Implementing a force-, velocity- and impulse-based PID controller for precise and responsive hand movements in a virtual environment. The virtual environment is developed in Unreal Engine, in combination with Virtual Reality devices. The movements of the human user will be mapped to the virtual hands and controlled via the implemented PID controllers. The controller should be able to tune itself dynamically depending on the executed action (opening/closing a drawer or lifting a heavy object) and on the physical limitations of the physics engine (dynamic update rates).
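For orientation, a minimal sketch of the per-axis control loop such a controller is built around (plain, self-contained C++ with illustrative gains and a toy 1-D mass standing in for the Unreal physics body; the force-, velocity- and impulse-based variants and the self-tuning logic are the actual thesis work):

<code cpp>
#include <algorithm>
#include <cstdio>

// Minimal per-axis PID controller: turns the position error between the
// tracked (real) hand and the virtual hand into a force command.
// Gains are illustrative placeholders, not tuned values.
struct PidController {
    double kp = 100.0, ki = 1.0, kd = 10.0;  // proportional/integral/derivative gains
    double maxOutput = 500.0;                // clamp so the physics engine is not overdriven
    double integral = 0.0, prevError = 0.0;

    // dt is the (possibly varying) physics time step, so the engine's
    // dynamic update rates are handled explicitly.
    double update(double error, double dt) {
        integral += error * dt;
        const double derivative = dt > 0.0 ? (error - prevError) / dt : 0.0;
        prevError = error;
        return std::clamp(kp * error + ki * integral + kd * derivative,
                          -maxOutput, maxOutput);
    }
};

int main() {
    PidController pid;
    double pos = 0.0, vel = 0.0;
    const double target = 1.0;               // desired hand position on one axis
    for (int step = 0; step < 5; ++step) {
        const double dt = 0.01;              // stand-in for the engine's frame time
        const double force = pid.update(target - pos, dt);
        vel += force * dt;                   // toy 1-D body with unit mass
        pos += vel * dt;
        std::printf("step %d: force=%.1f pos=%.4f\n", step, force, pos);
    }
    return 0;
}
</code>

Passing the physics time step into every update is what lets the same controller cope with the engine's dynamic update rates.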
  
Requirements:
  * Good C++ programming skills
  * Familiar with PID controllers and control theory
  * Experience with simulators/game engines is recommended
  * Experience with Unreal Engine
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision
  
Contact: [[team:andrei_haidu|Andrei Haidu]]
  
<html><!--
== Lisp CRAM support assistant (HiWi) ==
  
Technical support for the group for Lisp and the CRAM framework\\
8+ hours per week for up to 1 year (paid).
  
Requirements:
  * Good programming skills in Common Lisp
  * Basic ROS knowledge
  
At the beginning of the job, the student will be introduced to the CRAM framework, a robot programming framework written in Lisp. The student will then be responsible for assisting people who are not familiar with the framework, explaining the parts they do not understand and pointing them to the relevant documentation sources.
  
Contact: [[team:gayane_kazhoyan|Gayane Kazhoyan]]
--></html>
  
<html><!--
== Mesh Editing / Mesh Segmentation/Cutting (Student Job / HiWi) ==
 {{ :research:human_hand_cutting.png?150|}}
  
Editing and cutting a human mesh into different parts in Blender / Maya (or other).
  
Requirements:
  * Good knowledge in 3D Modeling
  * Familiar with Blender / Maya (or other)
  
Contact: [[team:mona_abdel-keream|Mona Abdel-Keream]]
--></html>
  
<html><!--
== 3D Model / Material / Lighting Developer (Student Job / HiWi) ==
 {{ :research:kitchen_unreal.jpg?200|}}
  
Developing and improving existing 3D models in Blender / Maya (or other). Importing the models into Unreal Engine, where the materials and lighting should be improved to be as close as possible to realism.

Bonus: Working with state-of-the-art 3D scanners [[https://www.goscan3d.com/|Go!SCAN]] for creating new realistic models.
  
Requirements:
  * Experience with Blender / Maya (or other)
  * Knowledge of Unreal Engine material / lighting development
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision
  


Contact: [[team:andrei_haidu|Andrei Haidu]]
--></html>
  
<html><!--
== Integrating PR2 in the Unreal Game Engine Framework (BA/MA/HiWi) ==
 {{ :research:unreal_ros_pr2.png?100|}}
  
Integrating the [[https://www.willowgarage.com/pages/pr2/overview|PR2]] robot with [[http://www.ros.org/|ROS]] support in the [[https://www.unrealengine.com|Unreal Engine 4]] framework.
  
Requirements:
  * Good programming skills in C/C++
  * Basic physics/rendering engine knowledge
  * Basic ROS knowledge
  * UE4 basic tutorials
  
Contact: [[team:andrei_haidu|Andrei Haidu]]
  

== Realistic Grasping using Unreal Engine (BA/MA/HiWi) ==
  
{{  :teaching:gsoc:topic2_unreal.png?nolink&150|}}
  
The objective of the project is to implement various human-like grasping approaches in a game developed using [[https://www.unrealengine.com/|Unreal Engine]].

The game consists of a household environment where a user has to execute various given tasks, such as cooking a dish, setting the table, cleaning the dishes etc. The interaction is done using various sensors to map the user's hands onto the virtual hands in the game.

In order to improve the ease of manipulating objects, the user should be able to switch at runtime the type of grasp (pinch, power grasp, precision grip etc.) he/she would like to use.
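A rough sketch of what such runtime grasp switching could look like on the game-logic side (plain, self-contained C++ with made-up grasp names and closure values; the actual hand animation and input handling would go through the Unreal Engine API):

<code cpp>
#include <array>
#include <cstdio>

// Hypothetical grasp types the user can cycle through at runtime,
// e.g. on a controller button press. Names and closure values are made up.
enum class GraspType { Pinch, PowerGrasp, PrecisionGrip, Count };

struct GraspParams {
    const char* name;
    double thumbClosure;   // 0 = open, 1 = fully closed
    double fingerClosure;
};

const std::array<GraspParams, 3> kGraspTable{{
    {"pinch",          0.8, 0.2},
    {"power grasp",    1.0, 1.0},
    {"precision grip", 0.6, 0.5},
}};

GraspType NextGrasp(GraspType current) {
    return static_cast<GraspType>(
        (static_cast<int>(current) + 1) % static_cast<int>(GraspType::Count));
}

int main() {
    GraspType grasp = GraspType::Pinch;
    for (int press = 0; press < 3; ++press) {  // simulate three button presses
        grasp = NextGrasp(grasp);
        const GraspParams& p = kGraspTable[static_cast<int>(grasp)];
        std::printf("switched to %s (thumb %.1f, fingers %.1f)\n",
                    p.name, p.thumbClosure, p.fingerClosure);
    }
    return 0;
}
</code>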

Requirements:
  * Good programming skills in C++
  * Good knowledge of the Unreal Engine API
  * Experience with skeletal control / animations / 3D models in Unreal Engine
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision
  


Contact: [[team:andrei_haidu|Andrei Haidu]]
--></html>
  
<html><!--
== Unreal Engine Editor Developer (Student Job / HiWi) ==
 {{ :research:unreal_editor.png?150|}}
  
Creating new user interfaces (panel customization) for various internal plugins using the Unreal C++ framework [[https://docs.unrealengine.com/latest/INT/Programming/Slate/|SLATE]].
  
Requirements:
  * Good C++ programming skills
  * Familiar with the [[https://docs.unrealengine.com/latest/INT/Programming/Slate/|SLATE]] framework
  * Familiar with the Unreal Engine API
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision
  
Contact: [[team:andrei_haidu|Andrei Haidu]]
  
  
== OpenEASE rendering in Unreal Engine (BA/MA Thesis, Student Job / HiWi) ==
  
Implementing the rendering of the [[https://data.open-ease.org/|OpenEASE]] knowledge base visualization in Unreal Engine. This will have to be packaged as an HTML5 executable and inserted into the website.
  
Requirements:
  * Good C++ programming skills
  * Familiar with the Unreal Engine API
  * Familiar with HTML5 and JavaScript
  * Familiar with the [[https://docs.unrealengine.com/latest/INT/Programming/Slate/|SLATE]] framework
  * Familiar with basic ROS communication
  * Familiar with version-control systems (git)
  * Able to work independently with minimal supervision

Contact: [[team:andrei_haidu|Andrei Haidu]]
--></html>



