Differences
This shows you the differences between two versions of the page.
jobs [2015/10/15 09:22] – [Theses and Jobs] froggy86 → jobs [2019/09/18 14:42] – [Open researcher positions] nyga
~~NOTOC~~
- =====Theses and Jobs=====
+ =====Open researcher positions=====
+
+ =====Theses and Student Jobs=====
If you are looking for a bachelor/master thesis or a student job, you may find some interesting offers on this page.
- == Kitchen Activity Games in a Realistic Robotic Simulator (BA/MA) ==
- Developing new activities and improving the current simulation […]
- Requirements:
-   * Good programming skills in C/C++
-   * Basic physics/rendering engine knowledge
-   * Gazebo simulator basic tutorials
- Contact: [[team:andrei_haidu|Andrei Haidu]]

+ == Visualization support assistant (HiWi) ==
+ Implementing a visualization web page for a project using a custom […]
+ Requirements:
+   * Good Python skills
+   * Familiar with JavaScript
+   * Experience […]
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team:mareike_picklum|Mareike Picklum]]
- == Integrating Eye Tracking in the Kitchen Activity Games (BA/MA) ==
- Integrating the eye tracker […]
- Requirements:
-   * Good programming skills in C/C++
-   * Gazebo simulator basic tutorials
- Contact: [[team:…]]

+ == Knowledge-enabled PID Controller for 3D Hand Movements ==
+ Implementing a force-, velocity- and impulse-based PID controller for precise […]. The hand movements of the human user will be mapped to the virtual hands and controlled via the implemented PID controllers. The controller […]
+ Requirements:
+   * Good C++ programming skills
+   * Familiar with PID controllers and control theory
+   * Experience with simulators/[…]
+   * Experience with Unreal Engine
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team:…]]
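For illustration of the PID topic above: a minimal discrete PID update of the kind such a controller builds on might look like the following C++ sketch. The gains, the single-axis target, and the toy plant are invented placeholders, not project code.

<code cpp>
// Minimal discrete PID step: output = Kp*e + Ki*integral(e) + Kd*de/dt.
// All values here are illustrative placeholders.
struct Pid {
  double kp, ki, kd;       // proportional, integral, derivative gains
  double integral = 0.0;   // accumulated error
  double prevError = 0.0;  // error from the previous step

  double update(double error, double dt) {
    integral += error * dt;
    const double derivative = (error - prevError) / dt;
    prevError = error;
    return kp * error + ki * integral + kd * derivative;
  }
};

int main() {
  Pid pid{2.0, 0.1, 0.05};            // made-up gains
  double pos = 0.0;                   // one axis of a virtual hand
  const double target = 1.0, dt = 0.01;
  for (int i = 0; i < 1000; ++i) {
    const double force = pid.update(target - pos, dt);
    pos += force * dt;                // toy plant, stands in for the physics engine
  }
  return 0;
}
</code>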
- == Hand Skeleton Tracking Using Two Leap Motion Devices (BA/MA) ==
- Improving […]
- The tracked hand can then be used as input for the Kitchen Activity Games framework.
- Requirements:
-   * Good programming skills in C/C++
- Contact: [[team:…]]

+ <html><!--
+ == Lisp / CRAM support assistant (HiWi) ==
+ Technical support for the group for Lisp and the CRAM framework. \\
+ 8+ hours per week for up to 1 year (paid).
+ Requirements:
+   * Good programming skills in Common Lisp
+   * Basic ROS knowledge
+ The student will be introduced to the CRAM framework, a robot programming framework written in Lisp, at the beginning of the job. The student will then be responsible for assisting people who are not familiar with the framework, explaining the parts they don't understand and pointing them to the relevant documentation sources.
+ Contact: [[team:gayane_kazhoyan|Gayane Kazhoyan]]
+ --></html>
- == Fluid Simulation in Gazebo (BA/MA) ==
- [[http://gazebosim.org/|Gazebo]] currently only supports rigid-body physics engines (ODE, Bullet etc.); however, in some cases fluids are preferred in order to simulate the given environment as realistically as possible.
- Currently there is an [[http://…]] […]
- The computational method for the fluid simulation is SPH (smoothed-particle hydrodynamics); however, newer and better methods based on SPH are currently available and should be implemented […]
- The interaction between the fluid and the rigid objects is a naive one: the forces and torques are applied only from the particle collisions (not taking into account pressure and other forces).
- Another topic would be the visualization of the fluid, which is currently done by rendering every particle. For the rendering engine, [[http://www.ogre3d.org/|OGRE]] is used.
- Here is a [[https://vimeo.com/104629835|video]] example of the current state of the fluid in Gazebo.
- Requirements:
-   * Good programming skills in C/C++
-   * Interest in fluid simulation
-   * Basic physics/rendering engine knowledge
-   * Gazebo simulator and Fluidix basic tutorials
- Contact: [[team:…]]
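For illustration of the SPH method named in the fluid-simulation topic above: SPH estimates field quantities as kernel-weighted sums over neighbouring particles. The C++ sketch below computes per-particle density with the common poly6 kernel; the particle layout, the shared mass, and the naive O(n^2) neighbour loop are simplifying assumptions, not the Gazebo/Fluidix plugin code.

<code cpp>
#include <vector>

struct Particle { double x, y, z, density; };

// Poly6 smoothing kernel: W(r,h) = 315/(64*pi*h^9) * (h^2 - r^2)^3 for r < h.
double poly6(double r2, double h) {
  const double kPi = 3.14159265358979323846;
  const double h2 = h * h;
  if (r2 >= h2) return 0.0;
  const double h9 = h2 * h2 * h2 * h2 * h;
  const double c = 315.0 / (64.0 * kPi * h9);
  const double d = h2 - r2;
  return c * d * d * d;
}

// Density summation over all particle pairs; real implementations replace
// this naive O(n^2) loop with spatial hashing or a neighbour grid.
void computeDensities(std::vector<Particle>& ps, double h, double mass) {
  for (auto& pi : ps) {
    pi.density = 0.0;
    for (const auto& pj : ps) {
      const double dx = pi.x - pj.x, dy = pi.y - pj.y, dz = pi.z - pj.z;
      pi.density += mass * poly6(dx * dx + dy * dy + dz * dz, h);
    }
  }
}
</code>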
+ <html><!--
+ == Mesh Editing […] ==
+ Editing and cutting a human mesh into different parts in Blender […]
+ Requirements:
+   * Good knowledge of 3D modeling
+   * Familiar with Blender / Maya (or other)
+ Contact: [[team/…]]
+ --></html>

+ <html><!--
+ == 3D Model / Material […] ==
+ Developing and improving existing 3D models in Blender / Maya (or other). Importing the models in Unreal Engine, where the materials and lighting should be improved to be as close to realism as possible.
+ Bonus: Working with state-of-the-art 3D scanners
+ Requirements:
+   * Experience with Blender
+   * Knowledge of Unreal Engine material / lighting development
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team:…]]
+ --></html>
- == Automated sensor calibration toolkit (BA/MA) ==
- Computer vision is an important part of autonomous robots. For robots, the image sensors are the main source of information about the surrounding world. Each camera is different, even if they come from the same production line. For computer vision, especially for robots manipulating their environment, […]
- The topic for this thesis is to develop an automated system for calibrating cameras, especially RGB-D cameras like the Kinect v2.
- The system should:
-   * be independent of the camera type
-   * estimate intrinsic and extrinsic parameters
-   * calibrate depth images (case of RGB-D)
-   * integrate capabilities from Halcon [1]
-   * operate autonomously
- Requirements:
-   * Good programming skills in Python and C/C++
-   * ROS, OpenCV
- [1] http://…
- Contact: [[team:…]]
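For illustration of the intrinsic/extrinsic estimation step listed in the calibration topic above: this is commonly done with OpenCV's chessboard workflow, as in the C++ sketch below. The board size, file names and view count are hypothetical; the depth-image calibration and Halcon integration from the posting are not covered here.

<code cpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <string>
#include <vector>

int main() {
  const cv::Size pattern(9, 6);  // inner chessboard corners (assumed board)
  std::vector<std::vector<cv::Point3f>> objectPoints;
  std::vector<std::vector<cv::Point2f>> imagePoints;

  // One reference grid of 3D corner positions (z = 0 plane, unit squares).
  std::vector<cv::Point3f> grid;
  for (int y = 0; y < pattern.height; ++y)
    for (int x = 0; x < pattern.width; ++x)
      grid.emplace_back(float(x), float(y), 0.0f);

  cv::Size imageSize;
  for (int i = 0; i < 20; ++i) {  // hypothetical set of calibration views
    cv::Mat img = cv::imread("view" + std::to_string(i) + ".png",
                             cv::IMREAD_GRAYSCALE);
    if (img.empty()) continue;
    imageSize = img.size();
    std::vector<cv::Point2f> corners;
    if (cv::findChessboardCorners(img, pattern, corners)) {
      imagePoints.push_back(corners);
      objectPoints.push_back(grid);
    }
  }

  cv::Mat K, dist;                    // intrinsics and distortion coefficients
  std::vector<cv::Mat> rvecs, tvecs;  // per-view extrinsics
  // Returns the RMS reprojection error in pixels (needs at least one view).
  double rms = cv::calibrateCamera(objectPoints, imagePoints, imageSize,
                                   K, dist, rvecs, tvecs);
  (void)rms;
  return 0;
}
</code>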
- == On-the-fly 3D CAD model creation (MA) ==
- Create models during runtime for unknown textured objects based on depth and color information. Track the object and update the model with more detailed information, […]
- Requirements:
-   * Good programming skills in C/C++
-   * Strong background in […]
-   * ROS, OpenCV, PCL
- Contact: [[team:…]]

+ <html><!--
+ == Integrating PR2 in the Unreal Game Engine Framework (BA/MA) ==
+ Integrating […]
+ Requirements:
+   * Good programming skills in C/C++
+   * Basic physics/rendering engine knowledge
+   * Basic ROS knowledge
+   * UE4 basic tutorials
+ Contact: [[team:…]]
+ --></html>

+ <html><!--
+ == Realistic Grasping using Unreal Engine (BA/MA) ==
+ The objective of the project is to implement various human-like grasping approaches in a game developed using [[https://…|Unreal Engine]].
+ The game consists of a household environment where a user has to execute various given tasks, such as cooking a dish, setting the table, or cleaning the dishes. The interaction is done using various sensors to map the user's hands onto the virtual hands in the game.
+ In order to improve the ease of manipulating objects, the user should be able to switch at runtime the type of grasp (pinch, power grasp, precision grip etc.) he/she would like to use.
+ Requirements:
+   * Good programming skills in C++
+   * Good knowledge of the Unreal Engine API
+   * Experience with skeletal control / animations / 3D models
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team/…]]
+ --></html>
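For illustration of the runtime grasp switching described in the grasping topic above: the idea can be reduced to selecting a set of per-finger closure targets per grasp type. The plain C++ sketch below shows that reduction; the grasp set and closure values are invented for the example, and this is not Unreal Engine API code.

<code cpp>
#include <array>

// Grasp types named in the posting.
enum class GraspType { Pinch, PowerGrasp, PrecisionGrip };

// Hypothetical per-finger closure targets (0 = open, 1 = fully closed),
// ordered thumb, index, middle, ring, pinky.
std::array<float, 5> targetClosure(GraspType g) {
  switch (g) {
    case GraspType::Pinch:         return {1.0f, 1.0f, 0.0f, 0.0f, 0.0f};
    case GraspType::PowerGrasp:    return {1.0f, 1.0f, 1.0f, 1.0f, 1.0f};
    case GraspType::PrecisionGrip: return {0.8f, 0.8f, 0.8f, 0.0f, 0.0f};
  }
  return {};
}

// Cycling to the next grasp at runtime, e.g. bound to a button press.
GraspType nextGrasp(GraspType g) {
  switch (g) {
    case GraspType::Pinch:         return GraspType::PowerGrasp;
    case GraspType::PowerGrasp:    return GraspType::PrecisionGrip;
    case GraspType::PrecisionGrip: return GraspType::Pinch;
  }
  return GraspType::Pinch;
}
</code>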
- == Simulation of a robot's belief state to support perception (MA) ==
- Create a simulation environment that represents the robot's current belief state and can be updated frequently. Use off-screen rendering to investigate the affordances these objects possess, in order to support segmentation, […]
- Requirements:
-   * Good programming skills in C/C++
-   * Strong background in computer vision
-   * Gazebo, OpenCV, PCL
- Contact: [[team:…]]
- == Multi-expert segmentation of cluttered and occluded scenes ==
- Objects in a human environment are usually found in challenging scenes. They can be stacked upon each other, touching or occluding, and can be found in drawers, cupboards, refrigerators and so on. In order to execute a task, a personal robot assistant needs to detect and recognize these objects. In this thesis, a multi-modal approach to interpreting cluttered scenes is going to be investigated, […]
- Requirements:
-   * Good programming skills in C/C++
-   * Strong background in 3D vision
-   * Basic knowledge of ROS, OpenCV, PCL
- Contact: [[team:…]]

+ <html><!--
+ == Unreal Engine Editor Developer (Student Job / HiWi) ==
+ Creating new user interfaces (panel customization) for various internal plugins using the Unreal […]
+ Requirements:
+   * Good C++ programming skills
+   * Familiar with the [[https://…]] framework
+   * Familiar with Unreal Engine API
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team:…]]
+ --></html>
- == Robot control systems […] ==
- The use of robots in underwater missions is a challenging task. The dynamic terrain and its changing conditions make it difficult for robots to perform tasks correctly. In order to accomplish tasks properly, the robot control routines have to be coordinated. The topic of this thesis is to develop robot control systems for underwater robotics, so that the robot can navigate and execute tasks correctly in the terrain during an underwater mission.
- Requirements:
-   * Good programming skills in C/C++ or Java
-   * Basic knowledge of […]
- Contact: [[team:fereshta_yazdani|Fereshta Yazdani]]

+ <html><!--
+ == OpenEASE rendering […] ==
+ Implementing the rendering of the [[https://…]] […]
+ Requirements:
+   * Good C++ programming skills
+   * Familiar with Unreal Engine API
+   * Familiar with HTML5 and JavaScript
+   * Familiar with the [[https://…]] framework
+   * Familiar with basic ROS communication
+   * Familiar with version-control systems (git)
+   * Able to work independently with minimal supervision
+ Contact: [[team:andrei_haidu|Andrei Haidu]]
+ --></html>