====== Perception Tutorial ======
  
Tutorial describing the first steps needed to start processing data from the Kinect, in the form of Point Clouds, using tools included in ROS Fuerte.

==== 1. Prerequisites ====

Follow the steps described in [[software:ros:installation|Recommended procedure for installing ROS]] and [[software:ros:getting-started|Getting Started with ROS]].
  
If you are using a PrimeSense device (Xbox Kinect, Asus Xtion), make sure //ros-fuerte-openni-kinect// is installed. You can do this by running the following command in a terminal:
  sudo apt-get install ros-fuerte-openni-kinect
  
Check out the code presented at the seminar into your ROS workspace from:\\
[[https://github.com/ai-seminar/perception-tutorials.git]]\\
A prerecorded bag file can be found in ../tutorial_pkg/data/.
  
==== 2. Viewing and recording data from the Kinect ====
  
== Bag files and Rviz ==
For a PrimeSense device, connect it to your PC and run:
  roslaunch openni_launch openni.launch
Run rviz:
  rosrun rviz rviz
Add a new PointCloud2 display type to rviz and choose // /camera/depth_registered/points // in the topic field. If rviz reports errors, change //Fixed Frame// in //Global Options// from //<Fixed Frame>// to one of the frames available in the dropdown list (e.g. /camera_link).
  
Use //rosbag// to record data in a bag file, e.g.:
  rosbag record /camera/depth_registered/points /tf
Note: /tf is needed if you want to view the recorded data using rviz.
Play back a bag file using:
  rosbag play filename.bag --loop
More detail and a more elegant way of saving data from a Kinect to bag files can be found [[http://www.ros.org/wiki/openni_launch/Tutorials/BagRecordingPlayback|here]].
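If you would rather read a recording from your own code than play it back, the rosbag C++ API can iterate over the stored messages directly. This is only a minimal sketch and not part of the seminar code; the topic name matches the recording command above and the file name is a placeholder:

<code cpp>
// Minimal sketch (not from the seminar code): read recorded clouds back
// with the rosbag C++ API instead of playing the bag file.
#include <cstdio>
#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <sensor_msgs/PointCloud2.h>

int main()
{
  rosbag::Bag bag;
  bag.open("filename.bag", rosbag::bagmode::Read);

  // Only look at the point cloud topic that was recorded above.
  rosbag::View view(bag, rosbag::TopicQuery("/camera/depth_registered/points"));
  for (rosbag::View::iterator it = view.begin(); it != view.end(); ++it)
  {
    sensor_msgs::PointCloud2::ConstPtr cloud = it->instantiate<sensor_msgs::PointCloud2>();
    if (cloud)
      printf("cloud: %u x %u points\n", cloud->width, cloud->height);
  }
  bag.close();
  return 0;
}
</code>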
  
In order to save Point Clouds to *.pcd files, run:
  rosrun pcl_ros pointcloud_to_pcd /input:=/camera/depth_registered/points

== Image_view ==
View the RGB image:
  rosrun image_view image_view image:=/camera/rgb/image_color
View the depth image:
  rosrun image_view image_view image:=/camera/depth/image
 +   
== From a ROS node ==

To see how to subscribe to a point cloud topic from a ROS node, take a look at //subscriber.cpp// in the source code.
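The general pattern is a node that registers a callback for //sensor_msgs/PointCloud2// messages and converts them to a PCL cloud. The sketch below is illustrative only; the node name, callback, and the use of //pcl::fromROSMsg// (Fuerte-era header //pcl/ros/conversions.h//) are assumptions, not necessarily what //subscriber.cpp// does:

<code cpp>
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/ros/conversions.h>   // pcl::fromROSMsg in the Fuerte-era PCL

// Called once for every cloud published on the subscribed topic.
void cloudCallback(const sensor_msgs::PointCloud2::ConstPtr& msg)
{
  pcl::PointCloud<pcl::PointXYZRGB> cloud;
  pcl::fromROSMsg(*msg, cloud);
  ROS_INFO("Received a cloud with %d points", (int)cloud.points.size());
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "cloud_subscriber");
  ros::NodeHandle nh;
  // Queue size 1: drop old clouds if processing is slower than the camera.
  ros::Subscriber sub = nh.subscribe("/camera/depth_registered/points", 1, cloudCallback);
  ros::spin();
  return 0;
}
</code>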
  
==== 3. Processing Point Clouds ====

The following steps are presented in //tutorial.cpp//; for a more detailed description, check the links attached to each item. A condensed sketch of the whole pipeline follows the list.
  * reading in a pcd file [[http://pointclouds.org/documentation/tutorials/reading_pcd.php#reading-pcd|link]]
  * using PCLVisualizer [[http://pointclouds.org/documentation/tutorials/cloud_viewer.php#cloud-viewer|link]]
  * removing NaNs
  * calculating normals [[http://pointclouds.org/documentation/tutorials/normal_estimation.php#normal-estimation|link]]
  * filtering based on axes [[http://pointclouds.org/documentation/tutorials/passthrough.php#passthrough|link]]
  * downsampling the Point Cloud [[http://pointclouds.org/documentation/tutorials/voxel_grid.php#voxelgrid|link]]
  * RANSAC plane fitting and extracting indices [[http://pointclouds.org/documentation/tutorials/planar_segmentation.php#planar-segmentation|link1]] [[http://pointclouds.org/documentation/tutorials/extract_indices.php#extract-indices|link2]]
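As a rough map of how several of these steps fit together (reading a PCD file, removing NaNs, pass-through filtering along z, voxel-grid downsampling, RANSAC plane fitting, and extracting the non-planar points), here is a condensed sketch. File names and parameter values are illustrative; //tutorial.cpp// and the linked PCL tutorials are the reference.

<code cpp>
// Condensed, illustrative pipeline; parameter values are placeholders.
#include <vector>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/filters/filter.h>                 // removeNaNFromPointCloud
#include <pcl/filters/passthrough.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/segmentation/sac_segmentation.h>

int main()
{
  typedef pcl::PointXYZ PointT;
  pcl::PointCloud<PointT>::Ptr cloud(new pcl::PointCloud<PointT>);

  // 1. Read a cloud saved earlier, e.g. with pointcloud_to_pcd.
  if (pcl::io::loadPCDFile<PointT>("input.pcd", *cloud) < 0)
    return -1;

  // 2. Remove NaN points so the later steps only see valid measurements.
  std::vector<int> valid;
  pcl::removeNaNFromPointCloud(*cloud, *cloud, valid);

  // 3. Pass-through filter: keep points between 0.5 m and 1.5 m along z.
  pcl::PassThrough<PointT> pass;
  pass.setInputCloud(cloud);
  pass.setFilterFieldName("z");
  pass.setFilterLimits(0.5, 1.5);
  pass.filter(*cloud);

  // 4. Downsample with a 1 cm voxel grid.
  pcl::PointCloud<PointT>::Ptr downsampled(new pcl::PointCloud<PointT>);
  pcl::VoxelGrid<PointT> voxel;
  voxel.setInputCloud(cloud);
  voxel.setLeafSize(0.01f, 0.01f, 0.01f);
  voxel.filter(*downsampled);

  // 5. Fit a plane with RANSAC.
  pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
  pcl::SACSegmentation<PointT> seg;
  seg.setOptimizeCoefficients(true);
  seg.setModelType(pcl::SACMODEL_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setDistanceThreshold(0.01);
  seg.setInputCloud(downsampled);
  seg.segment(*inliers, *coefficients);

  // 6. Extract everything that is NOT on the plane (e.g. objects on a table).
  pcl::PointCloud<PointT>::Ptr objects(new pcl::PointCloud<PointT>);
  pcl::ExtractIndices<PointT> extract;
  extract.setInputCloud(downsampled);
  extract.setIndices(inliers);
  extract.setNegative(true);
  extract.filter(*objects);

  pcl::io::savePCDFileBinary("objects.pcd", *objects);
  return 0;
}
</code>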
  
For further questions contact: //**balintbe** at **tzi** dot **de**//