Visualizing the SICK Laser in simulation
The goal of this tutorial is to visualize the SICK laser (front_laser_points) in rviz (and gzclient) to understand how the topic works. The approach uses a more complex world file populated with artificial barriers and structures.
Launch the simulator
Ensure you have sourced the devel/setup.bash file. Then launch the neighborhood tutorial.
source devel/setup.bash
roslaunch catvehicle catvehicle_neighborhood.launch
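If you want to confirm that the laser data is flowing before opening any visualization, you can run an optional sanity check from another sourced terminal. This assumes the scan is published on /catvehicle/front_laser_points, the same topic we add to rviz below:
rostopic list | grep front_laser
rostopic hz /catvehicle/front_laser_points
Once Gazebo is running, rostopic hz should report a steady publishing rate.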
Visualize in gzclient
Open a new tab and start the Gazebo client:
gzclient
You should see several house-like structures as well as some jersey barriers, with the sensor fields interrupted by these obstacles. Because the structures interrupt the sensor data, you may notice that your real-time factor (RTF) drops below 1.0. That's OK for these tutorials.
Visualize in rviz
In a new tab, run
rosrun rviz rviz
Then, load your catvehicle_tutorial.rviz (or similar) file to recover the sensor displays from the other tutorials. You can rearrange/zoom/pan until you get a view similar to the image.
At this point, you should be able to make out the jersey barrier, but you won't see the houses yet. That's because we're looking at the Velodyne sensor here; let's visualize the SICK laser sensor now.
In the Displays panel, select Add->By Topic->/catvehicle/front_laser_points/LaserScan
Like the PointCloud display, these data points are hard to see at their default size, so make them a little bigger (set the Size to 0.25 or 0.5).
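If you want to see the raw data behind this display, you can print a single message and the message definition from another terminal; the Add dialog above indicates the topic is of type sensor_msgs/LaserScan:
rostopic echo -n 1 /catvehicle/front_laser_points
rosmsg show sensor_msgs/LaserScan
The ranges array holds one distance per beam, so beams that hit the barriers or houses should report noticeably shorter values than beams that see nothing.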
If you now compare the Gazebo client window with the rviz window, you should be able to see how the outlines of the structures and jersey barriers match up between the two. With this in mind, let's close gzclient to save a few processor cycles.
Some of these squares are different colors, by the way, because the default coloring is by intensity return, which is lower on the sides of the structures.
Now we can see that the sensor return values reflect our simulated environment. Next we'll move the vehicle around, so we can see how these sensor points change as the vehicle moves.
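As a quick preview of that next step, you can nudge the vehicle from the command line. This is only a sketch, assuming the vehicle accepts geometry_msgs/Twist commands on /catvehicle/cmd_vel (check rostopic list if your setup uses a different topic name):
# /catvehicle/cmd_vel is an assumption; verify the topic name with rostopic list
rostopic pub -r 10 /catvehicle/cmd_vel geometry_msgs/Twist '{linear: {x: 1.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'
Stop the publisher with Ctrl-C; while it runs, the LaserScan points in rviz should sweep past the vehicle as it drives.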
azcar_sim (deprecated)
The instructions below are from v1 and will be removed.
Launch the tutorial
Ensure you have sourced the devel/setup.bash file. Then:
roslaunch azcar_sim azcar_neighborhood.launch
Look at the gzclient visualization
In a new tab, run
gzclient
You should see several house-like structures as well as some jersey barriers, with the sensor fields interrupted by these obstacles. Because the structures interrupt the sensor data, you may notice that your real-time factor (RTF) drops below 1.0. That's OK for these tutorials.
Look at the rviz visualization
In a new tab, run
source devel/setup.bash
rosrun rviz rviz
Then, load your azcar_sim.rviz file to recover the sensor displays from the other tutorials. You can rearrange/zoom/pan until you get a view similar to the image. At this point, you should be able to make out the jersey barrier, but you won't see the houses yet. That's because we're looking at the Velodyne sensor here; let's visualize the SICK laser sensor now.
In the Displays panel, select
Add->By Topic->/azcar_sim/front_laser_points/LaserScan
Like the PointCloud display, these data points are hard to see at their default size, so make them a little bigger (set the Size to 0.25 or 0.5).
If you now compare the Gazebo client window with the rviz window, you should be able to see how the outlines of the structures and jersey barriers match up between the two. With this in mind, let's close gzclient to save a few processor cycles.
Some of these squares are different colors, by the way, because the default coloring is by intensity return, which is lower on the sides of the structures.
Now we can see that the sensor return values reflect our simulated environment. Next we'll move the vehicle around, so we can see how these sensor points change as the vehicle moves.