UKU is a vehicle-sized mobile robot built by students of Tallinn University of Technology. A Gazebo simulator model has been created for experimenting with the UKU robot. This allows anyone, no matter where they are, to program the robot; if the program runs correctly in the simulator, it can simply be run on the real robot.
The UKU's main sensor is a lidar. Lidars are very good sensors for perceiving the surroundings, but they are also relatively expensive, and the price of capable lidars limits their use on personal robots. UKU uses a Velodyne VLP-16 3D lidar, which has 16 channels and a 360° horizontal field of view: it emits 16 vertically stacked laser beams in every direction. The same lidar is used on many self-driving cars and other sophisticated autonomous robot systems.
A lidar uses its laser beams to build a point cloud of the surroundings, which is very useful for autonomous navigation. The more channels a lidar has and the higher its resolution, the denser and more accurate the point cloud will be.
The Velodyne VLP-16 on top of the UKU robot is expensive, and for simpler applications cheaper lidars can be used instead. One of the most popular low-cost options is the RPLIDAR series: an RPLIDAR A1 costs around €100 and an RPLIDAR A3 around €600. The models differ mainly in their resolution and measurement frequency.
An RPLIDAR has a 360° field of view but only a single channel, so its output is two-dimensional. Such lidars are a good choice for navigating indoors, where the obstacles are mostly vertical surfaces such as walls.
Lidars can also be simulated in Gazebo. Many lidar manufacturers have published URDF models of their lidars on the Internet, which makes adding a lidar to a robot model much easier. The UKU robot simulation uses the official Velodyne VLP-16 URDF model.
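If the Velodyne model is needed in another project, the description package can typically be installed straight from the ROS repositories (the package name below assumes ROS Noetic; substitute your own distribution):
$ sudo apt install ros-noetic-velodyne-description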
The models and libraries needed to simulate the robot in Gazebo have already been created. For convenience, the entire ROS workspace has been uploaded to a git repository, so to use the simulation you only need to download the workspace and install the required libraries.
Let's start by cloning the repository:
$ sudo apt install git
$ git clone http://gitlab.robolabor.ee/ingmar05/uku_simulation_ws
Install the required libraries and compile:
$ cd uku_simulation_ws
$ rosdep install --from-paths src --ignore-src -r -y
$ catkin_make
Then source the workspace in the current terminal:
$ source devel/setup.bash
So that we don't have to source the workspace every time we open a new terminal window, we can add the following line to the .bashrc file:
$ echo "source ~/uku_simulation_ws/devel/setup.bash" >> ~/.bashrc
To avoid manually launching multiple nodes with the correct parameters each time, launch files (.launch) have been created. With a single command, a launch file starts roscore and several nodes with the correct parameters. A launch file called uku_vehicle.launch starts all the nodes needed to simulate the robot.
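To see which nodes uku_vehicle.launch would start, roslaunch can list them without launching anything:
$ roslaunch --nodes uku_vehicle_gazebo uku_vehicle.launch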
Run the simulation:
$ roslaunch uku_vehicle_gazebo uku_vehicle.launch
The Gazebo simulator and Rviz should now open. The UKU robot should be visible in the Gazebo virtual world.
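In another terminal window we can now inspect the data the simulated robot publishes. The point cloud topic name below, /velodyne_points, is the default of the common Velodyne Gazebo plugin and is an assumption here; rostopic list shows the actual names:
$ rostopic list
$ rostopic echo -n 1 /velodyne_points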
Using the Gazebo interface, we can add objects to the virtual world.
A keyboard teleoperation node for the robot is also included in the workspace. Launch the remote control node in another terminal window:
$ rosrun ackermann_drive_teleop keyop.py
Now we should be able to control the robot with the keyboard.
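Drive commands can also be published without the keyboard node, which is useful when testing your own programs. The keyop.py node uses ackermann_msgs, so a command like the following should move the robot; the topic name /ackermann_cmd is an assumption, check rostopic list for the actual one:
$ rostopic pub -r 10 /ackermann_cmd ackermann_msgs/AckermannDriveStamped '{drive: {speed: 1.0, steering_angle: 0.2}}'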
The launch file also started the Rviz visualization tool. In the Rviz window we can see the 3D lidar point cloud and the robot model.
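To verify that the simulated lidar publishes at the expected rate (10 Hz is a typical default for the VLP-16 plugin, and an assumption here), we can measure the frequency of the point cloud topic:
$ rostopic hz /velodyne_points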