====== Outdoor Mobile Robot UKU ======

Outdoor mobile robot UKU is a mid-sized robot platform whose main sensor is a //Velodyne VLP-16// Lidar. This page shows how to simulate UKU in the Gazebo simulator and how to drive the simulated robot around.
===== Sensors =====

**LiDAR**

The UKU's main sensor is Lidar. Lidars are very good sensors for sensing the surroundings, as they measure the distance to surrounding objects accurately and work regardless of lighting conditions. UKU uses a //Velodyne VLP-16// Lidar, whose main specifications are given in the table below.
^ Velodyne VLP-16 Specification ^^
^ Spec. ^ Value ^
| **Channels** | 16 |
| **Measurement Range** | up to 100 m |
| **Range Accuracy** | ±3 cm (typical) |
| **Field of View (Vertical)** | +15.0° to −15.0° (30°) |
| **Angular Resolution (Vertical)** | 2.0° |
| **Angular Resolution (Horizontal/Azimuth)** | 0.1° – 0.4° |
| **Rotation Rate** | 5 Hz – 20 Hz |
Lidar uses laser beams to obtain a point cloud of the surroundings, which the robot can use for tasks such as detecting obstacles and mapping its environment.

The //Velodyne VLP-16// is a relatively expensive sensor; considerably cheaper Lidars are also available, such as the //RPlidar// series.
//RPlidar A1// costs around 100 € and //RPlidar A3// around 600 €. Lidars differ mainly in their resolution and measurement frequency.
//RPlidar// has a 360° field of view but only one channel, so its output is two-dimensional. Such Lidars are a good choice for navigating indoors, where walls and other vertical objects are the main obstacles.

In the Gazebo simulator, it is also possible to simulate Lidars. Many Lidar manufacturers have published models (URDF) of their Lidars on the Internet, which makes adding a Lidar to your own robot model much easier. The official //Velodyne VLP-16// URDF model is also used in the UKU robot simulation.
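If the Velodyne model is not already bundled with the UKU workspace, it is usually provided by the //velodyne_description// package from the //velodyne_simulator// stack. The package and distribution names below are assumptions — adjust them to your ROS installation (the //rosdep// step in the next section may also resolve this dependency automatically).

$ rospack find velodyne_description
$ sudo apt install ros-noetic-velodyne-simulator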
===== UKU Simulation =====

The necessary models and libraries for experimenting with the robot have been created so that the robot can be simulated in Gazebo. For convenience, they are all kept in a single Git repository.
Let's start by cloning the repository:

$ sudo apt install git
$ git clone http://

Install the required libraries and compile:
$ rosdep install --from-paths src --ignore-src -r -y
$ catkin_make
$ source devel/setup.bash
So that we don't have to source the workspace manually every time we open a new terminal window, we can add the following line to the //.bashrc// file:
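The exact path depends on where the repository was cloned; the workspace location used below (~/catkin_ws) is only an assumption, so adjust it to match your setup.

source ~/catkin_ws/devel/setup.bash

After that, everything is ready to launch the UKU simulation: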
$ roslaunch uku_vehicle_gazebo uku_vehicle.launch
The Gazebo simulator and //Rviz// should now open, and the UKU robot should be visible in the Gazebo virtual world. Using the Gazebo interface, we can add objects to the virtual world.

A remote control node for the robot is also included in the workspace. Launch it in another terminal window:
$ rosrun ackermann_drive_teleop keyop.py
Now we should be able to control the robot with the keyboard.
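The teleop node steers the robot by publishing Ackermann drive commands. For quick testing, a constant command can also be published straight from the terminal. The topic name /ackermann_cmd used below is an assumption — first check the actual topic name with //rostopic list//.

$ rostopic list | grep -i ackermann
$ rostopic pub -r 10 /ackermann_cmd ackermann_msgs/AckermannDriveStamped "{drive: {speed: 1.0, steering_angle: 0.2}}"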
The launch file also started the //Rviz// visualization tool. When we open the //Rviz// window, we see the 3D Lidar point cloud and the robot model.
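To confirm that the simulated Lidar is publishing data, the point cloud topic can also be inspected from the terminal. The topic name /velodyne_points is the usual default of the Velodyne driver and its Gazebo plugin, but it is an assumption here — use //rostopic list// to see the actual topic names.

$ rostopic list | grep -i points
$ rostopic hz /velodyne_points
$ rostopic echo -n 1 /velodyne_points/header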