====== Outdoor Mobile Robot ======
  
The outdoor mobile robot UKU is a mid-sized self-driving vehicle built by students of Tallinn University of Technology for educational and research purposes. A Gazebo simulator model has been created for experimenting with the UKU robot. This allows anyone, no matter where they are, to program the robot: if a program runs correctly in the simulator, it can simply be run on the real robot.
  
  
{{ :en:ros:simulations:uku.jpg?600 |}}
  
===== Sensors =====
  
**LiDAR**
  
The UKU's main sensor is a LiDAR. LiDARs are very good sensors for sensing the surroundings, but they are also relatively expensive, and their price limits the use of capable LiDARs on personal robots. UKU uses a 3D LiDAR, the //Velodyne VLP-16//. It has 16 channels and a 360° field of view, meaning it emits 16 vertically spread laser beams in every horizontal direction. The same LiDAR is used on many self-driving vehicles and other sophisticated autonomous robot systems.
  
^  Velodyne VLP-16 Specification                                                                                |||
^ Spec.                                        ^ Value                                                           ||
| **Channels**                                 | 16                      | {{ :et:ros:simulations:vlp.jpg?200 }}  |
| **Measurement Range**                        | 100 m                   | :::                                    |
| **Range Accuracy**                           | Up to ±3 cm (Typical)   | :::                                    |
| **Field of View (Vertical)**                 | +15.0° to -15.0° (30°)  | :::                                    |
| **Angular Resolution (Vertical)**            | 2.0°                    | :::                                    |
| **Angular Resolution (Horizontal/Azimuth)**  | 0.1° – 0.4°             | :::                                    |
| **Rotation Rate**                            | 5 Hz – 20 Hz            | :::                                    |
  
A LiDAR uses its laser beams to build a point cloud of the surroundings, which is very useful for autonomous navigation. The more channels the LiDAR has and the higher its resolution, the denser and more accurate the point cloud will be. For example, at a 0.2° horizontal resolution the VLP-16 returns 16 × 360°/0.2° = 28 800 points per revolution, i.e. roughly 288 000 points per second at 10 Hz.
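
As a quick illustration of how this point cloud is consumed in ROS, here is a minimal sketch of a node that subscribes to the VLP-16 output and reports the nearest return. It assumes the Velodyne driver's conventional //sensor_msgs/PointCloud2// topic ''/velodyne_points''; check ''rostopic list'' if your setup differs.

<code python>
#!/usr/bin/env python
# Minimal sketch: read the simulated VLP-16 point cloud and report the
# distance of the nearest return. /velodyne_points is the Velodyne
# driver's conventional topic name; verify it with `rostopic list`.
import math

import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2


def callback(cloud):
    nearest = float('inf')
    # Iterate over (x, y, z) tuples, skipping invalid (NaN) returns.
    for x, y, z in pc2.read_points(cloud, field_names=('x', 'y', 'z'),
                                   skip_nans=True):
        nearest = min(nearest, math.sqrt(x * x + y * y + z * z))
    if nearest < float('inf'):
        rospy.loginfo('nearest return: %.2f m', nearest)


if __name__ == '__main__':
    rospy.init_node('vlp16_listener')
    rospy.Subscriber('/velodyne_points', PointCloud2, callback)
    rospy.spin()
</code>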
  
{{ :et:ros:simulations:lidardrive.gif |}}

The //Velodyne VLP-16// LiDAR on top of the UKU robot is expensive; simpler and cheaper LiDARs can be used in less demanding applications. One of the most popular affordable LiDARs is [[https://www.slamtec.com/en/Lidar/A3|RPlidar]]: the //RPlidar A1// costs around 100 € and the //RPlidar A3// around 600 €, and the models differ mainly in resolution and measurement frequency.

{{ :en:ros:simulations:rplidar.png?400 |}}
  
//RPlidar// has a 360° field of view but only one channel, so its output is two-dimensional. Such LiDARs are a good choice for navigating indoors, where walls and other vertical objects are the main obstacles.
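
For comparison with the 3D case above, a single-channel LiDAR such as the //RPlidar// publishes a flat //sensor_msgs/LaserScan// (one planar ring of range readings) instead of a point cloud. A minimal sketch of reading it, assuming the conventional ''/scan'' topic name:

<code python>
#!/usr/bin/env python
# Minimal sketch: read a 2D LiDAR such as the RPlidar. The single channel
# yields one planar ring of range readings per revolution.
import rospy
from sensor_msgs.msg import LaserScan


def callback(scan):
    # ranges[] holds one distance per horizontal angular step.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo('%d readings, closest obstacle: %.2f m',
                      len(valid), min(valid))


if __name__ == '__main__':
    rospy.init_node('rplidar_listener')
    rospy.Subscriber('/scan', LaserScan, callback)
    rospy.spin()
</code>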
  
{{ :en:ros:simulations:phone-8k.gif |}}
  
In the Gazebo simulator, it is also possible to simulate LiDARs. Many LiDAR manufacturers have published URDF models of their sensors on the Internet, which makes adding a LiDAR to your own robot model much easier. The official //Velodyne VLP-16// URDF model is also used in the UKU robot simulation.
  
  
===== UKU Simulation =====
  
To simulate the robot in Gazebo, the necessary models and libraries for experimenting with it have been created. For convenience, the entire ROS workspace has been uploaded to a git repository: simply download the workspace and install the required libraries to use the simulation.
  
Let's start by cloning the repository:
  
   $ sudo apt install git
   $ git clone http://gitlab.robolabor.ee/ingmar05/uku_simulation_ws.git
Install the required libraries and compile:
  
   $ cd uku_simulation_ws
   $ rosdep install --from-paths src --ignore-src -r -y
   $ catkin_make
   $ source devel/setup.bash
  
So that we don't have to source the workspace every time we open a new terminal window, we can add the following line to the //.bashrc// file:
   source ~/uku_simulation_ws/devel/setup.bash

Now we can launch the simulation:
   $ roslaunch uku_vehicle_gazebo uku_vehicle.launch
  
The Gazebo simulator and //Rviz// should now open, and the UKU robot should be visible in the Gazebo virtual world. Using the Gazebo interface, we can add objects to the virtual world.
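
Objects can also be added from code. The following sketch spawns a static 1 m box into the running world through the standard ''gazebo_ros'' service ''/gazebo/spawn_sdf_model''; the model name, size, and pose are arbitrary example values.

<code python>
#!/usr/bin/env python
# Minimal sketch: spawn a static 1 m box into the running Gazebo world,
# as a scripted alternative to adding objects through the GUI.
import rospy
from gazebo_msgs.srv import SpawnModel
from geometry_msgs.msg import Pose, Point, Quaternion

# The obstacle described inline in SDF.
BOX_SDF = """
<sdf version="1.6">
  <model name="box">
    <static>true</static>
    <link name="link">
      <collision name="collision">
        <geometry><box><size>1 1 1</size></box></geometry>
      </collision>
      <visual name="visual">
        <geometry><box><size>1 1 1</size></box></geometry>
      </visual>
    </link>
  </model>
</sdf>
"""

if __name__ == '__main__':
    rospy.init_node('spawn_box')
    rospy.wait_for_service('/gazebo/spawn_sdf_model')
    spawn = rospy.ServiceProxy('/gazebo/spawn_sdf_model', SpawnModel)
    # Place the box 5 m in front of the world origin (example values).
    pose = Pose(position=Point(5.0, 0.0, 0.5),
                orientation=Quaternion(0.0, 0.0, 0.0, 1.0))
    spawn(model_name='box', model_xml=BOX_SDF, robot_namespace='',
          initial_pose=pose, reference_frame='world')
</code>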
  
{{ :et:ros:simulations:giphy.gif |}}
  
A robot remote control unit is also included with the workspace. Launch the remote control node in another window:
  
   $ rosrun ackermann_drive_teleop keyop.py
Now we should be able to control the robot with the keyboard.
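
The robot can also be driven from code rather than the keyboard. Here is a minimal sketch that publishes a constant command as //ackermann_msgs/AckermannDriveStamped//, the message type used by ''keyop.py''; the topic name ''/ackermann_cmd'' is an assumption, so check ''rostopic list'' for the one your launch file uses.

<code python>
#!/usr/bin/env python
# Minimal sketch: drive UKU programmatically by publishing a constant
# Ackermann command (gentle left curve) at 10 Hz until the node is stopped.
# The topic name /ackermann_cmd is an assumption; verify with `rostopic list`.
import rospy
from ackermann_msgs.msg import AckermannDriveStamped

if __name__ == '__main__':
    rospy.init_node('uku_driver')
    pub = rospy.Publisher('/ackermann_cmd', AckermannDriveStamped,
                          queue_size=1)
    rate = rospy.Rate(10)
    cmd = AckermannDriveStamped()
    cmd.drive.speed = 1.0            # forward speed in m/s
    cmd.drive.steering_angle = 0.2   # steering angle in rad (left)
    while not rospy.is_shutdown():
        cmd.header.stamp = rospy.Time.now()
        pub.publish(cmd)
        rate.sleep()
</code>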
  
{{ :et:ros:simulations:rviz480.gif |}}
        
The launch file also started the //Rviz// visualization tool. When we open the //Rviz// window, we see the 3D LiDAR point cloud and the robot model.

{{ :et:ros:simulations:gazebo480.gif |}}