en:ros:simulations:turtlebot [2020/03/23 08:56] – tomykalm
====== TurtleBot ======
TurtleBot is a flexible robotic platform designed to work with ROS. Built from common components, TurtleBot is modular and therefore allows the user to create many different configurations. TurtleBot is the ideal platform for experimenting with and learning about ROS. TurtleBot has many users, which means that many functions are freely available on the Internet in the form of ROS nodes. You don't have to write a single line of code to get a TurtleBot to navigate a space autonomously; all you need to do is find the right nodes on the Internet and get them working.
{{ : }}
The most recent and most developed version of the TurtleBot is version 3. TurtleBot3 is built from modular plates whose shape users can customize. It is available in three models: the small Burger and the medium-sized Waffle and Waffle Pi. TurtleBot3 consists of a base, two Dynamixel motors, a 1,800 mAh battery pack, a 360-degree LIDAR, a camera (a RealSense camera for the Waffle kit, a Raspberry Pi Camera for the Waffle Pi kit), an SBC (single-board computer: Raspberry Pi 3 or Intel Joule 570x), and a hardware mounting kit that attaches everything together and accommodates future sensors. TurtleBot3 was released in May 2017.
{{ : }}

===== Sensors =====
The main sensor of the TurtleBot is the 360 Laser Distance Sensor LDS-01. The LDS-01 is a 2D laser scanner capable of sensing 360 degrees that collects a set of data around the robot to use for SLAM (Simultaneous Localization and Mapping). The figure below shows how a robot sees the environment through a 360-degree laser sensor. The blue laser rays are reflected from the objects, the distances are measured, and finally a 2D point cloud of the environment is built.
{{ : }}

The 3D camera is one of the most versatile robot sensors. One output of a 3D camera is a 2D camera image, which means that various object-recognition algorithms can be used. Many machine vision libraries are available for ROS; one of the most widely used and versatile is [[https://]].
Objects will be identified and a box will be drawn around them, labeled with the name of the identified object type:
{{ : }}
The advantage of a 3D camera over a conventional camera is the depth dimension. This allows the robot to sense the distance to objects, which in turn enables autonomous navigation.
{{ : }}
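As a sketch of how the depth dimension turns pixels into distances, the snippet below back-projects a pixel with a measured depth into a 3D point using the standard pinhole camera model. The intrinsics (fx, fy, cx, cy) are made-up example values; real ones come from the camera's calibration.

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters into a 3D point
    (x, y, z) in the camera frame (pinhole camera model)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the optical center maps to a point straight ahead:
p = pixel_to_point(320, 240, 2.0, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
# p == (0.0, 0.0, 2.0)
```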
===== TurtleBot3 Simulation =====
==== Install Dependent ROS 1 Packages ====
First of all, you need to install some dependencies. These instructions are based on Ubuntu 18.04 and ROS Melodic.
  $ sudo apt-get install ros-melodic-joy ros-melodic-teleop-twist-joy \
    ros-melodic-teleop-twist-keyboard ros-melodic-laser-proc \
    ros-melodic-rgbd-launch ros-melodic-depthimage-to-laserscan \
    ros-melodic-rosserial-arduino ros-melodic-rosserial-python \
    ros-melodic-rosserial-server ros-melodic-rosserial-client \
    ros-melodic-rosserial-msgs ros-melodic-amcl ros-melodic-map-server \
    ros-melodic-move-base ros-melodic-urdf ros-melodic-xacro \
    ros-melodic-compressed-image-transport ros-melodic-rqt* \
    ros-melodic-gmapping ros-melodic-navigation ros-melodic-interactive-markers
==== Install TurtleBot3 Packages ====
Install TurtleBot3 via Debian packages.
  $ sudo apt-get install ros-melodic-turtlebot3
  $ sudo apt-get install ros-melodic-turtlebot3-msgs
==== Install Simulation Package ====
The TurtleBot3 simulation package requires the //turtlebot3// and //turtlebot3_msgs// packages as prerequisites.
  $ cd ~/catkin_ws/src/
  $ git clone -b melodic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
  $ cd ~/catkin_ws && catkin_make

==== Set TurtleBot3 Model Name ====
Set the default //TURTLEBOT3_MODEL// environment variable to the model you are using.
In case of TurtleBot3 Burger:
  $ echo "export TURTLEBOT3_MODEL=burger" >> ~/.bashrc
The above line writes //export TURTLEBOT3_MODEL=burger// to your //.bashrc// file, so the model is set in every new terminal session.
==== Start the robot simulation ====
There are several predefined environments in which you can run the TurtleBot. Below you can see an empty world, a sample world, and a house.
Empty world:
  $ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch

{{ : }}

Sample world:
  $ roslaunch turtlebot3_gazebo turtlebot3_world.launch

{{ : }}

House:
  $ roslaunch turtlebot3_gazebo turtlebot3_house.launch

{{ : }}
+ | |||
+ | |||
+ | In case you need to build your environment, | ||
+ | {{ : | ||
+ | |||
+ | |||
In the Gazebo simulator, you can add simulation objects from the menu by clicking on the desired object. You'll also find tools for moving, enlarging, and rotating objects in the same place.

{{ : }}

Next, we try to control the robot remotely.
==== Operate TurtleBot3 ====
To teleoperate the TurtleBot3 with the keyboard, launch the teleoperation node with the command below in a new terminal window.

  $ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
Using the keyboard, we should now see how the TurtleBot moves in the simulation.
===== Visualize Simulation Data (RViz) =====
RViz visualizes published topics while the simulation is running. You can launch RViz in a new terminal window with the command below.

  $ roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch
+ | |||
+ | {{ : | ||
+ | |||
==== SLAM Simulation ====
When running SLAM (Simultaneous Localization and Mapping) in the Gazebo simulator, you can select or create various environments and robot models in the virtual world. We assume that you have already launched one of the simulation worlds above.
+ | |||
+ | **Run SLAM Node** | ||
+ | |||
+ | Open a new terminal, and run the SLAM node. G mapping SLAM method is used by default. We already set the robot model to burger in the //.bashrc// file. | ||
+ | |||
+ | $ roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods: | ||
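To build some intuition for the mapping half of SLAM, here is a deliberately simplified Python sketch (our own toy code, not what gmapping actually runs): each scan endpoint marks the corresponding cell of an occupancy grid as occupied. Real gmapping additionally ray-traces the free space along each beam and corrects the robot pose as it goes.

```python
def mark_hits(points, resolution=0.05, size=100):
    """Toy map update: mark the grid cell containing each scan
    endpoint (x, y in meters, robot at the grid center) as occupied."""
    grid = [[0] * size for _ in range(size)]
    origin = size // 2
    for x, y in points:
        col = origin + int(round(x / resolution))
        row = origin + int(round(y / resolution))
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1
    return grid

# Two obstacles: 1 m ahead (+x) and 1 m to the left (+y):
grid = mark_hits([(1.0, 0.0), (0.0, 1.0)])
```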
+ | |||
+ | **Run Teleoperation Node** | ||
+ | |||
+ | Open a new terminal with Ctrl + Alt + T and run the teleoperation node. Move the robot and take a look at RViz, you are scanning the area by the laser sensor and creating a map out of it. | ||
+ | |||
+ | $ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch | ||
+ | |||
+ | Control Your TurtleBot3! | ||
+ | --------------------------- | ||
+ | Moving around: | ||
+ | w | ||
+ | a s d | ||
+ | x | ||
+ | |||
+ | w/x : increase/ | ||
+ | a/d : increase/ | ||
+ | space key, s : force stop | ||
+ | |||
+ | CTRL-C to quit | ||
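The key bindings above can be summarized as a small sketch of the update logic (our own illustration; the step sizes and the Burger's 0.22 m/s and 2.84 rad/s limits are taken from the robot's published specifications, but verify them against your teleop node):

```python
LIN_STEP, ANG_STEP = 0.01, 0.1   # assumed per-keypress increments
MAX_LIN, MAX_ANG = 0.22, 2.84    # TurtleBot3 Burger velocity limits

def apply_key(key, lin, ang):
    """Return updated (linear, angular) target velocities after one keypress."""
    if key == 'w':
        lin = min(lin + LIN_STEP, MAX_LIN)
    elif key == 'x':
        lin = max(lin - LIN_STEP, -MAX_LIN)
    elif key == 'a':
        ang = min(ang + ANG_STEP, MAX_ANG)
    elif key == 'd':
        ang = max(ang - ANG_STEP, -MAX_ANG)
    elif key in (' ', 's'):
        lin, ang = 0.0, 0.0          # force stop
    return lin, ang

lin, ang = apply_key('w', 0.0, 0.0)  # one step forward
lin, ang = apply_key('s', lin, ang)  # force stop -> (0.0, 0.0)
```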
+ | |||
+ | **Save the Map** | ||
+ | |||
+ | When the map is created successfully, | ||
+ | $rosrun map_server map_saver -f ~/map | ||
+ | |||
+ | {{ : | ||
+ | |||
==== Navigation Simulation ====
Just like SLAM, in the Gazebo simulator you can select or create various environments and robot models for Navigation. However, a proper map has to be prepared before running Navigation.
+ | |||
+ | **1. Launch Simulation World** | ||
+ | |||
+ | Stop all previous launch files and running the node by "Ctrl + C". In the previous SLAM section, " | ||
+ | |||
+ | $ roslaunch turtlebot3_gazebo turtlebot3_world.launch | ||
+ | |||
+ | **2. Run Navigation Node** | ||
+ | |||
+ | $ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file: | ||
+ | |||
+ | **3. Estimate Initial Pose** | ||
+ | |||
+ | Initial Pose Estimation must be performed before defining the destination for autonomous navigation as this process initializes the AMCL parameters that are critical in Navigation. TurtleBot3 has to be correctly located on the map with the LDS sensor data that neatly overlaps the displayed map. | ||
+ | |||
+ | 3.1. Click the 2D Pose Estimate button in the RViz menu. | ||
+ | {{ : | ||
+ | |||
+ | 3.2. Click on the map where the actual robot is located and drag the large green arrow toward the direction where the robot is facing. | ||
+ | |||
+ | {{ : | ||
+ | |||
+ | You can repeat steps 1 and 2 until the LDS sensor data is overlayed on the saved map. | ||
+ | |||
+ | {{ : | ||
+ | |||
+ | NB! 2D Pose Estimate actually publishes the position and direction into the /initial pose topic. So you can publish it by the command line or a script. | ||
+ | |||
+ | **4. Set Navigation Goal** | ||
+ | |||
+ | 4.1 Click the 2D Nav Goal button in the RViz menu. | ||
+ | |||
+ | {{ : | ||
- | To avoid having | + | 4.2 Click on the map to set the destination of the robot and drag the green arrow toward the direction where the robot will be facing. |
- | Run TurtleBot Visualization //Rviz//: | + | {{ :en: |
- | $ roslaunch turtlebot_rviz_launchers view_robot.launch | + | *This green arrow is a marker that can specify the destination of the robot. |
+ | *The root of the arrow is the x, y coordinate of the destination, | ||
+ | *As soon as x, y, θ are set, TurtleBot3 will start moving to the destination immediately. | ||
{{ : }}
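Under the hood, the pose and goal messages store the heading θ not as an angle but as a quaternion. For planar motion only two quaternion components are non-zero; the conversion below is standard math (the helper name is our own):

```python
import math

def yaw_to_quaternion(theta):
    """Convert a planar heading theta (radians) into the (z, w)
    components of the quaternion used in ROS pose messages;
    x and y stay zero for motion in the plane."""
    return math.sin(theta / 2.0), math.cos(theta / 2.0)

z, w = yaw_to_quaternion(0.0)  # facing +x: (0.0, 1.0)
```

This is the value pair RViz fills in for you when you drag the green arrow.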