
Indoor Mobile Robot

TurtleBot is a flexible robotic platform designed to work with ROS. Built from common components, TurtleBot is modular and therefore allows the user to create many different configurations. TurtleBot is the ideal platform for experimenting with and learning about ROS. It has a large user base, which means that many functions are freely available on the Internet in the form of ROS nodes. You don't have to write a single line of code to get TurtleBot to navigate a space autonomously; all you need to do is find the right nodes on the Internet and get them working.

The most recent and most developed version of the TurtleBot is version 3. TurtleBot3 is built from modular plates whose shape users can customize. It is available in three models: the small Burger and the medium-sized Waffle and Waffle Pi. TurtleBot3 consists of a base, two Dynamixel motors, a 1,800 mAh battery pack, a 360-degree LIDAR, a camera (a RealSense camera for the Waffle kit, a Raspberry Pi Camera for the Waffle Pi kit), an SBC (single-board computer: Raspberry Pi 3 or Intel Joule 570x), and a hardware mounting kit that attaches everything together and leaves room for future sensors. TurtleBot3 was released in May 2017.

Sensors

The main sensor of the TurtleBot is the 360 Laser Distance Sensor LDS-01. The LDS-01 is a 2D laser scanner capable of sensing 360 degrees that collects a set of distance measurements around the robot for use in SLAM (Simultaneous Localization and Mapping). The figure below shows how a robot sees the environment through a 360-degree laser sensor: the blue laser rays are reflected from the objects, the distances are measured, and finally a 2D point cloud of the environment is built.
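
Once the simulation (introduced below) or a real robot is running, the scan data is published on the /scan topic as sensor_msgs/LaserScan messages. As a quick sanity check, you can print a single message from the command line:

$ rostopic echo -n 1 /scan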

The 3D camera is one of the most versatile robot sensors. One output of a 3D camera is an ordinary 2D camera image, which means that various object recognition algorithms can be applied to it. Many machine vision libraries are available for ROS; one of the most widely used and versatile is OpenCV. In addition, state-of-the-art deep learning detectors such as You Only Look Once (YOLO) are available.

Objects are identified and a box is drawn around each one, labeled with the recognized object type:

The advantage of a 3D camera over a conventional camera is the depth dimension, which allows the robot to sense the distance to objects. This capability is essential for developing autonomous navigation.
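
If your model has a camera (the Waffle and Waffle Pi kits do, the Burger does not), you can view the image stream with the standard image_view tool. The topic name below is the one typically used by the TurtleBot3 camera plugins and may differ in your setup:

$ rosrun image_view image_view image:=/camera/rgb/image_raw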

TurtleBot3 Simulation

Install Dependent ROS 1 Packages

First of all, you need to install some dependencies. The following instructions assume Ubuntu 18.04 and ROS Melodic.

$ sudo apt-get install ros-melodic-joy ros-melodic-teleop-twist-joy \
ros-melodic-teleop-twist-keyboard ros-melodic-laser-proc \
ros-melodic-rgbd-launch ros-melodic-depthimage-to-laserscan \
ros-melodic-rosserial-arduino ros-melodic-rosserial-python \
ros-melodic-rosserial-server ros-melodic-rosserial-client \
ros-melodic-rosserial-msgs ros-melodic-amcl ros-melodic-map-server \
ros-melodic-move-base ros-melodic-urdf ros-melodic-xacro \
ros-melodic-compressed-image-transport ros-melodic-rqt* \
ros-melodic-gmapping ros-melodic-navigation ros-melodic-interactive-markers

Install TurtleBot3 Packages

Install TurtleBot3 via Debian Packages.

$ sudo apt-get install ros-melodic-turtlebot3-msgs
$ sudo apt-get install ros-melodic-turtlebot3
$ sudo apt-get install ros-melodic-turtlebot3-gazebo
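
To verify that the packages were installed correctly, you can ask ROS to locate one of them:

$ rospack find turtlebot3_gazebo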

Install Simulation Package

The TurtleBot3 Simulation Package requires the turtlebot3 and turtlebot3_msgs packages as prerequisites. Without these packages, the simulation cannot be launched.

$ cd ~/catkin_ws/src/
$ git clone -b melodic-devel https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
$ cd ~/catkin_ws && catkin_make
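
After the build finishes, source the workspace overlay so ROS can find the newly built packages. Appending the command to .bashrc makes it take effect in every new terminal:

$ source ~/catkin_ws/devel/setup.bash
$ echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc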

Set TurtleBot3 Model Name

Set the default TURTLEBOT3_MODEL name to your model. Enter the command below in a terminal.

In case of TurtleBot3 Burger:

$ echo "export TURTLEBOT3_MODEL=burger" >> ~/.bashrc

The above line writes export TURTLEBOT3_MODEL=burger to the .bashrc file in your home directory, so whenever you open a new terminal, “burger” is assigned to the TURTLEBOT3_MODEL variable.
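
To apply the setting in the current terminal without opening a new one, re-source .bashrc and check the variable:

$ source ~/.bashrc
$ echo $TURTLEBOT3_MODEL
burger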

Start the robot simulation

There are several predefined environments in which you can run the TurtleBot. Below you can see an empty world, a sample world, and a house world.

Empty World:

$ roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch

Sample World:

$ roslaunch turtlebot3_gazebo turtlebot3_world.launch

House:

$ roslaunch turtlebot3_gazebo turtlebot3_house.launch

In case you need to build your own environment, you can use the Gazebo building editor (Edit > Building Editor) to create a custom-shaped building.

In the Gazebo simulator, you can add simulation objects from the menu by clicking on the desired object. You'll also find tools for moving, scaling, and rotating objects in the same place.

Next, we try to control the robot remotely.

Operate TurtleBot3

In order to teleoperate the TurtleBot3 with the keyboard, launch the teleoperation node with the command below in a new terminal window.

$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

Using the keyboard, you should now see the TurtleBot move in the simulation.
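
The teleoperation node simply publishes geometry_msgs/Twist messages on the /cmd_vel topic, so you can also drive the robot directly from the command line. The velocities below are arbitrary example values; publishing an empty (all-zero) Twist stops the robot:

$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.1}, angular: {z: 0.2}}'
$ rostopic pub -1 /cmd_vel geometry_msgs/Twist '{}'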

Visualize Simulation Data (RViz)

RViz visualizes published topics while the simulation is running. You can launch RViz in a new terminal window by entering the command below.

$ roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch
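
You can also inspect what the simulation is publishing from the command line; topics such as /scan, /odom, and /cmd_vel should appear in the list:

$ rostopic list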

SLAM Simulation

When running SLAM (Simultaneous Localization and Mapping) in the Gazebo simulator, you can select or create various environments and robot models in the virtual world. We assume that you have already launched the turtlebot3_world.launch file as shown above.

Run SLAM Node

Open a new terminal and run the SLAM node. The Gmapping SLAM method is used by default. We already set the robot model to burger in the .bashrc file.

$ roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping
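
Gmapping publishes the growing map on the /map topic as nav_msgs/OccupancyGrid messages. You can confirm that the map is being produced with:

$ rostopic info /map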

Run Teleoperation Node

Open a new terminal with Ctrl + Alt + T and run the teleoperation node. Move the robot around and take a look at RViz: you are scanning the area with the laser sensor and building a map out of it.

$ roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch
Control Your TurtleBot3!
---------------------------
Moving around:
       w
  a    s    d
       x
        
w/x : increase/decrease linear velocity
a/d : increase/decrease angular velocity
space key, s : force stop
 
CTRL-C to quit

Save the Map

When the map is created successfully, open a new terminal with Ctrl + Alt + T and save the map.

$ rosrun map_server map_saver -f ~/map
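
map_saver writes two files to your home directory: map.pgm (the occupancy grid image) and map.yaml (its metadata). The metadata looks roughly like the following; the exact values depend on your map:

image: map.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196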

Navigation Simulation

Just like SLAM in the Gazebo simulator, you can select or create various environments and robot models in a virtual Navigation world. However, a proper map has to be prepared before running Navigation.

1. Launch Simulation World

Stop all previously launched files and running nodes with Ctrl + C. In the previous SLAM section, the turtlebot3_world.world file was used to create the map. The same Gazebo environment will be used for Navigation, so run the simulation with the same world file:

$ roslaunch turtlebot3_gazebo turtlebot3_world.launch

2. Run Navigation Node

$ roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml

3. Estimate Initial Pose

Initial pose estimation must be performed before setting a destination for autonomous navigation, as this process initializes the AMCL parameters that are critical for Navigation. TurtleBot3 has to be correctly located on the map, with the LDS sensor data neatly overlapping the displayed map.

3.1. Click the 2D Pose Estimate button in the RViz menu.

3.2. Click on the map where the actual robot is located and drag the large green arrow toward the direction where the robot is facing.

You can repeat steps 3.1 and 3.2 until the LDS sensor data is overlaid on the saved map.

NB! 2D Pose Estimate actually publishes the position and orientation to the /initialpose topic, so you can also publish the initial pose from the command line or a script.
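
For example, a minimal command-line sketch (the coordinates are placeholder values for the turtlebot3_world map; a real script would also set a sensible covariance):

$ rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
'{header: {frame_id: "map"}, pose: {pose: {position: {x: -2.0, y: -0.5}, orientation: {w: 1.0}}}}'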

4. Set Navigation Goal

4.1 Click the 2D Nav Goal button in the RViz menu.

4.2 Click on the map to set the destination of the robot and drag the green arrow toward the direction where the robot will be facing.
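
Like 2D Pose Estimate, the 2D Nav Goal button is just a publisher: it sends a geometry_msgs/PoseStamped message to the /move_base_simple/goal topic, so a goal can also be set from the command line (example coordinates):

$ rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
'{header: {frame_id: "map"}, pose: {position: {x: 0.5, y: 0.5}, orientation: {w: 1.0}}}'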