Self-driving Vehicle Simulation

Autonomous vehicle (AV) development is one of the top trends in the automotive industry, and the technology behind AVs has been evolving to make them safer. As a result, engineers face new challenges, especially in moving toward Society of Automotive Engineers (SAE) Levels 4 and 5. To put self-driving vehicles on public roads and evaluate the reliability of their technologies, they would have to be driven billions of miles, which would take far too long without the help of simulation. Furthermore, in light of past real-world AV crashes, high-fidelity simulators have become an efficient alternative for providing diverse testing scenarios for vehicle control and for validating safety before driving on real roads. High-resolution virtual environments can be built for simulators from real-world data captured with cameras or lidars, so that simulated scenarios match reality as closely as possible. Virtual environment development also makes it possible to customize and create various urban settings for testing the vehicle. Creating a virtual copy of an existing intelligent system is now a common approach known as a digital twin.

Simulator

Simulation has been widely used in vehicle manufacturing, particularly for mechanical behavior and dynamics analysis. However, AVs need more than that due to their nature. Simulating complex environments and scenarios that include other road users, with different sensor combinations and configurations, enables us to verify their decision-making algorithms. One of the most popular robotic simulation platforms is Gazebo. It integrates tightly with ROS (Robot Operating System) and provides physics engines and various sensor modules suitable for autonomous systems. However, there are several modern AV simulators built on powerful game engines that render scenes as realistically as possible, enabling high-fidelity simulation. Simulators such as SVL and CARLA are among the most popular choices today for simulating different scenarios.
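
As a concrete illustration of how such a scenario might be set up programmatically, the minimal sketch below uses the CARLA Python API to connect to a running simulator, spawn a vehicle, and attach an RGB camera sensor. The host, port, camera resolution, and output path are assumptions for illustration and would need to match an actual installation.

import carla

# Connect to a locally running CARLA server (host and port are assumptions).
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

blueprint_library = world.get_blueprint_library()

# Spawn a vehicle at one of the map's predefined spawn points.
vehicle_bp = blueprint_library.filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
vehicle = world.spawn_actor(vehicle_bp, spawn_point)

# Attach an RGB camera to the vehicle; resolution and mounting are example values.
camera_bp = blueprint_library.find("sensor.camera.rgb")
camera_bp.set_attribute("image_size_x", "1280")
camera_bp.set_attribute("image_size_y", "720")
camera_transform = carla.Transform(carla.Location(x=1.5, z=2.4))
camera = world.spawn_actor(camera_bp, camera_transform, attach_to=vehicle)

# Save each received frame to disk and let the built-in autopilot drive.
camera.listen(lambda image: image.save_to_disk(f"out/{image.frame:06d}.png"))
vehicle.set_autopilot(True)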

Virtual Environment

Nowadays, fierce competition in the gaming industry has brought many features to game engines. These engines can simulate physics and can therefore be exploited as simulators beyond game development. SVL and others have already taken advantage of these engines and created frameworks for testing autonomous vehicles within these physics simulators. However, even though these simulators provide basic tools and assets to get started, they are not enough on their own. To make the simulation more realistic, we need real-life terrain to be reproduced.

Data Collection and Processing

The first step of terrain generation is data collection over the desired area. Aerial imagery of the area to be mapped is captured with a drone. The images are taken along a grid-based flight path, which ensures that they capture different sides of each subject. To maximize coverage, the flight path is flown three times with different camera angles but at a constant altitude. All images are georeferenced and IMU-tagged so that positioning and orientation data are embedded in them for better stitching and later processing. A dense point cloud is then created from the captured images using third-party photogrammetry software.
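
As a rough illustration of what a grid-based (lawnmower) flight path looks like in practice, the following Python sketch generates survey waypoints over a rectangular area. The area size, line spacing, and altitude are purely illustrative values, not the parameters used in the actual data collection.

# Minimal sketch of a lawnmower-style waypoint grid for a drone survey.
def grid_waypoints(width_m, height_m, line_spacing_m, altitude_m):
    waypoints = []
    y = 0.0
    reverse = False
    while y <= height_m:
        # Alternate sweep direction on each pass to form the lawnmower pattern.
        xs = [width_m, 0.0] if reverse else [0.0, width_m]
        for x in xs:
            waypoints.append((x, y, altitude_m))
        reverse = not reverse
        y += line_spacing_m
    return waypoints

# Example: 200 m x 150 m area, 20 m between flight lines, constant 60 m altitude.
path = grid_waypoints(200.0, 150.0, 20.0, 60.0)
for x, y, z in path:
    print(f"waypoint: x={x:.1f} m, y={y:.1f} m, alt={z:.1f} m")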

Terrain Generation

Digitizing a real-life environment makes it possible to simulate AVs in countless different scenarios without ever taking the vehicle out on the road. Terrain generation from the point cloud is done directly in Unity. An in-house plugin reads a pre-classified point-cloud file and, based on chosen parameters, creates a normal map, a heightmap, and a color map that are used with Unity's terrain engine to build realistic environments.
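
The in-house Unity plugin is not described in detail here, so the following Python sketch only illustrates the general idea behind one of its outputs: rasterizing ground-classified points into a normalized heightmap grid of the kind Unity's terrain engine can import. The grid resolution, cell size, and synthetic input cloud are assumptions made for illustration only.

import numpy as np

def rasterize_heightmap(points, resolution=513, cell_size=1.0):
    # Rasterize ground points (N x 3 array of x, y, z) into a square heightmap.
    # Unity terrains typically expect heightmaps of size 2^n + 1 with heights in [0, 1].
    heightmap = np.zeros((resolution, resolution))
    counts = np.zeros_like(heightmap)

    origin = points[:, :2].min(axis=0)
    cols = ((points[:, 0] - origin[0]) / cell_size).astype(int).clip(0, resolution - 1)
    rows = ((points[:, 1] - origin[1]) / cell_size).astype(int).clip(0, resolution - 1)

    # Average the elevation of all points falling into each grid cell.
    np.add.at(heightmap, (rows, cols), points[:, 2])
    np.add.at(counts, (rows, cols), 1.0)
    heightmap = np.divide(heightmap, counts, out=heightmap, where=counts > 0)

    # Normalize to [0, 1]; empty cells simply remain at zero in this sketch.
    z_min, z_max = heightmap.min(), heightmap.max()
    return (heightmap - z_min) / max(z_max - z_min, 1e-6)

# Example with random synthetic points standing in for a classified ground cloud.
cloud = np.random.rand(100_000, 3) * np.array([512.0, 512.0, 40.0])
hm = rasterize_heightmap(cloud)
print(hm.shape, hm.min(), hm.max())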

In the following sections, we give a brief introduction to basic AV requirements and simulation.