Path Planning on Autoware: A Simulation tool (Part 3/3)

Simulation of Demo Data

Co-Authors: Makarand Mandolkar, Utkarsh Shukla

For context and links to the content covered so far, please have a look at the previous blogs: Blog 1 and Blog 2.

The GitHub repository can be found here.

PROJECT STATUS AND TASKS ACCOMPLISHED:

The more we explored, the more aware we became of the challenges involved in achieving the defined objectives. Nonetheless, we accomplished all the predefined tasks: Autoware installation and its exhaustive documentation, running the demo data (rosbag) on Autoware, implementation of OpenPlanner, creating and following waypoints in OpenPlanner, lane detection and lane changing, obstacle detection and the corresponding action, and signal detection and stopping. On realizing that more flexibility was needed for local planning and simulation, we also went ahead with learning and installing CARLA (Car Learning to Act). We found the Coursera course on self-driving cars by the University of Toronto quite helpful; without it, the implementation of the above mammoth task would have been far harder to comprehend. As we made progress on this journey, we also attempted signal detection and an A* implementation (link of the video), and with more time we would definitely be able to accomplish the same. Please follow the flow chart below to get an understanding of the process flow.

PROCESS FLOW:

Process flow

As we already know, Autoware was developed to extend the visualization and simulation capabilities of ROS. The entire process, in brief, is as follows: we initialize Autoware to run the sample Moriyama data (rosbag data), which was explained in Blog 2 and can also be found here. This is followed by initializing the various modules of Autoware, namely mapping, localization, mission planning, motion planning, detection and sensing. A comprehensive explanation can be found in this video.

Local Planning

The global path is defined in RViz through OpenPlanner, and the waypoints generated to cover the global path are stored in a text file. This text file is then read by the CARLA simulation to obtain the waypoints. CARLA is used to simulate a virtual environment that contains both stationary and dynamic obstacles. We simulated a fixed obstacle, for example a parked vehicle, and then implemented an obstacle avoidance algorithm to avoid it. Obstacle avoidance is implemented by considering the blocked point and the available paths the vehicle can take. The distances between the blocked point and each available path, and between the current lane and each available path, are calculated, and the path is then chosen according to the following rule: "shortest distance from the current lane to the available lane, and the lane farthest from the lane containing the obstacle."
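To make that selection rule concrete, here is a minimal Python sketch of the idea. It is only an illustration under stated assumptions, not our actual project code: the waypoint file format (one "x,y" pair per line), the representation of each lane by a single reference point near the blocked point, and the way the two distance criteria are combined into one cost are all assumptions made for the sketch.

# Minimal sketch: read OpenPlanner waypoints from a text file and pick an
# alternative lane around a blocked point. Helper names and the cost
# combination are illustrative assumptions, not the project's exact code.
import math

def load_waypoints(path):
    # Each line of the text file is assumed to hold one "x,y" waypoint.
    waypoints = []
    with open(path) as f:
        for line in f:
            x, y = map(float, line.strip().split(","))
            waypoints.append((x, y))
    return waypoints

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def choose_lane(current_lane, obstacle_lane, blocked_point, candidate_lanes):
    # Prefer the lane closest to the current lane while staying far from the
    # blocked point; the blocked lane itself is never a candidate.
    best_lane, best_cost = None, float("inf")
    for lane_id, lane_point in candidate_lanes.items():
        if lane_id == obstacle_lane:
            continue
        cost = (distance(lane_point, candidate_lanes[current_lane])
                - distance(lane_point, blocked_point))
        if cost < best_cost:
            best_lane, best_cost = lane_id, cost
    return best_lane

# Example: the ego vehicle is in lane 0, which also contains the parked vehicle.
lanes = {0: (0.0, 0.0), 1: (3.5, 0.0), 2: (7.0, 0.0)}
print(choose_lane(current_lane=0, obstacle_lane=0,
                  blocked_point=(0.0, 15.0), candidate_lanes=lanes))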

The introduction, installation and integration procedure for CARLA is as follows.

CARLA is an autonomous vehicle simulation software, developed from the ground up to support development, training, and validation of autonomous driving systems. The simulation platform supports flexible specification of sensor suites, environmental conditions, full control of all static and dynamic actors, map generation and much more. The following features of CARLA are useful to users.

Features of CARLA:

  1. Scalability via a server multi-client architecture: Multiple clients in the same or in different nodes can control different actors, which means CARLA is scalable across different platforms.
  2. Flexible API: CARLA exposes a powerful API that allows users to control all aspects related to the simulation, including traffic generation, pedestrian behaviors, weather, sensors, and much more (see the short sketch after this list).
  3. Autonomous driving sensor suite: Users can configure diverse sensor suites including LIDARs, multiple cameras, depth sensors and GPS, among others.
  4. Fast simulation for planning and control: This mode disables rendering to offer fast execution of traffic simulation and road behaviors for which graphics are not required.
  5. Maps generation: Users can easily create their own maps following the OpenDRIVE standard via tools like RoadRunner.
  6. Traffic scenarios simulation: The ScenarioRunner engine allows users to define and execute different traffic situations based on modular behaviors.
  7. ROS integration: CARLA is provided with ROS integration via the ROS bridge.
  8. Autonomous driving baselines: Runnable baseline agents are provided in CARLA, including an Autoware agent and a Conditional Imitation Learning agent.
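As a small illustration of the client/server architecture and the Python API mentioned above, the following sketch connects a client to a CARLA 0.9.x server on the default port 2000, spawns a vehicle at a random spawn point, and changes the weather. It is a hedged example assuming only that a CARLA server is already running; it is independent of the Autoware integration described below.

# Minimal CARLA 0.9.x Python API sketch: connect, spawn a vehicle, set weather.
# Assumes a CARLA server is already listening on localhost:2000.
import random
import carla

client = carla.Client("localhost", 2000)   # the script acts as a client of the server
client.set_timeout(10.0)
world = client.get_world()

# Spawn a random vehicle at one of the current town's predefined spawn points.
blueprint_library = world.get_blueprint_library()
vehicle_bp = random.choice(blueprint_library.filter("vehicle.*"))
spawn_point = random.choice(world.get_map().get_spawn_points())
vehicle = world.spawn_actor(vehicle_bp, spawn_point)

# Environmental conditions are controlled through the same API.
weather = carla.WeatherParameters(cloudiness=80.0, precipitation=30.0,
                                  sun_altitude_angle=20.0)
world.set_weather(weather)

print("spawned", vehicle.type_id, "in", world.get_map().name)
vehicle.destroy()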

Installing CARLA and Autoware Dependencies

Now coming to the second part: how to run CARLA in combination with the data we are using for our autonomous vehicle. There are many development features available in CARLA, such as map customization, adding assets, etc.

But for now we will focus on the simulator only, since our use case is just simulating the vehicle on the locally created map. The simulator can be run in two different modes:

  1. Server mode: The simulator is controlled by a client application that collects data and sends driving instructions. In this mode the simulator hangs until a client starts a connection.
  2. Standalone mode: The simulator starts in a sort of video-game mode in which you can control the vehicle with the keyboard.

For installing CARLA, you need to have Ubuntu 16.04 or a later version.

The following commands install the required dependencies. All of these commands require administrator permission.

sudo apt-get update
sudo apt-get install wget software-properties-common
sudo add-apt-repository ppa:ubuntu-toolchain-r/test
wget -O - https://apt.llvm.org/llvm-snapshot.gpg.key | sudo apt-key add -
sudo apt-add-repository "deb http://apt.llvm.org/xenial/ llvm-toolchain-xenial-7 main"
sudo apt-get update
sudo apt-get install build-essential clang-7 lld-7 g++-7 cmake ninja-build libvulkan1 python python-pip python-dev python3-dev python3-pip libpng16-dev libtiff5-dev libjpeg-dev tzdata sed curl unzip autoconf libtool rsync
pip2 install --user setuptools
pip3 install --user setuptools

[Note: On Ubuntu 18.04, use libpng-dev instead of libpng16-dev in the seventh command above.]

To avoid compatibility issues between Unreal Engine and the CARLA dependencies, the best configuration is to compile everything with the same compiler version and C++ runtime library. For running CARLA we need Unreal Engine 4.22 installed on the system, which is done with the following commands; just replace the path with your own.

git clone --depth=1 -b 4.22 https://github.com/EpicGames/UnrealEngine.git ~/UnrealEngine_4.22
cd ~/UnrealEngine_4.22
./Setup.sh && ./GenerateProjectFiles.sh && make

There is another method to install CARLA using the official GitHub repository, but that is a tedious process, so the installation method above is the recommended one.

Coming to the next step, one of the most important ones that we applied in our project: combining Autoware and CARLA. The following are the requirements for running this setup on your system:

  1. ROS kinetic
  2. Autoware (tested with 1.12.0)
  3. CARLA 0.9.6

To make a connection between Autoware and CARLA we need to set up a bridge between the two. The procedure for that is as follows:

cd catkin_ws
source <path-to-autoware>/install/setup.bash
catkin_init_workspace src/
# install dependencies
rosdep update
rosdep install -y --from-paths src --ignore-src --rosdistro $ROS_DISTRO
catkin_make

Now, to run CARLA and Autoware in parallel, the following procedure is used:

First run the CARLA server, and then start Autoware (including carla-ros-bridge and additional nodes).

The following are the commands for the two terminals:

#Terminal 1

# execute CARLA
# For details, please refer to the CARLA documentation
nvidia-docker run -p 2000-2001:2000-2001 -it --rm carlasim/carla:<carla-version> ./CarlaUE4.sh

#Terminal 2

export CARLA_AUTOWARE_ROOT=~/carla-autoware/autoware_launch
export CARLA_MAPS_PATH=~/carla-autoware/autoware_data/maps
source $CARLA_AUTOWARE_ROOT/../catkin_ws/devel/setup.bash
export PYTHONPATH=$PYTHONPATH:~/carla-python/carla/dist/carla-<carla-version>-py2.7-linux-x86_64.egg:~/carla-python/carla/
roslaunch $CARLA_AUTOWARE_ROOT/devel.launch

The above terminal commands will run both CARLA and Autoware. This will start the map provided by CARLA, named Town. How the combination of the two works can be seen in the following diagram.

Bridging
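Once the bridge is up, a quick way to confirm that CARLA data is actually reaching ROS is to subscribe to one of the bridge's topics. The small rospy sketch below does exactly that; the topic name /carla/ego_vehicle/odometry assumes the bridge's default ego vehicle role name, so check rostopic list and adjust it if your launch files use a different name.

#!/usr/bin/env python
# Minimal rospy sketch to confirm that carla-ros-bridge data reaches ROS.
# The topic name assumes the default role name "ego_vehicle"; verify with
# `rostopic list` on your setup.
import rospy
from nav_msgs.msg import Odometry

def on_odometry(msg):
    p = msg.pose.pose.position
    rospy.loginfo("ego vehicle at x=%.1f y=%.1f z=%.1f", p.x, p.y, p.z)

if __name__ == "__main__":
    rospy.init_node("carla_bridge_check")
    rospy.Subscriber("/carla/ego_vehicle/odometry", Odometry, on_odometry)
    rospy.spin()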

Since the open-source integration between CARLA and Autoware is not yet very mature, there is a known bug within Autoware that leads to errors if the simulation time is below 5 seconds (e.g. ray_ground_filter and ndt_matching die). Note that the simulation time is reset whenever you change the CARLA town (e.g. by executing carla_ros_bridge with the argument town:=Town01).

Results:

The final working of the implementation can be seen in this video.

WHAT NEXT?

We went ahead with implementing the A* algorithm, but we were not able to execute it completely as expected. The vehicle was not able to identify the lanes and drove over the obstacles, which it should not do. Further work will be done to correct the planned path within these constraints.

Vehicle moving over the obstacles and lanes (green rectangle represents the vehicle)
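For reference, the sketch below is a toy grid-based A* in Python that shows the behaviour we were aiming for: cells occupied by obstacles are never expanded, so a correctly working planner can never return a path that runs over them. It is a self-contained illustration on a small occupancy grid, not our Autoware/CARLA integration.

# Toy A* on a 2D occupancy grid (0 = free, 1 = obstacle). Blocked cells are
# never expanded, so the returned path cannot cross an obstacle.
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    open_set = [(heuristic(start), 0, start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_set:
        _, g, cell = heapq.heappop(open_set)
        if cell == goal:                      # reconstruct the path
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        if g > g_cost[cell]:                  # stale queue entry, skip it
            continue
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_g = g + 1
                if new_g < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = new_g
                    came_from[nxt] = cell
                    heapq.heappush(open_set, (new_g + heuristic(nxt), new_g, nxt))
    return None                               # no obstacle-free path exists

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))            # the path goes around the wall of 1s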

THE CONCLUSION

We hope that the autonomous vehicle community grows further in the near future. This entire process was an edifying journey, and we hope that more resources and open-source platforms are developed and made available to the masses to exploit the power of self-driving vehicles.

Though we tried our best, kindly excuse any grammatical mistakes, spelling mistakes, or typos that may still be found! Each video referred to and attached gives a more descriptive explanation of the actions to be initiated than writing the same would.

Thanks in advance!!

References:

1. https://www.autoware.org/

2. https://gitlab.com/autowarefoundation/autoware.ai/autoware/wikis/home

3. H. Darweesh, E. Takeuchi, K. Takeda, Y. Ninomiya, A. Sujiwo, Y. Morales, N. Akai, T. Tomizawa, and S. Kato, "Open Source Integrated Planner for Autonomous Navigation in Highly Dynamic Environments," Journal of Robotics and Mechatronics, Vol. 29, pp. 668–684, 2017. DOI: 10.20965/jrm.2017.p0668.

4. M. Likhachev, D. I. Ferguson, G. J. Gordon, A. Stentz, and S. Thrun, "Anytime Dynamic A*: An Anytime Replanning Algorithm," Proc. of the Int. Conf. on Automated Planning and Scheduling (ICAPS 2005), pp. 262–271, 2005.

5. S. Thrun, M. Montemerlo, H. Dahlkamp, D. Stavens, A. Aron, J. Diebel, P. Fong, J. Gale, M. Halpenny, G. Hoffmann, K. Lau, C. Oakley, M. Palatucci, V. Pratt, P. Stang, S. Strohband, C. Dupont, L.-E. Jendrossek, C. Koelen, C. Markey, C. Rummel, J. van Niekerk, E. Jensen, P. Alessandrini, G. Bradski, B. Davies, S. Ettinger, A. Kaehler, A. Nefian, and P. Mahoney, "Stanley: The robot that won the DARPA Grand Challenge," Journal of Field Robotics, Vol. 23, No. 1, pp. 661–692, June 2006.

6. https://www.coursera.org/specializations/self-driving-cars

7. https://www.youtube.com/watch?v=puOrnJJhpXE&t=7s

8. https://www.aitrends.com/ai-insider/simultaneous-localization-mapping-slam-ai-self-driving-cars

9. http://proceedings.mlr.press/v78/dosovitskiy17a/dosovitskiy17a.pdf

10. https://github.com/carla-simulator/carla-autoware

11. https://carla.readthedocs.io/en/latest/how_to_build_on_linux/
