Unity and ROS: Keeping it real

Gaurav Gupta
March 5, 2023
Read time 4 mins

In an earlier blog post, I talked about the use of simulation for robotics application development. While my focus was on ground vehicles, the general principle of using simulation for ease of development and testing applies to all robotic systems. In this article, I’ll dive into what a game engine such as Unity can offer for the development of robot software, and the state of the art with regard to using Unity alongside ROS/ROS2.

Why Unity?

Most (if not all) autonomous robots today use visual information in some way, shape, or form — SLAM, visual odometry, object identification, lane detection, and the list goes on. Unfortunately, de facto standard simulators such as Gazebo fail to provide any meaningful photo-realism for the development and testing of vision-based algorithms.

Unity has been one of the two go-to game engines (the other being Unreal Engine) used extensively by developers for over a decade. This means there is a huge library of rich assets, extensive documentation, and a large user community to benefit from. In the context of simulating robots, this includes everything from modeling the robot design through articulated bodies, setting up several types of joints, and configuring physics parameters like friction, colors, and lighting, to modeling moving obstacles, along with a huge arsenal of objects to model the environment, and the list goes on.

An orchard-like world in Unity

Alright, now that we have a use case for Unity in robot simulations, an obvious question arises — how? Like many other things, there is no single correct answer. Multiple open-source packages integrate Unity with ROS/robotics in different ways. ros2-for-unity, for example, does a really good job of facilitating communication in a native ROS2 manner; ZeroSimROSUnity provides a rich arsenal of resources and ROS communication, but we found ourselves constrained when using it to develop new modules for our use case. Therefore, for now, we will be checking out the release from the horse’s mouth — the official Unity release for ROS integration. Let’s dive in!

Unity Robotics Hub

ROS-TCP-Connector (Connector) is the official Unity-ROS interface, developed and maintained by Unity Technologies themselves. Fundamentally, it solves the communication problem between the Unity world and the ROS world using TCP sockets. The setup involves only two steps —

  1. Unity side (Connector): Adding the connector package to Unity
  2. ROS side (Endpoint): Running the endpoint in a Docker container
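For reference, the ROS2-side endpoint is launched roughly as follows. This is a sketch based on the Unity-Technologies/ROS-TCP-Endpoint repository at the time of writing; verify the exact build and launch commands against its README, since package names and parameters may change between releases.

```
# Inside your ROS2 environment (or Docker container): clone the endpoint
# package into a colcon workspace and build it
git clone https://github.com/Unity-Technologies/ROS-TCP-Endpoint.git
colcon build && source install/setup.bash

# Run the endpoint, listening for the Unity-side Connector.
# ROS_IP / ROS_TCP_PORT must match the settings in Unity's
# Robotics > ROS Settings panel.
ros2 run ros_tcp_endpoint default_server_endpoint --ros-args \
    -p ROS_IP:=0.0.0.0 -p ROS_TCP_PORT:=10000
```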

The setup works out of the box without much change in configuration; the GitHub documentation covers it well. It lets you write ROS publishers/subscribers with standard and custom messages across the Unity-ROS boundary. Unity Robotics Hub also provides some other useful and complementary packages, let’s check them out —

  1. URDF Importer — This allows the user to directly import their existing robot descriptions to the Unity world in a fairly straightforward manner. Robots are imported as a familiar articulated body comprising links and joints as specified in the URDF.
  2. Visualization package — Used to visualize useful ROS messages in the Unity world, such as TF, 3D Lidar, and Twist commands. While RViz does this too, this module is useful for debugging and verifying that the expected data is being sent and received.
  3. Scripts and examples — The set of packages comes with some basic scripts such as RPM control, AGVControl (uses Twist commands to move a differential-drive robot), 2D laser scan, and more. They are accompanied by basic integration examples such as nav2-SLAM and robot-arm pick-and-place.
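To give a flavor of what a Unity-side publisher looks like, here is a minimal sketch using the ROS-TCP-Connector API (class and message names are as documented at the time of writing; the topic name and publish rate are illustrative choices, not anything prescribed by the package):

```csharp
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Geometry;
using UnityEngine;

// Attach to any GameObject: publishes its world position to a ROS topic.
public class PositionPublisher : MonoBehaviour
{
    ROSConnection ros;
    const string topicName = "unity/robot_position"; // illustrative name
    public float publishPeriod = 0.1f;               // ~10 Hz
    float timeElapsed;

    void Start()
    {
        // Grabs the scene's ROSConnection (talks to the ROS-TCP-Endpoint)
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<PointMsg>(topicName);
    }

    void Update()
    {
        timeElapsed += Time.deltaTime;
        if (timeElapsed < publishPeriod) return;
        timeElapsed = 0f;

        // Unity is left-handed (x right, y up, z forward); remap to the
        // ROS convention (x forward, y left, z up) before publishing.
        var p = transform.position;
        ros.Publish(topicName, new PointMsg(p.z, -p.x, p.y));
    }
}
```

Subscribing is symmetric: `ros.Subscribe<TwistMsg>("cmd_vel", callback)` hands each incoming message to a C# callback, which is how the bundled AGVControl-style scripts drive a robot.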

In addition to solving the paramount problem of Unity-ROS communication, I found the URDF importer pretty useful for getting robot visuals, collisions, and link connections seamlessly into Unity-land. The scripts are useful for showing some direction for more serious application development with these packages. Still, several questions and challenges remain before you can use the system with your ROS stack, namely —

  1. Non-native ROS2 communication — While the setup allows the user to choose between ROS1 and ROS2, and you do indeed get ROS2 data out of the ROS Endpoint, the communication between Connector and Endpoint is TCP-based. The Endpoint then relays these messages across the ROS2 DDS layer. This may seem trivial, but one major hindrance we found with the setup was the inability to control topic-specific Quality of Service (QoS), one of the most important attributes of ROS2.
  2. Not enough sensor plugins — The scripts and examples do not cover some of the more important (and trickier) sensors such as a 3D Lidar, RGB-D camera, or IMU. This means the user (in this case, the simulation developer) is expected to write efficient plugins for these sensors and package their output as appropriate ROS(2) topics.
  3. URDF importer can do more — In the current setup, the URDF importer does not take into account several parameters that the majority of developers are used to with ROS/Gazebo. This includes specifying sensors and their necessary attributes (like FoV, range, noise, etc.).
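To illustrate the second point, here is a hypothetical, heavily simplified sketch of what rolling your own 2D laser scanner from raycasts might look like. It assumes the ROS-TCP-Connector message bindings (`RosMessageTypes.Sensor.LaserScanMsg`); a real plugin would also need frame IDs, timestamps, noise modeling, and a controlled scan rate:

```csharp
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Sensor;
using UnityEngine;

// Casts rays in a full circle around the sensor and publishes the hit
// distances as a sensor_msgs/LaserScan. Simplified for illustration.
public class SimpleLaserScanner : MonoBehaviour
{
    ROSConnection ros;
    public string topicName = "scan";
    public int numBeams = 360;
    public float maxRange = 10f; // meters

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<LaserScanMsg>(topicName);
    }

    void FixedUpdate()
    {
        var ranges = new float[numBeams];
        for (int i = 0; i < numBeams; i++)
        {
            // Sweep beams around the sensor's vertical axis
            float angleDeg = i * 360f / numBeams;
            var dir = Quaternion.AngleAxis(angleDeg, transform.up)
                      * transform.forward;
            ranges[i] = Physics.Raycast(transform.position, dir,
                                        out RaycastHit hit, maxRange)
                        ? hit.distance
                        : maxRange; // no hit: report max range
        }

        var msg = new LaserScanMsg
        {
            angle_min = 0f,
            angle_max = 2f * Mathf.PI,
            angle_increment = 2f * Mathf.PI / numBeams,
            range_min = 0.1f,
            range_max = maxRange,
            ranges = ranges,
        };
        ros.Publish(topicName, msg);
    }
}
```

Even a toy plugin like this surfaces the practical issues mentioned above: per-beam raycasts are CPU-bound (a 3D Lidar would want batched or GPU-based casting), and the message goes out over the TCP bridge with no topic-level QoS control.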


Gaming engines have a lot to offer robotics, not least realistic scene rendering. However, using Unity to simulate your robotics stack takes some development effort, knowledge of C# (which, unlike C++/Python, is not exactly the primary language of ROS developers), and an understanding of core Unity concepts.

As time goes by, we expect camera-based algorithms and applications to become all the more prevalent in autonomous robot navigation. At Black Coffee Robotics, we have been actively addressing these issues to ease the transition from simulators like Gazebo to Unity for vision/AI-heavy applications. Here’s a little demo of a robot fleet carrying out intra-logistics operations in a warehouse setting!

If you’re looking to leverage some photo-realism and compute optimizations for your development, testing, and even demos, reach out to us!

| Copyright © 2023 Black Coffee Robotics