Robot software: Beyond algorithms

Gaurav Gupta
March 5, 2023

My first pull into robotics, perhaps like most people's, was through planning algorithms. Over my early years in research and industry, I developed a solid understanding of, and interest in, other core robotics concepts such as mapping, localization (SLAM), control, and computer vision. As time progressed, I was (and still am) often tasked with developing, deploying, and maintaining software systems for robots with efficiency and scalability in mind. This journey has helped me understand and value some of the lesser-emphasized components that go into autonomous robots. This article is an homage to them.

Hardware Interfaces

While almost all popular vendors provide ROS/ROS2 interfaces for their sensors, there are still a plethora of issues to overcome.

  1. There is a wide range of SBCs (Single Board Computers) to choose from, and the differences don't stop there: each SBC has several official partners providing their own unique carrier boards. Ensuring that vendor-provided ROS drivers, or any other custom interfaces, work reliably on the application-specific SBC, start on boot, and function as desired can still be a challenge.
  2. While sensor interfaces have been largely standardized in the form of ROS drivers, the mechanism to standardize actuator interfaces hasn't quite been cracked yet. Depending on the application and several other considerations, one of many communication protocols may be used: CAN bus, Modbus, and EtherCAT are among the more common ones in the world of robotics.
  3. Setting up a working system is one thing; maintaining it is quite another. A wide variety of hardware errors occur when robots are out in the field: overheating interfaces, loose cables, voltage spikes, and storage getting overrun, to name a few.

While specific expertise goes into solving each of the above, at some level all of these problems require a good understanding of the Linux kernel.
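To make the maintenance side concrete, here is a minimal health-check sketch of the kind a watchdog service on the robot might run. The thermal-zone path and the thresholds are assumptions, not universal values; they must be tuned for the specific SBC and deployment.

```python
import shutil
from pathlib import Path

# Assumed paths/thresholds -- adjust for the target SBC and deployment.
THERMAL_ZONE = Path("/sys/class/thermal/thermal_zone0/temp")
DISK_USAGE_LIMIT = 0.90   # alert when the filesystem is 90% full
TEMP_LIMIT_C = 85.0       # alert threshold in degrees Celsius

def disk_usage_fraction(path="/"):
    """Fraction of the filesystem at `path` currently used."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total

def cpu_temperature_c():
    """CPU temperature in Celsius, or None if the sysfs node is absent."""
    try:
        # The kernel reports millidegrees Celsius in this file.
        return int(THERMAL_ZONE.read_text().strip()) / 1000.0
    except (OSError, ValueError):
        return None

def health_alerts():
    """Return a list of human-readable alerts for telemetry/logging."""
    alerts = []
    if disk_usage_fraction() > DISK_USAGE_LIMIT:
        alerts.append("disk: storage nearly full")
    temp = cpu_temperature_c()
    if temp is not None and temp > TEMP_LIMIT_C:
        alerts.append(f"thermal: CPU at {temp:.1f} C")
    return alerts
```

A script like this can run from a systemd timer and feed its alerts into whatever telemetry channel the robot already uses.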

Software deployment and maintenance — DevOps

In the early stages of development, it is fairly common practice to manually copy the ROS workspace to the robot, compile it, and delete the source files for privacy. This is poor practice in the long run. Hence the need to maintain software versions and releases, and to have a streamlined way to push software updates to the robots; this can also be challenging, since robots often operate in poor network zones.
Containerizing robot code and using CI/CD pipelines from GitHub/GitLab are commonly observed practices as robot deployments scale. Another alternative is the use of AWS services, particularly RoboMaker. In my limited experience with AWS RoboMaker, I've found it to be a mix of several other AWS services, complicated to grasp, and potentially expensive to operate. Regardless, it is reportedly used successfully by some major robotics companies and is definitely an option to consider if the application and budget permit.
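Because over-the-air updates routinely fail in poor network zones, an updater service on the robot usually wraps the actual update step in retry logic. Below is a sketch of retry-with-exponential-backoff; `pull_image` is a hypothetical stand-in for whatever command fetches the new release (here, a pinned container image tag).

```python
import random
import subprocess
import time

def with_retries(action, attempts=5, base_delay=2.0, max_delay=120.0):
    """Run `action()` until it succeeds, sleeping with exponential
    backoff (full jitter) between failures. Re-raises the last error."""
    for attempt in range(attempts):
        try:
            return action()
        except Exception:
            if attempt == attempts - 1:
                raise
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(random.uniform(0, delay))

def pull_image(tag):
    # Hypothetical update step: pull a pinned container image tag.
    subprocess.run(["docker", "pull", tag], check=True)

# On the robot, an updater service might call:
#   with_retries(lambda: pull_image("registry.example.com/amr:1.4.2"))
```

Pinning an explicit version tag (rather than `latest`) is what makes rollbacks and fleet-wide version tracking possible.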

DevOps Pipeline (Courtesy)

Telemetry and Analysis

There are several good reasons to monitor the state of a remotely deployed robot. Sharing robot performance insights and analytics with customers gives them a better picture of their investment. It is also remarkably useful for developers to access live, and especially recorded, data from the robot. Since the development of autonomous robots is rarely finished (there is always another bug to fix or a new feature to release), the value of ROS bags for analyzing and improving the system can't be overstated.

Telemetry from the robot comes in two forms:

  1. Real-time — This is useful for monitoring the current state of the robot, more like a live stream from the machine. It could include the camera feed, vital sensor status, the current task undertaken by the robot, and the state/percentage completion of that task. Several software services are built precisely for this goal — Formant, Freedom Robotics, and InOrbit, to name a few.
  2. Bulk transfer — This is of particular use to software developers. An autonomous robot in the field depends on sensor data such as multiple camera feeds, point clouds from LiDAR, IMU, GPS, etc. Put together, these generate several gigabytes of data per minute! To understand and fix any issues that occur with the robot, all of these data streams are worth preserving and relaying back to the developer for analysis. Transferring such a huge amount of data typically involves some level of compression and a secure upload to a reliable server, such as an AWS S3 bucket.
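The compress-then-upload step can be sketched with the standard library plus boto3. The function names and the IAM setup are assumptions for illustration; real pipelines would also chunk uploads and resume after network drops.

```python
import gzip
import shutil
from pathlib import Path

def compress_log(src, dst=None):
    """Gzip a recorded data file (e.g. a ROS bag) before upload.
    Returns the path of the compressed file."""
    src = Path(src)
    dst = Path(dst) if dst else src.with_suffix(src.suffix + ".gz")
    with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout)
    return dst

def upload_to_s3(path, bucket, key):
    # Assumes boto3 is installed and AWS credentials are configured
    # on the robot (e.g. via an IAM role scoped to this bucket).
    import boto3
    boto3.client("s3").upload_file(str(path), bucket, key)
```

Gzip helps most on highly redundant streams (logs, repeated message headers); already-compressed camera feeds gain little, which is one reason formats and codecs are usually chosen per stream.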

State and Task management

While core robotics concepts like SLAM, planning, and control remain largely unchanged across a range of robots, the actual mission statement of each robot is quite different. For example, an AMR in a given factory environment could be assigned a set of 100 start-goal pairs in advance and execute them serially. In another setting, such tasks could be generated at run-time, with instructions to charge the batteries whenever no task is pending. Even in this simple scenario, the fundamental principles that enable the robot to move autonomously remain unchanged, but the software itself must differ.
These differences are magnified further when the industries the AMR serves vary. A plant process for packaging water bottles, for example, is remarkably different from sorting parcels, which in turn is remarkably different from machining an engine part. While the core navigation stack remains unchanged across these applications, the state and task management performed by the robot's decision-making engine keeps changing. Developers often use tools like state machines or behavior trees to achieve the desired outcome for a given application.
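The run-time tasking scenario above can be captured in a toy state machine. The states and transitions here are illustrative only, not a production design; real deployments typically reach for libraries like SMACH or BehaviorTree.CPP rather than hand-rolling this.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    NAVIGATING = auto()
    CHARGING = auto()

class AmrStateMachine:
    """Toy decision engine: execute tasks when available,
    otherwise go charge if the battery is low."""

    def __init__(self):
        self.state = State.IDLE
        self.current_goal = None

    def step(self, task=None, battery_low=False, goal_reached=False):
        """Advance one tick given the latest inputs; returns the new state."""
        if self.state is State.IDLE:
            if task is not None:
                self.current_goal = task
                self.state = State.NAVIGATING
            elif battery_low:
                self.state = State.CHARGING
        elif self.state is State.NAVIGATING:
            if goal_reached:
                self.current_goal = None
                self.state = State.IDLE
        elif self.state is State.CHARGING:
            if not battery_low:
                self.state = State.IDLE
        return self.state
```

Swapping industries then means swapping this layer — the states, tasks, and transition rules — while the navigation stack underneath stays put.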

Behavior tree modelling the search and grasp plan of a two-armed robot. (Courtesy)


In this article, I highlighted some of the less discussed software components that go into making a deployable autonomous robot. No doubt the fundamental concepts and superior algorithms affect the performance of a robotics system, but in the absence of adept engineering, it will at best remain a research prototype.

Copyright © 2023 Black Coffee Robotics