TIAGo++ Basic Capabilities

I have already written about a more sophisticated use case in my VQA article, but I briefly wanted to showcase a few basic capabilities of the platform as well. First of all: the robot I’m talking about is a PAL Robotics TIAGo++ with two arms:

TIAGo++ serving a (fake, non-liquid) Martini

Besides operating in the real world, the robot can also be used in simulators such as Gazebo or Webots; the next image is from Webots, the remaining ones are from Gazebo:

TIAGo++ in Webots simulator

The robot is bi-manual, comes with a navigation laser and an ASUS Xtion RGB-D camera, its arms have 7 degrees of freedom, and the grippers are replaceable with full five-fingered hands. Since it is already challenging enough to reliably pick up objects, the grippers should suffice for now. It can also vary its height (from 110 to 145 cm), has a differential drive, and comes with a docking station it can drive back to in order to recharge, even though the battery packs usually last an entire day. Furthermore, it has an optional LIDAR and an NVIDIA Jetson, which I still have to work with.

Some of the things you can do right out of the box include Simultaneous Localization and Mapping (SLAM) to understand your environment as you drive around.

SLAM in Gazebo Simulation Environment
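The mapping itself is provided out of the box, so there is little to write yourself. Just to give a flavour of how you might peek at the result from your own node, here is a minimal sketch that subscribes to the occupancy grid, assuming the map is published on the standard ROS `/map` topic (the topic name may differ depending on the launch files used):

```python
#!/usr/bin/env python
# Minimal sketch: inspect the occupancy grid produced while mapping.
# Assumes the standard /map topic (nav_msgs/OccupancyGrid); the actual
# topic name depends on the launch configuration and is an assumption here.
import rospy
from nav_msgs.msg import OccupancyGrid

def on_map(msg):
    info = msg.info
    known = sum(1 for cell in msg.data if cell >= 0)  # -1 means "unknown"
    rospy.loginfo("map %dx%d @ %.2f m/cell, %d cells observed so far",
                  info.width, info.height, info.resolution, known)

rospy.init_node("map_listener")
rospy.Subscriber("/map", OccupancyGrid, on_map)
rospy.spin()
```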

TIAGo++ comes with a comparably sophisticated tech stack. While I would have wished for it to run ROS 2 already, I understand that providing ROS 2 requires rewriting a large part of the provided software and is thus still elusive. However, having to fall back to Python 2 at times is a serious nuisance in my opinion, particularly considering that this language version has been sunset since January 1st, 2020. Still, once you have made your peace with that aspect, you can enjoy how easy it is to do things like recognizing objects in the environment…

Traditional Object Detection by matching features and descriptors
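For readers unfamiliar with this classical approach, the idea is to extract keypoints and descriptors from a stored template of the object and match them against the current camera image. A minimal OpenCV sketch using ORB features and brute-force matching (the file names are placeholders, not the pipeline actually running on the robot):

```python
# Classical object recognition sketch: ORB keypoints + brute-force descriptor
# matching between a stored template and a camera frame. File names are
# placeholders for illustration only.
import cv2

template = cv2.imread("object_template.png", cv2.IMREAD_GRAYSCALE)
scene = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_s, des_s = orb.detectAndCompute(scene, None)

# Hamming distance is the appropriate metric for binary ORB descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_s), key=lambda m: m.distance)

if matches:
    print("kept %d matches, best distance %.0f"
          % (len(matches), matches[0].distance))
```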

…or ArUco markers:

ArUco Marker Detection in the environment (robot identifies particular marker in array of markers)
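On the robot, the provided ROS nodes take care of marker detection and also publish the marker poses. Detecting the markers in a single image is nevertheless easy to reproduce with plain OpenCV (requires the contrib aruco module; the dictionary and input image below are assumptions, and newer OpenCV versions may require the ArucoDetector class instead):

```python
# ArUco detection sketch with plain OpenCV. The dictionary choice and the
# input image are assumptions for illustration.
import cv2

frame = cv2.imread("camera_frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    for marker_id, c in zip(ids.flatten(), corners):
        center = c[0].mean(axis=0)  # mean of the four corner pixels
        print("marker %d at pixel center %s" % (marker_id, center))
```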

Similarly, face detection via OpenCV is easy to integrate (here based on an AdaBoost cascade of Haar features)…

Face Detection

…and so is person detection…

…and basic tracking.
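The face detector mentioned above is essentially a few lines of OpenCV. A minimal sketch using the Haar cascade shipped with OpenCV (the input image is a placeholder, and the cascade path may differ depending on how OpenCV is installed):

```python
# Face detection sketch with OpenCV's bundled Haar cascade (an AdaBoost
# cascade of Haar features). The input image is a placeholder.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("camera_frame.png")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("faces_annotated.png", img)
```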

It gets more interesting when you interpret the environment in 3D. An Octomap, for instance, allows you to detect collisions and thus perform motion planning in 3D, e.g. to avoid hitting the table. The coarse-grained structure is easy to plan over and is particularly powerful in combination with packages like MoveIt, which take the burden of inverse kinematics (IK) off you and provide a convenient plan to get into a particular position without collisions.

Octomap for Motion Planning and Collision Checking in 3D Space
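To illustrate how little code this takes, here is a hedged MoveIt sketch that asks the planner for a collision-free path to a Cartesian goal. The planning group name ("arm_torso"), the reference frame and the target pose are assumptions made for this example; the Octomap built from the RGB-D camera is picked up automatically for collision checking:

```python
#!/usr/bin/env python
# MoveIt sketch: plan and execute a collision-free motion to a Cartesian goal.
# Planning group name, frame and pose values are assumptions for illustration.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("reach_demo")

group = moveit_commander.MoveGroupCommander("arm_torso")

goal = PoseStamped()
goal.header.frame_id = "base_footprint"
goal.pose.position.x = 0.5
goal.pose.position.y = 0.0
goal.pose.position.z = 0.9
goal.pose.orientation.w = 1.0

group.set_pose_target(goal)
success = group.go(wait=True)  # plan and execute in one call
group.stop()
group.clear_pose_targets()
rospy.loginfo("reached goal: %s", success)
```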

As a final example of this, let me demonstrate a pick-and-place operation which combines a few of these capabilities. As you can see, the initial ArUco marker detection, grasp generation, motion planning and pickup work well. However, after an abrupt movement the robot drops the object, even though its belief state still says the object is in hand. The fallen object ends up with its ArUco marker facing sideways, which means that even once the robot notices its error, with this simple pipeline it is unable to detect the object again and resolve the situation by itself.

TIAGo Object Pickup
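One cheap way to catch such a failure, which is not part of the pipeline shown above, would be to check the gripper joints after the pick: an empty gripper that closes completely is a strong hint that the grasp failed. A minimal sketch, assuming the standard `/joint_states` topic and guessing at the finger joint names and closure threshold:

```python
#!/usr/bin/env python
# Hypothetical grasp check: after closing the gripper, read the finger joint
# positions from /joint_states. If both fingers are (nearly) fully closed,
# the gripper is probably empty. Joint names and threshold are assumptions.
import rospy
from sensor_msgs.msg import JointState

FINGER_JOINTS = ["gripper_left_finger_joint", "gripper_right_finger_joint"]
CLOSED_THRESHOLD = 0.005  # remaining finger travel in metres, needs tuning

def grasp_probably_failed(timeout=2.0):
    msg = rospy.wait_for_message("/joint_states", JointState, timeout=timeout)
    positions = dict(zip(msg.name, msg.position))
    return all(positions.get(j, 1.0) < CLOSED_THRESHOLD for j in FINGER_JOINTS)

rospy.init_node("grasp_check")
if grasp_probably_failed():
    rospy.logwarn("gripper closed completely - object probably dropped")
```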

The robot comes with many more capabilities, such as playback of prerecorded motions (waving, reaching out its hand/gripper, fully extending its arms, etc.), speech synthesis, and being led around by a human pulling or pushing its arms. It is also interesting that, besides MoveIt, you can directly use its trajectory controllers to manually control the individual motors. A neat third way of controlling movement, used for the VQA demo linked above, is its Whole Body Controller (WBC), which not only provides a nice API that makes it easy to look at or point at 3D points, but also avoids self-collisions. While it might not be obvious to people unfamiliar with robotics, it is quite challenging to avoid hitting your own body when freely controlling actuators; the WBC not only avoids this by planning, but will also rotate the head out of the way when moving the arm, which is just cool. In summary, this is one of the coolest toys I have had the pleasure of working with, and while the VQA demo was a nice warmup exercise, I’m excited about the things to come.
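To make the "look at a 3D point" convenience a bit more concrete: even without the WBC, this can be done at the ROS level with the standard control_msgs PointHead action. A minimal sketch, where the action name and frame names are assumptions for this example:

```python
#!/usr/bin/env python
# "Look at a 3D point" sketch via the standard control_msgs PointHead action.
# Action name and frame names are assumptions; the Whole Body Controller
# offers a similar but collision-aware interface.
import rospy
import actionlib
from control_msgs.msg import PointHeadAction, PointHeadGoal

rospy.init_node("look_at_point")
client = actionlib.SimpleActionClient(
    "/head_controller/point_head_action", PointHeadAction)
client.wait_for_server()

goal = PointHeadGoal()
goal.target.header.frame_id = "base_footprint"
goal.target.point.x = 1.0   # a point one metre in front of the robot
goal.target.point.z = 1.0
goal.pointing_frame = "xtion_rgb_optical_frame"  # camera frame (assumed name)
goal.pointing_axis.z = 1.0  # align the camera's optical axis with the target
goal.min_duration = rospy.Duration(0.5)

client.send_goal_and_wait(goal)
```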

Other Robots In My Life

It is interesting that, besides TIAGo++, at least two additional major robots have found their way into my life: a Boston Dynamics Spot Arm from a friendly lab down the street, as well as the vacuum robot in my apartment complex.
