Every robot arm has a geometric fingerprint, determined by the arrangement of its joints. And according to new research, every such arm falls into one of just six categories.
That is the finding from EPFL's Learning Algorithms and Systems Laboratory (LASA). Their new framework, called Kinematic Intelligence, uses this taxonomy to transfer skills between very different machines: a Duatic DynaArm, a KUKA LBR iiwa 7, and a Neura Robotics Maira M. No machine learning required. Just the fingerprint and algebra.
The fingerprint has a more precise name: singularity topology, the pattern of configurations where a robot's joints lock up or where small motions of the hand demand impossibly large joint motions. The researchers classified every possible three-revolute robot, the standard configuration in most factory arms, into one of six categories based on the structure of its singularities. Once you know which category your robot falls into, the math tells you exactly where those danger zones are and how to plan around them. Different robot, same singularity map, different movement plan.
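To make "danger zones" concrete: a singularity is a configuration where the arm's Jacobian loses rank, so some hand directions become unreachable. This is not the paper's classification algebra, just a minimal sketch of singularity detection for a generic planar three-revolute arm, using the standard Yoshikawa manipulability measure (link lengths here are arbitrary illustrative values):

```python
import numpy as np

def jacobian_planar_3r(q, lengths=(1.0, 1.0, 1.0)):
    """Positional Jacobian (2x3) of a planar 3R arm: how joint
    velocities map to end-effector velocity."""
    l1, l2, l3 = lengths
    q1, q2, q3 = q
    s1, s12, s123 = np.sin(q1), np.sin(q1 + q2), np.sin(q1 + q2 + q3)
    c1, c12, c123 = np.cos(q1), np.cos(q1 + q2), np.cos(q1 + q2 + q3)
    return np.array([
        [-l1*s1 - l2*s12 - l3*s123, -l2*s12 - l3*s123, -l3*s123],
        [ l1*c1 + l2*c12 + l3*c123,  l2*c12 + l3*c123,  l3*c123],
    ])

def manipulability(q):
    """Yoshikawa measure sqrt(det(J J^T)); near zero means the arm
    is at or near a singularity."""
    J = jacobian_planar_3r(q)
    return np.sqrt(np.linalg.det(J @ J.T))

# Fully stretched out, all links collinear: the arm can no longer
# move its tip radially, so the Jacobian loses rank.
print(manipulability([0.0, 0.0, 0.0]))       # ~0.0 -> singular
print(manipulability([0.0, np.pi/2, 0.0]))   # ~2.24 -> well-conditioned
```

A motion planner that tracks this scalar along a trajectory can steer clear of the zero set; the paper's contribution is showing that the shape of that zero set falls into one of six families.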
It is AI-free, by design. The researchers explicitly chose not to use machine learning because algebraic models generalize cleanly: a solution derived from pure math works across machines the way a geometry proof works across all triangles. Training a neural network to transfer skills between a Duatic and a KUKA requires either retraining on each new machine or a large dataset of both machines operating together in the same configuration. The taxonomy approach requires only the robot's kinematic parameters: link lengths, joint limits, degrees of freedom. The math does the rest.
The authors, PhD student Sthithpragya Gupta, scientist Durgesh Haribhau Salunkhe, and LASA head Aude Billard, tested the framework on three machines with very different joint constraints. The Duatic DynaArm has tight joint limits, the KUKA iiwa moderate ones, and the Neura Maira M much more relaxed boundaries. When the team moved the robots to new positions and reassigned tasks, the framework adjusted the trajectories without any retraining. The assembly sequence held.
The limitations are real. The framework lacks advanced sensing and cannot distinguish object weight: it knows where an object is, not how heavy it is. A robot running Kinematic Intelligence will move toward a cup the same way whether the cup is empty or full. The authors say mechanically safer robots will make realistic deployment possible in the next five years. That is a research timeline, not a product commitment.
The demonstrated claim is narrower: math beats expensive machine learning for a specific, well-defined robotics problem on this testbed. Whether it holds on a real factory floor with human collaborators, unstructured objects, and the constant small variations that real environments produce is the question the paper explicitly does not answer. The assembly line was controlled. The world is not.
What the work demonstrates is that for problems with well-defined physical constraints, the classical robotics toolbox is not exhausted. The singularity taxonomy is a genuine contribution: an organizing principle for cross-robot skill transfer that did not exist before. The paper appeared in Science Robotics in April 2026. The AI industry has made enormous progress on manipulation and control, but this paper is a reminder that not every hard problem in robotics requires an expensive neural network to solve.