For decades, engineers treated biology as a reference manual: watch a fish swim, extract the principle, build a machine. A paper published April 23 in npj Robotics describes something different: robots becoming the instruments of discovery. Living organisms as the subject under study. Biologists learning from the machine, not the other way around.
The work, from Lei Li and Junzhi Yu of Peking University with nine co-authors, is a synthesis. But its central argument is a genuine inversion. The paper's explicit claim: robotics now generates biological knowledge as much as it borrows from it. Robots are controlled physical experiments, testbeds where researchers isolate variables that living animals will not allow them to touch. The robot becomes the instrument, not just the artifact.
"The robot allows variables to be isolated and tested in ways that are often impossible in living animals," the paper states. That sentence is the hinge. It means biology is no longer just a design template. It is becoming a data source the machine can interrogate.
The 2021 work by Dan Quinn and Qiang Zhong at the University of Virginia already pointed here. Their robotic fish had a tail whose stiffness could be adjusted in real time, much as a fish stiffens its body, and it swam across a wider speed range while using roughly half the energy of a fixed-stiffness equivalent. They were not copying fish more carefully. They were extracting a design rule from biological data.
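The design rule behind that result can be sketched as a simple gain schedule: pick a target tail stiffness as a function of swimming speed instead of fixing it at build time. The quadratic tuning law and every constant below are illustrative assumptions for this sketch, not values reported by either paper.

```python
# Illustrative gain-scheduled stiffness controller for a robotic
# fish tail. The stiffness-grows-with-speed-squared rule and all
# constants are assumptions for illustration only.

def target_stiffness(speed_m_s: float, k0: float = 0.5, c: float = 2.0) -> float:
    """Return a tail stiffness (arbitrary units) that increases
    with the square of swimming speed -- one plausible tuning law."""
    return k0 + c * speed_m_s ** 2

# A fixed-stiffness tail must compromise: one value for all speeds.
# A tunable tail re-targets stiffness as the commanded speed changes.
for speed in (0.2, 0.5, 1.0):
    k = target_stiffness(speed)
    print(f"speed={speed:.1f} m/s -> stiffness={k:.2f}")
```

The point of the sketch is the shape of the rule, not the numbers: stiffness becomes a scheduled control variable rather than a material property chosen once.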
What the Li and Yu paper synthesizes across dozens of such cases: fish muscles do double duty. They actuate motion and they sense the surrounding flow. In vortical flows, muscle activation sometimes occurs after body deformation; the muscle is not commanding the body into position, it is responding to what is already happening in the water. That changes the engineering calculus. A robot whose muscles both move and sense can in principle do more with less: fewer sensors, lighter hardware, tighter integration between actuation and perception. The target both lines of work are aiming at is muscle-as-sensor fusion, a machine where the thing that moves is also the thing that feels.
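The muscle-as-sensor idea can be made concrete with a toy model: a single elastic element that is both commanded to move and read back as a sensor, so the residual between commanded and actual deformation estimates the external flow forcing. The class, its linear dynamics, and all numbers are invented for illustration; no such prototype exists, as the papers themselves note.

```python
# Toy sketch of muscle-as-sensor fusion: one elastic element both
# actuates and, by comparing commanded vs. actual deformation,
# estimates the flow forcing acting on it. All names and the
# linear spring model are assumptions for illustration.

class MuscleSensor:
    def __init__(self, stiffness: float = 1.0):
        self.stiffness = stiffness
        self.deformation = 0.0

    def actuate(self, command: float, flow_force: float) -> float:
        # Actual deformation = commanded motion plus whatever the
        # water imposes on the elastic body (a vortex can lead the
        # muscle, deforming the body before activation).
        self.deformation = command + flow_force / self.stiffness
        return self.deformation

    def sense(self, command: float) -> float:
        # Perception for free: the residual between commanded and
        # measured deformation recovers the external flow forcing,
        # with no dedicated flow sensor in the loop.
        return (self.deformation - command) * self.stiffness

m = MuscleSensor(stiffness=2.0)
m.actuate(command=0.1, flow_force=0.4)
estimate = m.sense(command=0.1)  # recovers the 0.4 flow forcing
```

The design choice this illustrates is the one the paper argues for: perception comes from re-reading the actuator's own state, not from adding hardware.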
The paper distills four biological principles from this body of work: locomotion and fluid-structure interaction, morphology and material architecture, distributed sensing and flow perception, and biological control for adaptive behavior. It cites specific biological architectures: tuna muscle that channels power for high-speed swimming through tendons, paddlefish gill rakers that filter plankton with 95 percent efficiency using gaps of only 50 micrometers. These are not just design templates. They are data points in an ongoing experiment.
MIT's electrofluidic fiber muscle research from earlier this month points in a similar direction: soft, self-contained actuation without external pumps. The muscle-as-sensor fusion both trajectories are chasing has no working prototype yet. The gap between what these papers describe and what works in the field remains significant.
But the trajectory is clear: the next generation of underwater robots will not just look like fish. They will think like them too.