Drone swarms have a communication problem, and the solution may be light.
Researchers at Worcester Polytechnic Institute (WPI), a technical university in Worcester, Massachusetts, and MIT Lincoln Laboratory, a federally funded research and development center operated by MIT for the U.S. Air Force, have built an optical communication system that lets robots talk to each other using flashing LEDs and event cameras: no radio waves, no jamming risk. A preprint posted to arXiv on March 19 describes a system that achieves over 95 percent decoding accuracy while tracking a moving LED transmitter flashing at frequencies of at least 1 kilohertz. The processing is seven times faster than the previous state of the art.
The work is Air Force-funded. Two of the six authors, Brendan Long and Matthew Cleaveland, are researchers at MIT Lincoln Laboratory, which operates under Air Force contracts administered at Hanscom Air Force Base in Massachusetts. The paper lists two contract numbers: FA8702-15-D-0001 and FA8702-25-D-B002. The copyright is held by MIT and delivered to the U.S. Government with Unlimited Rights under DFARS, the standard language for DoD-funded research cleared for public release.
What the Air Force is paying for, in plain terms: drone swarms that can coordinate even when someone is jamming their radios.
The operational urgency is not hypothetical. IEEE Spectrum reported that tens of thousands of jammers have been deployed on Ukrainian front lines, forcing drone manufacturers to scramble for alternatives. KrattWorks, an Estonian drone company, adopted optical neural-net navigation for its Ghost Dragon quadcopter specifically to overcome GPS and RF jamming. The cat-and-mouse game, KrattWorks COO Martin Karmin said, is moving extremely fast. The Associated Press confirmed that both Russia and Ukraine have deployed fiber-optic tethered drones as a workaround — the tether makes jamming irrelevant. But a tethered drone is a solo drone. Tethers kill the dynamics that make swarms worth building. A RAND Corporation commentary from November 2025 framed electromagnetic warfare as NATO's blind spot: control over the invisible battlespace where communications are jammed and drones are blinded can decide the outcome of a conflict.
The WPI/MIT LL system is designed for the untethered case. Instead of radio frequencies, robots use flashing LEDs as transmitters and event cameras as receivers. Event cameras — unlike conventional frame-based cameras — don't capture video at a fixed frame rate. They register individual pixel-level changes as they happen, with microsecond resolution, and they handle high dynamic range environments that would blind a normal camera. That makes them extraordinarily sensitive to a blinking light, even when both the camera and the light source are in motion.
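To make the sensing model concrete, here is a minimal sketch, not the authors' code, of how a receiver could recover an LED's blink frequency from an event stream. It assumes events arrive as (timestamp, x, y, polarity) tuples, that the transmitter's pixel region has already been located, and that each on-transition produces a tight burst of positive-polarity events; every name and threshold below is illustrative.

```python
from dataclasses import dataclass

@dataclass
class Event:
    t_us: int       # microsecond timestamp
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def estimate_blink_hz(events, roi, merge_gap_us=100):
    """Estimate an LED's blink frequency from positive events in a region.

    Each time the LED switches on, its pixels fire a tight burst of
    positive-polarity events; the gap between successive bursts is
    roughly one blink period. merge_gap_us, which separates events
    within a burst from gaps between bursts, is an assumed value.
    """
    x_min, y_min, x_max, y_max = roi
    on_times = sorted(ev.t_us for ev in events
                      if ev.polarity > 0
                      and x_min <= ev.x <= x_max
                      and y_min <= ev.y <= y_max)
    # Gaps longer than merge_gap_us separate one on-burst from the next.
    periods = [b - a for a, b in zip(on_times, on_times[1:])
               if b - a > merge_gap_us]
    if not periods:
        return None
    periods.sort()
    median_period_us = periods[len(periods) // 2]
    return 1e6 / median_period_us
```

For a transmitter flashing at 1 kilohertz, bursts arrive roughly 1,000 microseconds apart, so an estimate like this can settle within a few milliseconds of data. What the sketch ignores is the hard part: keeping the region of interest locked onto a transmitter that is moving.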
The gap the new paper closes is a real one. Prior work by Ziwei Wang and colleagues, posted to arXiv in 2022 and presented at IROS, showed event cameras could receive LED-encoded messages at 4 kilobits per second indoors and 500 bits per second at 100 meters outdoors, but only between a stationary transmitter and a stationary receiver. A 2025 preprint by Hang Su and colleagues, later published in IEEE Robotics and Automation Letters, extended event-camera optical communication to moving receivers watching a stationary screen, reaching up to 114 kilobits per second. Neither handled the moving-transmitter case: a drone sending light signals while flying toward another drone that is also moving.
The WPI/MIT LL system handles it. The hardware is a commercial Prophesee EVK4 event camera receiving LED signals from a transmitter mounted on a moving drone. The key algorithmic contribution is a Geometry-Aware Unscented Kalman Filter, or GA-UKF, that operates on a Riemannian manifold, a curved geometric space that respects the non-Euclidean structure of the covariance matrices used in tracking. In practical terms: it processes events in batches rather than one at a time, uses that geometry to run a more accurate tracking filter, and does so fast enough for real-time decoding. The result is processing seven times faster than the previous baseline, which was built on an extended Kalman filter (EKF), at equivalent tracking accuracy.
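The paper's manifold machinery is not something to reconstruct from a summary, but the batch-per-update structure is easy to illustrate. Below is a minimal Euclidean stand-in, not the GA-UKF, using the open-source filterpy library: each batch of events is collapsed into a single centroid measurement, and the filter runs one predict/update cycle per batch. The state layout, batch interval, noise values, and centroid measurement model are all illustrative assumptions.

```python
import numpy as np
from filterpy.kalman import MerweScaledSigmaPoints, UnscentedKalmanFilter

BATCH_DT = 0.001  # assumed: one event batch per millisecond

def fx(state, dt):
    """Constant-velocity motion model for the LED on the image plane."""
    x, y, vx, vy = state
    return np.array([x + vx * dt, y + vy * dt, vx, vy])

def hx(state):
    """Measurement model: we observe only the LED's pixel position."""
    return state[:2]

points = MerweScaledSigmaPoints(n=4, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=BATCH_DT,
                            hx=hx, fx=fx, points=points)
ukf.x = np.array([640.0, 360.0, 0.0, 0.0])  # initial guess: image center
ukf.P *= 100.0                               # broad initial uncertainty
ukf.R = np.diag([2.0, 2.0])                  # assumed pixel-level measurement noise
ukf.Q = np.eye(4) * 0.1                      # assumed process-noise tuning

def track_batch(batch_pixels):
    """One predict/update cycle per batch of event pixel coordinates."""
    ukf.predict()
    if batch_pixels:
        # Collapse the batch into one centroid measurement.
        z = np.asarray(batch_pixels, dtype=float).mean(axis=0)
        ukf.update(z)
    return ukf.x[:2]  # current estimate of the LED's pixel position
```

The batching is one reason such a filter can keep up with an event camera at all: one predict/update cycle is amortized over every event in the batch rather than paid per event.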
A 2025 survey of event-based optical marker systems by Maxime Robic and colleagues, submitted to IEEE Robotics and Automation Magazine, named drone-to-drone communication as one of the primary target applications for the field — confirming this is an active research direction, not one laboratory's niche.
The institutional thread connecting WPI to MIT Lincoln Lab runs through the paper's lead academic, Kevin Leahy, assistant professor of robotics engineering at WPI. Leahy spent 2017 to 2023 at MIT Lincoln Lab's AI Technology Group, where his work focused on multi-agent autonomous systems — decentralized planning, collision avoidance, coordination for heterogeneous robot teams. His WPI faculty profile shows he arrived at WPI as someone who already knew what problems Lincoln Lab was trying to solve. Long and Cleaveland didn't wander into this collaboration. They came through Leahy.
The event camera hardware is worth putting in context. Gregor Lenz, former CTO of Inivation and an industry observer of the event camera market, wrote in 2025 that the market was worth roughly $220 million and still searching for its killer app. Defense and autonomy — specifically the kind of low-latency, low-power sensing needed onboard a drone executing a mission without a radio link — was his candidate. A communication system that requires only an LED and a commercially available event camera is a realistic deployment path in a way that exotic custom sensors are not.
Caveats, because they matter here: this is a preprint submitted to the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2026 and is currently under review. The results are from a lab demonstration, not a fielded system. The six authors (Harmeet Dhillon, Pranay Katyal, Brendan Long, Rohan Walia, Matthew Cleaveland, and Kevin Leahy) report over 95 percent decoding accuracy during motion, which is strong for a research prototype. Range limitations, performance in bright outdoor conditions, multi-drone scalability, and the rate at which messages can actually be transmitted are open questions the paper does not fully address.
What it does establish is that the fundamental challenge — tracking a light source on a moving transmitter and decoding its signal in real time — is solvable with commercially available hardware and a well-designed filter. The Air Force is paying to find out how far that goes.