Cheap VR Headsets Are Turning Factory Robots Into Remote-Operable Tools
[Image generated with GPT Image 1.5]
On a factory floor somewhere, a person in a Meta Quest headset is controlling a robot arm that is mounting components on a production line thirty miles away. The arm is a Mitsubishi Electric MELFA, a workhorse industrial robot that has been loading and unloading machines in factories for years. The person running it has been trained on the headset for a few hours, not on the robot itself. That is the shift that Mitsubishi Electric Automation Systems UK and a London-based robotics company called Extend Robotics are betting on: turning industrial robot arms into remotely operable tools that any trained worker can manage from a consumer VR headset, anywhere in the world.
In April 2024, Mitsubishi Electric Automation Systems UK, a subsidiary of the Japanese industrial robot manufacturer, announced a partnership with Extend Robotics to deploy the company's Advanced Mechanics Assistance System, or AMAS, on its MELFA range of industrial arms, according to Robotics and Automation Magazine. AMAS is middleware that connects off-the-shelf consumer VR hardware — Meta Quest, HTC Vive — to robot controllers via the cloud. The robot workspace renders in 3D inside the headset, giving the remote operator genuine depth perception. Gesture input controls the arm. Latency is low enough for real-time error recovery. The setup cost and training time are a fraction of what conventional robot reprogramming requires.
Barry Weller, product manager for mechatronics at Mitsubishi Electric Automation Systems UK, said the partnership was aimed squarely at manufacturers struggling with a skilled worker shortage that shows no sign of easing. One remote operator, the companies say, can oversee multiple production lines or sites from a single location.
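The control pattern the companies describe — a streamed hand pose in, a safe arm target out — can be sketched in a few lines. Everything below is hypothetical (AMAS's actual interfaces are not public); it only illustrates why gesture teleoperation needs scaling and workspace clamping between the headset and the controller.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # metres, expressed in the robot's base frame
    y: float
    z: float

# Hypothetical reachable workspace of the arm, per axis (min, max).
WORKSPACE = {"x": (-0.6, 0.6), "y": (-0.6, 0.6), "z": (0.05, 0.9)}
SCALE = 0.5  # map large hand motions to smaller, safer arm motions

def controller_to_target(hand: Pose, origin: Pose) -> Pose:
    """Scale the operator's hand offset and clamp it to the reachable
    workspace, so a network glitch or an exaggerated gesture can never
    command an out-of-range pose."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))
    return Pose(
        clamp(origin.x + SCALE * hand.x, *WORKSPACE["x"]),
        clamp(origin.y + SCALE * hand.y, *WORKSPACE["y"]),
        clamp(origin.z + SCALE * hand.z, *WORKSPACE["z"]),
    )

# Example: an exaggerated downward reach gets clamped at the table plane.
target = controller_to_target(Pose(0.2, 0.0, -2.0), Pose(0.0, 0.0, 0.4))
print(target)  # z is clamped to the workspace floor, 0.05
```

A real system would work in six degrees of freedom and feed the target to an inverse-kinematics solver, but the scale-then-clamp step is the part that makes consumer-grade gesture input safe to couple to an industrial arm.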
That is the factory floor version of the same insight driving AR smart glasses into warehouses: give workers better eyes into automated systems, and you can deploy robotics faster without waiting to hire specialists. DHL has had Vision Picking running globally since 2019, with 15 to 25 percent productivity gains and 99.9 percent picking accuracy, according to AI Multiple and reporting by Supply Chain Dive. Coca-Cola reported six to eight percent improvement in picking operations with comparable accuracy after deploying AR guidance. Airbus uses Microsoft HoloLens 2 for assembly guidance on the floor. Taqtile's Manifest platform handles quality inspections. Vuforia supports maintenance and repair. XR device shipments grew 40 percent year over year in 2025, per AI Multiple, which also projects the broader XR market growing from $253 billion in 2025 to $1.6 trillion by 2032.
The use cases Extend Robotics has documented go beyond conventional picking. Leyland Trucks, a legacy British manufacturer, ran an initial feasibility study using AMAS for hazardous truck painting and high-voltage component insertion — tasks that require judgment in unstructured conditions and have historically resisted full automation. Leyland is weighing further work on the pilot, according to Extend Robotics. Airbus is exploring space operations. AtkinsRéalis, the Canadian engineering group, is evaluating nuclear waste handling applications, also per Extend Robotics. A project with Saffron Grange vineyard in the UK is testing agricultural robots. None of these tasks yield to conventional preprogrammed automation. They require a human in the loop who can see what the robot sees and react.
Chang Liu, founder and chief executive of Extend Robotics, frames the opportunity not as a productivity play but as a data problem. Language models trained on the entire internet have vast quantities of text and images to learn from. A robot arm doing a picking task generates almost no comparable dataset — this is the core bottleneck that the embodied AI field has not solved. AMAS addresses it almost as a by-product: every teleoperation session captures synchronized sensor, video, and motion data that feeds back into machine learning pipelines. "Suddenly your teleoperated robots become AI-driven autonomous robots," Liu said, "and you are still in control." The company showed a trial in which AMAS trained a robot to autonomously complete a complex industrial plugging task by combining teleoperation data with Nvidia-powered digital twin simulations. The long-term vision is a self-improving loop: start with teleoperation, add edge cases from deployed robots, fine-tune the model, and move toward autonomy while keeping a human in the loop. It is an incremental path to useful robot autonomy that does not require solving general intelligence first.
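The data flywheel Liu describes reduces to a simple shape: log every teleoperation session as synchronized (observation, action) pairs, keep the successful demonstrations, and hand them to an imitation-learning pipeline. The sketch below uses illustrative field names, not AMAS's actual schema.

```python
# Sketch of a teleoperation session logger feeding a training set.
# All names and fields here are hypothetical, for illustration only.

class SessionRecorder:
    def __init__(self):
        self.frames = []

    def record(self, t_ms, joint_angles, camera_frame_id, operator_cmd):
        # One synchronized sample: where the robot was, what it saw,
        # and what the human operator commanded at that instant.
        self.frames.append({
            "t_ms": t_ms,
            "joints": joint_angles,
            "camera": camera_frame_id,
            "action": operator_cmd,
        })

    def export(self, success: bool):
        return {"success": success, "frames": self.frames}

def build_training_set(sessions):
    """Flatten successful sessions into (observation, action) pairs —
    failed demonstrations are excluded rather than imitated."""
    pairs = []
    for s in sessions:
        if not s["success"]:
            continue
        for f in s["frames"]:
            obs = {"joints": f["joints"], "camera": f["camera"]}
            pairs.append((obs, f["action"]))
    return pairs

rec = SessionRecorder()
rec.record(0, [0.0, 1.2, -0.4], "frame_000", "move_to_plug")
rec.record(33, [0.1, 1.1, -0.4], "frame_001", "insert")
dataset = build_training_set([rec.export(success=True)])
print(len(dataset))  # 2
```

The point is that the dataset is produced as a side effect of doing the day job — every shift of remote operation becomes labeled demonstrations, which is exactly the resource the embodied AI field is short of.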
Azmat Hossain, business development director at Extend Robotics, put it plainly: there are still a lot of jobs that do not make sense for people to be doing. Hazardous environments, repetitive strain, tasks in nuclear facilities. Robots are not coming for these jobs — they are stuck on the loading dock because no one has figured out how to train them on edge cases at scale. XR teleoperation, at least for now, is the workaround.
The EU Horizon MASTER research project, which the original EE Times coverage focused on, explores parallel themes — dynamic safety zones, gaze-based robot interaction, codeless programming for non-expert operators — through a consortium including researchers from Tekniker, the University of Patras, and the German Research Center for Artificial Intelligence (DFKI). The VIROO platform developed under the project offers a commercial XR training environment. An XR Today report on the project noted the collaborative approach between industry and academia. But MASTER is a research programme. The Mitsubishi Electric partnership is a commercial product available today, and the Leyland Trucks feasibility study represents the kind of early-stage pilot the industry is now running. One operator in a Meta Quest headset can now oversee a production line that previously required a robotics specialist on site. That is the gap between the lab and the floor — and for now, the floor is winning.

