You may disagree or think there’s still a long way to go, but it’s hard to ignore the direction in which society is moving: one where humanoid robots will be part of our daily lives. When the technology matures and cost is no longer an obstacle, the decisions surrounding robots will no longer be technical ones. How far will robots be allowed to go, and who will set the limits?
Robots could serve as soldiers, deployed in conflicts either independently or alongside human troops. They could act as emotional companions, not just for the elderly, but for anyone living alone. They could handle tasks like cleaning, cooking, and organizing. In factories, they could work tirelessly and consistently, taking over tasks we used to do ourselves.
The amazing thing is that many of these scenarios are already starting to come to life. For instance, the Figure 02 robot has been working autonomously in a BMW factory since last year. It can sort objects accurately thanks to what its developers call “touch” and “short-term memory.”
Back in 2018, the Detroit: Become Human video game seemed like pure science fiction. Not even its creator, David Cage, would have predicted that by 2025 we’d witness something similar in the real world. Yet here we are. The most fascinating part is that Figure 02 doesn’t operate in isolation. Behind its actions is a neural network that is the real driving force.
Helix: The Brains Behind Figure 02
Helix is the neural network that serves as the robot’s brain, enabling its movements to appear increasingly natural. Helix has made significant advancements in just three months of operation in a logistics environment. It has learned to manipulate objects with dexterity. It also understands context, adapts to new situations, and seemingly remembers past actions. Remarkably, Helix continues to improve its performance, becoming faster and making fewer mistakes.
Initially, Helix worked with a simpler catalog of items, but over time, it has expanded to handle a wider variety of packages. These range from rigid boxes to soft envelopes and deformable plastic bags, which present a greater challenge for gripping and positioning. Some packages crumple, some bend, and others slide easily, making tasks more complex for a robot.
However, Helix adapts seamlessly. It adjusts its grip for each object and alters its strategy for flatter or softer packages. It even knows when to apply a slight twist or use a specific type of gripper. All of these adjustments happen on the fly, without any prior guidance about the package in front of it.
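Figure hasn’t published how Helix picks a grip, but the behavior described above can be sketched as a simple strategy dispatch. The package categories, strategy names, and force values below are all illustrative assumptions, not Figure AI’s API.

```python
# Hypothetical sketch of on-the-fly grip selection for different package types.
# Categories, strategy names, and force values are made up for illustration.

def choose_grip(package):
    """Pick a grip strategy from simple package properties."""
    if package["deformable"]:
        # Soft bags slide and crumple: use a wider, gentler pinch.
        return {"strategy": "wide_pinch", "force": 0.3, "twist": False}
    if package["flat"]:
        # Flat envelopes: slide under the edge and apply a slight twist.
        return {"strategy": "edge_slide", "force": 0.5, "twist": True}
    # Rigid boxes tolerate a firm top grasp.
    return {"strategy": "top_grasp", "force": 0.8, "twist": False}

box = {"deformable": False, "flat": False}
envelope = {"deformable": False, "flat": True}
bag = {"deformable": True, "flat": False}

print(choose_grip(box)["strategy"])    # top_grasp
print(choose_grip(envelope)["twist"])  # True
print(choose_grip(bag)["force"])       # 0.3
```

In practice, a learned policy would map raw perception to these decisions continuously rather than through hand-written rules, but the rules make the decision space concrete.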
This adaptive behavior hasn’t been programmed line by line. Instead, Helix has learned through observation. It underwent 60 hours of training based on human demonstrations. With each new example, the robot gained a better understanding of how to handle different scenarios. As a result, the average time taken to process a package has decreased significantly.
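Learning from human demonstrations is broadly known as imitation learning, or behavior cloning in its simplest form. Helix’s actual training pipeline isn’t public; the toy below shows the core idea with a one-nearest-neighbor policy over hypothetical (observation, action) pairs. All names and numbers are illustrative.

```python
import math

# Toy behavior cloning: imitate the demonstrated action whose observation
# is closest to the current one (1-nearest-neighbor over demo pairs).
# Observations and actions here are invented for illustration.

demos = [
    # (observation: [width_m, stiffness], action: grip force)
    ([0.10, 0.9], 0.8),   # small rigid box -> firm grip
    ([0.30, 0.2], 0.3),   # wide soft bag  -> gentle grip
    ([0.20, 0.5], 0.5),   # envelope       -> medium grip
]

def policy(obs):
    """Return the action from the most similar demonstration."""
    _, action = min(((math.dist(obs, o), a) for o, a in demos),
                    key=lambda pair: pair[0])
    return action

print(policy([0.11, 0.85]))  # 0.8 (closest to the rigid-box demo)
```

A real system would train a neural network on thousands of such pairs so it can generalize between demonstrations, which is why each new example improves its handling of unseen packages.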
Helix isn’t only fast. It also boasts impressive accuracy. Labels that were once misaligned are now positioned correctly 94.4% of the time. How has it achieved this? Helix has learned to smooth out wrinkled envelopes before scanning. A light pressure on the plastic is often enough to make the barcode visible. This minimal gesture reflects the robot’s evolving capabilities.

Learning relies on a fundamental aspect: memory. Helix has a vision system that analyzes what it sees in the moment, while also retaining information about what it observed a few seconds earlier. It’s like having a short-term visual memory, which enables the robot to make smarter decisions. For instance, if it notices that a package has already been turned at a specific angle, it won’t turn it again. If it fails to locate a label at first, it remembers where it might be and revisits that area.
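The skip-redundant-work behavior described above can be approximated with a small rolling buffer of recent events. This is a minimal sketch, assuming a simple event-tuple representation; Figure hasn’t disclosed how Helix actually encodes its visual memory.

```python
from collections import deque

# Sketch of short-term visual memory: remember the last few observed events
# so the robot can skip actions it has already performed.

class ShortTermMemory:
    def __init__(self, horizon=5):
        # Only the most recent events are kept, like a few seconds of memory.
        self.events = deque(maxlen=horizon)

    def record(self, event):
        self.events.append(event)

    def already_done(self, event):
        return event in self.events

memory = ShortTermMemory()
memory.record(("rotated", "pkg_17", 90))

# The package was already turned 90 degrees: don't turn it again.
if memory.already_done(("rotated", "pkg_17", 90)):
    print("skip rotation")  # skip rotation
```

The `maxlen` bound matters: old events fall out automatically, which is what makes this *short-term* memory rather than a growing log.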
Another significant improvement is the ability to incorporate the history of its movements. Previously, each action was an isolated event: see, act, see again. Now, Helix remembers the positions of its arms, torso, and head from moments ago, allowing for smoother movement. If something goes awry, such as a package slipping or a grip not being secure, it can make real-time corrections without having to start over.
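One common way to turn isolated "see, act, see again" steps into fluid motion is to blend each new command with recent ones. The sketch below uses exponential smoothing over a short history of joint positions; the class name, smoothing factor, and two-joint example are assumptions, not Helix’s controller.

```python
from collections import deque

# Sketch: keep a short history of commanded joint positions and emit an
# exponentially smoothed target, so consecutive actions flow into each other.

class SmoothedController:
    def __init__(self, alpha=0.5, history=10):
        self.alpha = alpha                 # weight given to the new target
        self.past = deque(maxlen=history)  # recent commanded positions

    def command(self, target):
        if self.past:
            prev = self.past[-1]
            # Blend the new target with the last command.
            target = [self.alpha * t + (1 - self.alpha) * p
                      for t, p in zip(target, prev)]
        self.past.append(target)
        return target

ctrl = SmoothedController(alpha=0.5)
print(ctrl.command([1.0, 0.0]))  # first command passes through: [1.0, 0.0]
print(ctrl.command([0.0, 1.0]))  # pulled toward history: [0.5, 0.5]
```

Keeping the history around also gives the controller something to fall back on for mid-motion corrections: if a grip slips, it can re-plan from its last known pose instead of restarting the whole sequence.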
Additionally, it has developed a sense of touch. It’s not the human kind, but a way to detect how much pressure it’s applying to an object. This force feedback allows Helix to pause a movement if it senses resistance or to adjust its actions if the object is lighter than expected. As a result, it can handle items more carefully and adapt to variations in weight or stiffness.
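The force-feedback behavior can be reduced to two checks: stop when resistance exceeds a limit, and ease the grip when the object is lighter than expected. The thresholds and expected weight below are invented for illustration; Helix’s real control loop would run these checks continuously at high frequency.

```python
# Sketch of force-feedback gating. All thresholds are made-up examples.

EXPECTED_WEIGHT = 0.5   # kg, assumed estimate from vision
MAX_RESISTANCE = 2.0    # N, pause if pushing against something this hard

def adjust(measured_force, measured_weight, grip=0.8):
    if measured_force > MAX_RESISTANCE:
        return "pause"                  # unexpected resistance: stop moving
    if measured_weight < EXPECTED_WEIGHT / 2:
        return round(grip * 0.6, 2)     # lighter than expected: ease off
    return grip                         # otherwise keep the planned grip

print(adjust(3.0, 0.5))   # pause
print(adjust(0.5, 0.1))   # 0.48
print(adjust(0.5, 0.5))   # 0.8
```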
This isn’t just theory. Figure 02 has been tested in real-life conditions. During a one-hour session, the robot operated uninterrupted in a logistics environment, sorting packages autonomously while utilizing its new capabilities. It has a sense of touch, short-term visual memory, and the ability to correct mistakes on the fly. The outcome demonstrated fluid and precise work, requiring no direct intervention.
The footage from that session, included in the video above, showcases more than just a working robot. It highlights a neural network successfully navigating the chaos of the physical world. It illustrates Figure 02 making decisions, adapting, and acting with a level of autonomy that, until recently, seemed like science fiction.
Tesla has Optimus, Boston Dynamics has Atlas, and Agility Robotics has Digit. All are competing to lead the humanoid robot race. Meanwhile, Figure 02 is progressing steadily, operating without fanfare but achieving tangible results. Time will tell how all of these advancements evolve.
Image | Figure AI
Related | Forget Humanoid Robots Working in Factories. In South Korea, They Do Parkour