
How to Train Your Robot

MIT and Korea, backed by $7 million in Korean government funding, will co-develop training techniques for industrial robots via HPT, which allows robots to adapt to new tasks and environments without extensive retraining.

Called heterogeneous pre-trained transformers (HPT), the technique combines diverse data from multiple domains and modalities, allowing robots to learn new tasks without needing to start training from scratch.

“Our dream is to have a universal robot brain that you could download and use for your robot without any training at all.” — Lirui Wang, MIT co-author

MIT & Korea Partner
By far the global leader in robot density at over 1,000 robots per 10,000 workers, Korea knows full well that all of those robots work with a severe limitation: they are all “dumb” robots. Lack of intelligence will soon be a critical handicap for any nation’s robots as “physical AI” begins its much-heralded convergence with robotics.

Korea, arguably East Asia’s leader in applied artificial intelligence, has correctly foreseen the potential of “physical AI” in robots and is moving fast to capture as much of this newly emergent technology as possible.

Robotis, a prominent Korean robot developer, and MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), backed by $7 million in grant money, were selected for an international joint research project to develop physical AI, led by the Korea Institute for Advancement of Technology (KIAT) under the Korean Ministry of Trade, Industry, and Energy.

Kim Byung-soo, CEO of Robotis, expressed his enthusiasm for the partnership, stating, “As ‘Physical AI’ is expected to become the biggest topic in the robotics industry, we will focus on producing results by seizing this great opportunity.”

See related:
Korea’s Plan for AI/ML Dominance…Brilliant!
Korea looks to ramp up artificial intelligence and converge with recent successes in robotics (6x growth from 2009 to 2016).

Specifically, the partners will research the aspect of physical AI they call “achieving human-level manipulation capabilities.” In essence, that means using physical AI to engineer a “smart” cobot arm and hand (end-effector), something very much needed but so far elusive to achieve.

Plug & Play to the Max!
Imagine a near future of manufacturing or logistics where a smart cobot arm and gripper come off the assembly line fully pre-trained on the exact job(s) they will perform. Imagine a cobot/gripper duo that listens when spoken to and adjusts, performs QC and remembers, and reports on its performance.

AI is the difference maker. As an example of what AI brings to manufacturing, take smartphone manufacturer Xiaomi switching on its fully autonomous dark factory of robot-only workers, capable of producing 10 million handsets a year without human intervention.

“This model will become increasingly common as manufacturers chase improved efficiency, sustainability and reduced waste. While ‘lights out’ factories have been around for a while, Xiaomi’s factory is the first that is able to learn how to operate more efficiently and optimize its own processes thanks to its AI-powered ‘brain’!”

See related:
Demise of Dumb: Why Make Industrial Robots Smart?
A new generation of robots is on the rise, and their calling card reads: smart!

“In my view,” notes Wang, the study’s lead author, “another big problem is that the data come from so many different domains, modalities, and robot hardware. Our work shows how you’d be able to train a robot with all of them put together.”

The main advantage of this technique is its ability to integrate data from different sources into a unified system. This approach is similar to how large language models are trained, showing proficiency across many tasks thanks to their extensive and varied training data. HPT enables robots to learn from a wide range of experiences and environments.
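To make that concrete, here is a minimal PyTorch sketch of the HPT idea as described above: per-embodiment “stems” project heterogeneous inputs (vision, proprioception) into a shared token space, one transformer trunk is pre-trained across all of them, and small per-robot heads decode actions. The class names, dimensions, and pooling choice are illustrative assumptions, not the team’s released code.

```python
# Sketch of the HPT idea: robot-specific stems in, shared trunk, small heads out.
# All names and sizes here are illustrative assumptions.
import torch
import torch.nn as nn

D_MODEL = 256  # shared token width (assumed)

class ProprioStem(nn.Module):
    """Projects a robot-specific joint-state vector into shared tokens."""
    def __init__(self, joint_dim: int, n_tokens: int = 4):
        super().__init__()
        self.proj = nn.Linear(joint_dim, n_tokens * D_MODEL)
        self.n_tokens = n_tokens

    def forward(self, joints):
        return self.proj(joints).view(-1, self.n_tokens, D_MODEL)

class VisionStem(nn.Module):
    """Projects per-patch image features (e.g., from a frozen encoder) into tokens."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.proj = nn.Linear(feat_dim, D_MODEL)

    def forward(self, feats):
        return self.proj(feats)  # (batch, n_patches, D_MODEL)

class HPTPolicy(nn.Module):
    """One shared trunk; stems and heads get swapped per robot embodiment."""
    def __init__(self, joint_dim: int, feat_dim: int, action_dim: int):
        super().__init__()
        self.proprio = ProprioStem(joint_dim)
        self.vision = VisionStem(feat_dim)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=8, batch_first=True)
        self.trunk = nn.TransformerEncoder(layer, num_layers=6)
        self.head = nn.Linear(D_MODEL, action_dim)

    def forward(self, joints, img_feats):
        tokens = torch.cat([self.proprio(joints), self.vision(img_feats)], dim=1)
        latent = self.trunk(tokens).mean(dim=1)  # pool the shared representation
        return self.head(latent)  # action prediction for this robot

# Example: a 7-DoF arm, 196 image patches of 512-d features, 8-d action space.
policy = HPTPolicy(joint_dim=7, feat_dim=512, action_dim=8)
action = policy(torch.randn(2, 7), torch.randn(2, 196, 512))  # shape (2, 8)
```

The design point is that only the stems and heads are robot-specific; the trunk, like an LLM’s backbone, is the piece that gets reused across embodiments and data sources.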

Robotis plans to build physical AI capabilities into its cobot arm and gripper, the OpenMANIPULATOR-Y (OM-Y). Currently, the OM-Y performs tasks with a gripper-type end-effector, but with human-level manipulation capabilities it is expected to operate more efficiently in a wider variety of industrial environments.

“Developing physical AI requires cutting-edge precision components, including tactile sensors that mimic the touch sensitivity of human hands and ultra-small, ultra-precise actuators equipped with high back-drivability and torque density.”

Robotis also plans to apply its actuator technology in collaboration with MIT researchers to develop technologies and components related to physical AI.

Physical AI is all about systems on a robot that learn about an environment directly from sensor data. The sensors span cameras, microphones, temperature gauges, inertial sensors, radar, and LiDAR; combined with actuators, they make the robot a physical agent able to solve problems that involve direct interaction with the physical world.
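As a schematic of that sense-think-act definition, the sketch below bundles heterogeneous sensor readings into one observation and hands it to a learned policy at a fixed rate. The sensor names, array shapes, and the policy and actuator interfaces are placeholders, not a real driver stack.

```python
# Schematic sense-think-act loop; every name and shape is a placeholder.
import time
import numpy as np

def read_sensors() -> dict:
    """Stand-in for real drivers; each key is a different modality.
    Dummy arrays here; on a robot these would be live readings."""
    return {
        "camera_rgb": np.zeros((480, 640, 3), dtype=np.uint8),  # vision
        "audio_frame": np.zeros(1024),   # microphone, e.g. spoken commands
        "imu": np.zeros(6),              # inertial: 3-axis accel + gyro
        "lidar_scan": np.zeros(360),     # one range return per degree
        "joint_state": np.zeros(7),      # proprioception: joint angles
    }

def control_loop(policy, send_to_actuators, hz: float = 20.0, steps: int = 100):
    """Sense-think-act at a fixed rate: all modalities in, motor commands out."""
    period = 1.0 / hz
    for _ in range(steps):
        obs = read_sensors()          # sense: gather heterogeneous data
        action = policy(obs)          # think: learned model picks a motion
        send_to_actuators(action)     # act: drive the arm and gripper
        time.sleep(period)

# Trivial stand-ins for the learned policy and the motor driver:
control_loop(policy=lambda obs: obs["joint_state"] * 0.0,
             send_to_actuators=lambda action: None, steps=5)
```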

Our elusive sixth sense…in a machine
Maybe the toughest challenge for robot training and HPT is proprioception, better known as our sixth sense. “Sight, hearing, smell, taste, touch: We’re all familiar with the five senses that allow us to experience our surroundings,” says Dr. Niccolò Zampieri, head of the Development and Function of Neural Circuits Lab at the Max Delbrück Center in Berlin.

“Equally important but much less well known is the sixth sense: Its job is to collect information from the muscles and joints about our movements, our posture and our position in space, and then pass that on to our central nervous system. This sense, known as proprioception, is what allows the central nervous system to send the right signals through motor neurons to muscles so that we can perform a specific movement.

“People [as well as robots] without proprioception can’t actually perform coordinated movements.” Think of putting a cup to your lips in the dark.
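In robot terms, proprioception is what joint encoders and torque sensors provide: the machine’s own state, with no cameras involved. Below is a minimal sketch of that feedback loop, with a toy one-joint simulation standing in for real encoder and motor interfaces (not any vendor’s API).

```python
# Proprioceptive feedback loop: joint encoders in, corrective torques out.
# Gains and the encoder/motor interfaces are illustrative assumptions.
import numpy as np

KP, KD = 40.0, 2.0  # proportional/derivative gains (illustrative values)

def pd_step(q, dq, q_target):
    """Torques computed from proprioceptive state alone: no vision needed."""
    return KP * (q_target - q) - KD * dq

def move_to(read_encoders, apply_torque, q_target, tol=1e-3, max_steps=5000):
    """Close the loop: sense joint state, correct, repeat until on target."""
    for _ in range(max_steps):
        q, dq = read_encoders()                 # the robot's "sixth sense"
        if np.max(np.abs(q_target - q)) < tol:
            return True                         # posture reached
        apply_torque(pd_step(q, dq, q_target))
    return False

# Toy single-joint "plant" so the loop runs end to end:
state = {"q": np.zeros(1), "dq": np.zeros(1)}
DT = 0.01  # integration step, seconds

def read_encoders():
    return state["q"], state["dq"]

def apply_torque(tau):
    state["dq"] = state["dq"] + tau * DT        # unit inertia, Euler step
    state["q"] = state["q"] + state["dq"] * DT

print(move_to(read_encoders, apply_torque, q_target=np.array([0.5])))  # True
```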

Lirui Wang and his crew at MIT are well aware that their program needs to address some aspect of proprioception in its HPT training techniques. Until it does, bipedal humanoid robots will look gawky and stiff while walking or attempting any coordinated movement. The same goes for cobot arms tending a CNC machine.

There’s certainly a ways to go for physical AI and robotics to totally converge, but the journey is afoot.