
Cheap Cobots That Mimic Humans

Cobot To-Do List: Cheaper to Buy, Easier to Use

The $6,500 cobot and cobots “mimicking humans” have arrived

First, the cheap part
“Is there a cheaper cobot? Can they be made easier to use?” Those all-too-familiar refrains from potential customers are still being heard on the cobot circuit.

Crack those two wish-list favorites and all hell might break loose in cobot world.

At an integrator-sponsored cobot demo held at Castle Island Brewery (great choice of venue), I witnessed, once again, the two refrains rising from five different groups of demo participants. Amazingly, without any influence from the other groups, each one inquired about price and ease of use. …Where have I heard that before? Everywhere!

Each participant did what humans do instinctively: reached out, grabbed the pieces to be assembled, and assembled them. Undoubtedly, the thought that struck each of them was: why can’t this robot watch me and then mimic how I assembled the pieces?

As much as cobot manufacturers try to mitigate customer pushback with cobots that cost far less than standard industrial robots, and as hard as they work to make cobot setup and operation as simple and painless as possible, the same two questions still haunt demos.

The answers, of course, are cheaper cobots, and cobots that can watch humans and then mimic human arm and hand movements.

The cobot manufacturer that can serve up solutions to those two needs will have more cobot orders than they can possibly accommodate.

For cobot sales going forward, Markets & Markets projects “50 percent growth a year, from just $710 million in sales in 2018 to $12.3 billion in 2025.”
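
As a quick back-of-the-envelope check (a minimal sketch; the dollar figures come from the quote, and treating the growth as a flat 50 percent annual rate over 2018–2025 is an assumption), the numbers do compound roughly as claimed:

```python
# Sanity check on the quoted projection: $710M in 2018 compounding at ~50% per
# year should land near the quoted $12.3B by 2025. Figures are from the quote;
# a flat 50% annual rate is assumed for the arithmetic.
start_millions = 710
annual_growth = 1.50
years = 2025 - 2018                      # seven compounding periods

projected_billions = start_millions * annual_growth ** years / 1000
print(f"Projected 2025 sales: ${projected_billions:.1f}B")   # ≈ $12.1B, close to $12.3B
```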

Price aversion gets a smile
British-based start-up Automata (founded in 2015) has just released a six-axis cobot arm named Eva for $6,500 (£5k).

Amazing: three Eva cobots are about $24k cheaper than a single UR-10.

To put that bargain-basement price in perspective: cobots, according to Robotiq’s blog, go for $35k on average. A brand-new, still-in-the-box Universal Robots UR-10 sells for $44k on eBay. A used teach pendant for the UR-10 runs $3k on eBay. Upkeep for the UR-10 could run as much as the purchase price.
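
Putting those figures side by side (a rough sketch; every price is the one quoted above, and the “upkeep roughly equals purchase price” estimate is folded in as a single lifetime-cost line):

```python
# Rough cost comparison using only the figures quoted above. Doubling the UR-10
# price to approximate lifetime cost follows the article's upkeep estimate and
# is not a vendor figure.
eva_price = 6_500          # Automata Eva, per the article
ur10_price = 44_000        # new UR-10, eBay listing cited above
ur10_pendant = 3_000       # used teach pendant, eBay listing cited above

three_evas = 3 * eva_price
print(f"Three Evas: ${three_evas:,}")                                   # $19,500
print(f"Savings vs one UR-10: ${ur10_price - three_evas:,}")            # $24,500
print(f"UR-10 + pendant + upkeep ≈ ${ur10_price + ur10_pendant + ur10_price:,}")  # $91,000
```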

The prospect of outfitting production with a few high-end cobots can be downright withering for an SME. So says Alex Craig, owner of Qualitetch, a 38-employee team with $4 million in revenues. Craig says that Eva’s arrival was like Christmas morning: “I had been looking forward to this for more than 12 months.” It was his first opportunity to introduce true automation at the factory.

An SME that truly wanted to introduce automation but couldn’t. Now, that’s a market waiting to happen.

How many SMEs like Craig are there worldwide? Hundreds of thousands, easy! How many would be overjoyed to get their hands on a few Evas? Plenty!

“On a single-shift operation as we are,” says Craig, “[expensive cobots are] not practical, it’s not economic. But we can shell out five grand (£5k) per unit. It’s a complete no brainer.”

The setup and programming of Eva is simple enough that Qualitetch even made an “unboxing” video of the experience. “The actual arrival to functional use was about two hours,” Craig said. “The actual programming was ten minutes.”

Automata co-founders Mostafa ElSayed and Suryansh Chandra say of their sleek-looking cobot arm: “[We] started Automata to democratize robotics and to ultimately allow anyone to seamlessly use a robot. We are extremely proud to offer Eva at the price point we do. People can visit the Automata website and buy a piece of industrial quality equipment on their credit card — it doesn’t get much more accessible than that.”

Automata had previously raised around $2 million in initial funding, plus a recent $7.4 million Series A round from Belgium-based Hummingbird Ventures, with participation from Firstminute Capital, Hardware Club, LocalGlobe, ABB, and Entrepreneur First.

You be the judge. See video below:

Second, the mimicking part
Still in semi-research mode, but closer to the marketplace than you may think, is the ability of a robot to mimic human movements. Here are two novel approaches:

The Nvidia way: Synthetically trained neural networks

“A team of researchers from Nvidia has developed a new deep learning technique that allows robots to be taught to mimic human actions just by observing how people perform certain tasks. This technology can greatly reduce the amount of time it takes to program robots to perform their desired workloads.

“One of the most impressive parts of Nvidia’s research is that a human only needs to perform a task one time for the robot to learn how to repeat the actions. A video camera streams a live feed to a pair of neural networks that handle object recognition.”
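
As a rough illustration of that first stage only (not Nvidia’s actual system), here is a minimal sketch of a camera frame being passed through an off-the-shelf object-recognition network; the detector choice (torchvision’s pretrained Faster R-CNN) and the 0.8 confidence threshold are assumptions made for the example:

```python
# Minimal sketch: one frame from a live camera feed run through a pretrained
# object detector. This stands in for the "video camera streams a live feed to
# neural networks that handle object recognition" step; it is NOT Nvidia's
# pipeline, and the detector/threshold are illustrative choices.
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()   # COCO-pretrained detector

cap = cv2.VideoCapture(0)        # default webcam
ok, frame = cap.read()           # grab a single frame from the live feed
cap.release()

if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)            # OpenCV frames are BGR
    with torch.no_grad():
        det = model([to_tensor(rgb)])[0]                    # dict of boxes/labels/scores
    for box, label, score in zip(det["boxes"], det["labels"], det["scores"]):
        if score > 0.8:                                      # keep only confident detections
            print(f"class {int(label)} at {[round(v) for v in box.tolist()]} ({score:.2f})")
```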

The Cornell University way: Domain-adaptive meta-learning

“Humans and animals are capable of learning a new behavior by observing others perform the skill just once.

At Cornell University, researchers consider the problem of “allowing a robot to do the same — learning from the raw video pixels of a human, even when there is substantial domain shift in the perspective, environment, and embodiment between the robot and the observed human.

“Prior approaches to this problem have hand-specified how human and robot actions correspond and often relied on explicit human pose detection systems.

“In this work, we present an approach for one-shot learning from a video of a human by using human and robot demonstration data from a variety of previous tasks to build up prior knowledge through meta-learning.

“Then, combining this prior knowledge and only a single video demonstration from a human, the robot can perform the task that the human demonstrated…the robot can learn to place, push, and pick-and-place new objects using just one video of a human performing the manipulation.”
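
To make the two-phase idea concrete, here is a toy sketch of the structure the abstract describes: a policy whose weights encode prior knowledge built up across many tasks, adapted to a new task with a single gradient step on one demonstration (MAML-style). The network sizes and data are placeholders, and the inner loss here is plain behavior cloning on a labeled demo, whereas the paper adapts from raw human video via a learned objective, so treat this as the skeleton of the idea rather than the method itself:

```python
# Toy MAML-style sketch of "prior knowledge + one demonstration -> adapted policy".
# Shapes, data, and the behavior-cloning inner loss are placeholders; the paper's
# actual approach adapts from raw human video with a learned adaptation objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

# Meta-learned prior: in the real system these weights would come from
# meta-training over many (human video, robot demonstration) task pairs.
policy = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 4))  # obs -> action

def adapt(demo_obs, demo_act, lr=0.1):
    """One inner-loop gradient step from the prior, using a single demonstration."""
    loss = F.mse_loss(policy(demo_obs), demo_act)
    grads = torch.autograd.grad(loss, list(policy.parameters()))
    # "Fast weights": theta' = theta - lr * grad, leaving the prior itself untouched.
    return [p - lr * g for p, g in zip(policy.parameters(), grads)]

# One synthetic demonstration of the new task: 10 timesteps of (observation, action).
demo_obs, demo_act = torch.randn(10, 16), torch.randn(10, 4)
adapted = adapt(demo_obs, demo_act)
print("Adapted parameter shapes:", [tuple(p.shape) for p in adapted])
```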

AI casually moves into the cobot sphere.