Training, testing and coding robots is a grueling process. Our recently launched Isaac platform promises to change all that.
Few know that better than roboticist Madeline Gannon. For the past month, she’s been hard at work developing a robotic art installation in her research studio in the Polish Hill neighborhood of Pittsburgh.
Her development sprint is focused on the debut of Manus, which connects 10 industrial arms with a single robot brain to illustrate new frontiers in human-robot interaction.
She’s been racing the clock to develop the software and interaction design to bring these robotic arms to life in time for today’s World Economic Forum exhibit in Tianjin, China. Putting in 80-hour weeks in her warehouse, she’s faced the difficult task of taking two bots on loan from robotics company ABB and using them to simulate the interactions of all 10 robots that will be at the show.
Gannon relied heavily on her simulations to create the interactions for spectators. And she wouldn’t know for certain whether they actually worked until she got all 10 machines onsite in China up and running.
The challenge of simulating robots in operation has traditionally pushed roboticists into custom programming, along with big gambles and plenty of anxiety, because until recently simulation software hasn’t worked reliably.
Yet it remains a key issue for the industry, as logistics operations and warehouses shift gears to embrace robots featuring increasing levels of autonomy to work alongside humans.
“As we transition from robotic automation to robotic autonomy, art installations like Manus provide people an opportunity to experience firsthand how humans and autonomous machines might coexist in the future,” she says.
To be sure, Gannon’s grueling effort in getting this demonstration off the ground underscores how nascent the industry’s tools for developing robotics at scale remain.
Robotics Help Arrives
Much of that is now changing. Earlier this year, we launched the Isaac Simulator for developing, testing and training autonomous machines in the virtual world. Last week, at GTC Japan, we announced the availability of the Jetson AGX Xavier devkit for developers to put to work on autonomous machines such as robots and drones.
Combined, this software and hardware will boost the robotics revolution by turbo-charging development cycles.
“Isaac is going to allow people to develop smart applications a heck of a lot faster,” Gannon said. “We’re really at a golden age for robotics right now.”
This isn’t Gannon’s first robotics rodeo. Last year, while a Ph.D. candidate at Carnegie Mellon University, she developed an interactive industrial robot arm that was put on display at the Design Museum in London.
That robot, Mimus, was a 2,600-pound giant designed to be curious about its surroundings. Enclosed in a viewing area, the robot used sensors embedded in the museum ceiling to see and come closer to or even follow spectators it found interesting.
Exhibiting Manus in Tianjin for the World Economic Forum marks her second, and significantly more complex, robotics installation, which required custom software to create its interactions.
Bringing Manus to Life
Manus wasn’t easy to pull off. Once she arrived in China, Gannon had only 10 days with all 10 robots onsite before the opening of the interactive exhibit. Manus’s robots stand in a row atop a 9-meter base and are encased in plexiglass. Twelve depth sensors placed at the bottom of its base enable the interconnected robots to track and respond to the movements of visitors.
“There is a lot of vision processing in the project — that’s why its brain is using an NVIDIA GPU,” Gannon said.
This vision system enables Manus to move autonomously in response to the people around it: once Manus finds an interesting person, all 10 robot arms reorient as its robotic gaze follows them around.
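The article doesn’t disclose how Manus picks out an “interesting” person or steers its gaze, but the behavior described above can be sketched in a few lines. The layout, the closest-person heuristic and the function names below are all assumptions for illustration, not Gannon’s actual code:

```python
import math

def pick_target(people, base_center=(4.5, 0.0)):
    """Choose the person to watch. As a stand-in 'interest' heuristic,
    pick whoever is closest to the center of the 9-meter base; the
    installation's real criteria are not public."""
    if not people:
        return None
    cx, cy = base_center
    return min(people, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))

def gaze_angles(arm_bases, target):
    """Yaw angle each arm would turn so its 'gaze' faces the target."""
    return [math.atan2(target[1] - ay, target[0] - ax)
            for ax, ay in arm_bases]

# Ten arms spaced evenly along the 9-meter base (hypothetical layout),
# with two people detected by the floor-level depth sensors.
arms = [(float(i), 0.0) for i in range(10)]
people = [(2.0, 3.0), (7.5, 1.5)]

target = pick_target(people)
angles = gaze_angles(arms, target)  # one yaw command per arm
```

Each frame of depth-sensor data would refresh `people`, so all 10 arms continuously reorient toward whoever currently holds the installation’s attention.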
To create the interaction design for Manus, Gannon needed to develop custom communication protocols and kinematic solvers for her robots as well as custom people-sensing, remote monitoring and human-robot interaction design software.
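A kinematic solver is the piece that turns a desired end-effector position into joint angles. As a toy stand-in for the solvers a 6-axis industrial arm would need, here is the classic closed-form inverse kinematics for a planar 2-link arm; it is a textbook illustration of the concept, not the solver Gannon wrote:

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Closed-form IK for a planar 2-link arm with link lengths l1, l2.
    Returns (shoulder, elbow) joint angles placing the end effector
    at (x, y), or None if the point is unreachable."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target outside the arm's reachable annulus
    elbow = math.acos(c2)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A real industrial arm has six joints and multiple valid solutions per pose, which is why writing such solvers from scratch, as Gannon had to, consumes so much development time.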
She says that up until now there haven’t been reliable technical resources for doing atypical things with intelligent robots. As a result, she’s had to reinvent the wheel each time she creates a new robotics piece.
The technical development for Manus’s software stack took about two-thirds of the project timeline, leaving only one-third of the time to devote to the heart of the project — the human-robot interaction design.
Future Robot Training
Using Jetson for vision and Isaac Sim for training robots could help developers turn those ratios around for similar projects in the future. And they’re well-suited for the development and simulation of industrial robots used by massive enterprises in warehouses and logistics operations.
Gannon’s pioneering work training robots despite such obstacles has garnered attention, and she’s been called a “robot whisperer” or “robot tamer” for years.
She shrugs that off. “Now, with Isaac, I’m hopeful we won’t need robot whisperers anymore.”