Programming agility
Photos by Evan Krape | May 05, 2022
UD engineers aim to give robots a leg up
In Professor Ioannis Poulakakis’ lab at the University of Delaware, a human-sized robot is almost ready to play its role in engineering a better future.
The robot, which walks on two legs and looks like something you would be more likely to see on the Star Wars planet of Tatooine than in the City of Newark, is part of an $800,000 research project supported by the National Science Foundation’s Major Research Instrumentation Program. The goal is to pair it with a new treadmill system that will help scientists and engineers better understand and program human-like robots to help in our everyday lives.
Poulakakis, along with Department of Mechanical Engineering Professors Panos Artemiadis, Guoquan (Paul) Huang and Bert Tanner, is spearheading the project, which will combine the nearly 5-foot-tall robot with a novel treadmill system that will allow researchers to test the robot’s ability to adjust to dynamic changes in the environment. It’s really about robotic infrastructure, Poulakakis said.
“In the future, these robots will be walking in real environments, such as forests or rubble in a disaster area,” Artemiadis said. “We want to make sure those robots aren’t falling all over and that they are as robust as humans.”
Making two-legged robots walk more effectively across varying terrains — say, from concrete to grass to sand — is more than just understanding the mechanics of the task, though. It’s about programming the robots to comprehend the complexities of their environment and act accordingly so that they don’t fall over an ant hill or get tripped up by the dog.
“It’s like a weird human,” Poulakakis said as he described Digit, the impressive, high-tech two-legged robot purchased from Agility Robotics in Oregon. “But it’s not just about how you make it walk, but how it can see and perceive the environment around it and act accordingly. The key question is how does the robot understand change and how does it take action in correcting itself in response to changes?”
By creating basic controllers that make the robot move in particular ways, and by integrating the robot with the novel treadmill system they’re building, the team of engineers can collect valuable data, evaluate and verify existing algorithms and programming, and create simulations and models to guide future improvements.
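To give a flavor of what that pipeline involves, here is a minimal, purely illustrative sketch of a controller-plus-treadmill trial that logs data for later analysis; all of the class names, interfaces and numbers are hypothetical stand-ins, not the project’s actual software.

```python
# Illustrative sketch only: hypothetical stand-ins for the robot, treadmill
# and controller interfaces; none of these names come from the UD project.
import csv
import math

class StubRobot:
    """Placeholder robot that reports a single joint angle."""
    def __init__(self):
        self.angle = 0.0
    def read_joint_angle(self):
        return self.angle
    def apply_torque(self, torque, dt):
        # Toy first-order response, standing in for real dynamics.
        self.angle += torque * dt

class StubTreadmill:
    """Placeholder treadmill with an adjustable surface stiffness."""
    def __init__(self, stiffness=1.0):
        self.stiffness = stiffness

def run_trial(duration=2.0, dt=0.01, gain=5.0, log_path="trial_log.csv"):
    robot, treadmill = StubRobot(), StubTreadmill()
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "reference", "measured", "torque", "stiffness"])
        t = 0.0
        while t < duration:
            reference = 0.3 * math.sin(2.0 * math.pi * t)   # desired joint angle
            measured = robot.read_joint_angle()
            torque = gain * (reference - measured)          # simple proportional controller
            robot.apply_torque(torque, dt)
            treadmill.stiffness = 1.0 if t < duration / 2 else 0.2  # "concrete" -> "sand"
            writer.writerow([round(t, 3), reference, measured, torque, treadmill.stiffness])
            t += dt

if __name__ == "__main__":
    run_trial()
```

Logs like this are the kind of raw material the researchers can feed back into simulations and models.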
This project also will generate valuable data on two-legged robots, which could ultimately improve robotics technology enough that legged robots could be used commercially in warehousing and related industries. And the testing tools developed by faculty and students will be available for researchers around the world to use.
“At the end of the day, each one of us can list different lessons learned or insights drawn out of this experience,” said Tanner, an expert in the field of planning and control of robotic systems. “We’re just scratching the surface of how our machines could operate in and negotiate a variety of conditions in different environments. This instrument is just one way of scratching that surface.”
Sensing, deciding and acting through programming
Despite past predictions that the new millennium would be marked by robotic influences, there are yet to be pizza delivery robots walking up your front steps expecting a tip. But there are quite a few non-legged robots in our lives: self-driving cars are here, drones are pretty much mainstream, and of course, we cannot forget that the automated vacuum cleaners our cats love to ride are robots, too.
And while the military may use some legged robots for special tasks, most of the robots we see today are not yet walking around our homes or workplaces. This project encompasses exactly the kind of research that still needs to be done to make robots more efficient and applicable to the real world.
By integrating the purchased robot with the treadmill being built in Artemiadis’ lab (where a smaller prototype already exists), the team of researchers will be able to profile the machine’s movements and reactions, ultimately to help pave the way to better robots.
“Our platform will be able to simulate dynamic environments,” Artemiadis said. “If you want to feel like you’re walking on sand, I can make that happen, and I can change it interactively.”
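One common way to emulate terrains like sand or concrete in software is to treat the ground as a spring-damper whose stiffness and damping change per surface; the sketch below is a generic illustration with made-up parameter values, not a description of how the UD treadmill actually works.

```python
# Illustrative spring-damper ground model; parameter values are invented and
# not taken from the UD treadmill system.
SURFACES = {
    "concrete": {"stiffness": 1.0e5, "damping": 500.0},  # stiff, little give
    "grass":    {"stiffness": 2.0e4, "damping": 300.0},
    "sand":     {"stiffness": 5.0e3, "damping": 800.0},  # soft, heavily damped
}

def ground_reaction_force(surface, penetration_m, penetration_rate_mps):
    """Vertical force the surface pushes back with when a foot sinks into it."""
    p = SURFACES[surface]
    force = p["stiffness"] * penetration_m + p["damping"] * penetration_rate_mps
    return max(force, 0.0)  # the ground can only push, never pull

# Switching the surface name mid-trial changes the "felt" terrain interactively.
print(ground_reaction_force("concrete", 0.001, 0.01))
print(ground_reaction_force("sand", 0.001, 0.01))
```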
The robot itself cost about $250,000, more than a quarter of the funding NSF provided for the project, which is one reason it’s better to test it on a treadmill in a laboratory than to set it loose on, say, a rickety boardwalk in the marsh. A controlled environment is needed, and in the lab it’s easier to collect data that can inform future programming.
“All of the guarantees we have for the performance and safety of robots are the guarantees we get from simulations, theoretical analysis and algorithms,” Poulakakis said. “There’s no experimental device that’s going to test whether or not those guarantees work in practice.”
For humans, processing environmental information, planning responsive motions and reacting to changes can take a few hundred milliseconds. Meanwhile, the “digital synapses” of a two-legged robot are only as good as they’re programmed to be.
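Those “digital synapses” are typically organized as a sense-plan-act loop that must finish each cycle within a fixed time budget; the sketch below is a generic illustration of that structure with placeholder sensing and planning steps, not the project’s control code.

```python
# Generic sense-plan-act loop with a timing budget; purely illustrative.
import time

CYCLE_BUDGET_S = 0.005  # e.g. a 200 Hz control loop, far faster than human reaction time

def sense():
    return {"pitch": 0.02}           # placeholder state estimate

def plan(state):
    return {"hip_torque": -10.0 * state["pitch"]}  # placeholder balance policy

def act(command):
    pass                             # would send the command to the motors

for _ in range(3):                   # a few cycles of the loop
    start = time.perf_counter()
    act(plan(sense()))
    elapsed = time.perf_counter() - start
    time.sleep(max(0.0, CYCLE_BUDGET_S - elapsed))  # hold the loop to its budget
```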
Under the guidance of the mechanical engineering faculty involved in the project, a handful of UD engineering students will program the robot, create dynamic models of it, design and control the treadmill, and build models that integrate the robot and treadmill.
“This project is specifically focused on the transition to different types of terrains,” Tanner said. “Instead of trial and error on mud or sand or gravel or grass and reverse engineering what’s happening on the ground where the foot of the robot hit, with this instrument we can know how soft the surface under the robot was when it slipped or moved faster. We know how it reacts, how it balances itself, and then we can plan more microscopic motions through possible adjustments in the controls.”
A path forward
Motors, cameras and sensors are just the mechanical anatomy of a robot’s ability to move around — whether the robot walks on two legs or is an autonomous vehicle with four wheels. However, two-legged robots are far more complicated to work with than vehicles or simpler machines, Huang said.
“Legged robots have better mobility than their wheeled relatives; they can navigate pretty much anywhere,” Artemiadis said. But from battery technology to energy consumption, there’s still a lot of work to be done, he added. “Putting one leg in front of another and not falling sounds easy, but it’s quite complex.”
From the sensory information gleaned as your foot presses on a new surface to the years spent learning balance in familiar environments, humans have a bit of a leg up when it comes to senses that guide us. The research done with the initial prototype of the instrument has focused on how humans walk, which Artemiadis said could help inform what’s needed to program robots to navigate the world on two legs.
In addition to building the treadmill system and better controls, the team will also create a dynamic model for the robot that captures the physics of its motion. Once the robot and treadmill are working together, the goal is to create models of both that can be used as virtual instruments, accessible from anywhere.
Eventually, a model integrating the physics of the robot’s motions with the forces the treadmill applies and the changes it introduces will produce a simulation that shows how a given change is likely to affect the robot. Those simulations can then be compared with the actual interactions between the robot and the treadmill in the lab to verify their accuracy.
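A simple way to picture that verification step is to compare what the model predicts with what the instrument records and compute an error measure; the sketch below uses invented numbers and a basic root-mean-square error purely as an illustration.

```python
# Illustrative model-validation check: compare a simulated trajectory against
# logged measurements with a root-mean-square error; the data here is invented.
import math

def rms_error(simulated, measured):
    """Root-mean-square difference between two equal-length trajectories."""
    assert len(simulated) == len(measured)
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured)) / len(simulated))

simulated_pitch = [0.00, 0.01, 0.03, 0.02, 0.00]   # model prediction (radians)
measured_pitch  = [0.00, 0.02, 0.04, 0.02, -0.01]  # what the lab instrument recorded

print(f"RMS pitch error: {rms_error(simulated_pitch, measured_pitch):.4f} rad")
# A small, stable error suggests the model captures the robot-treadmill
# interaction well enough to trust its predictions; a growing error does not.
```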
“The more data we have, the better we can train the machine-learning algorithms and validate our controllers,” Tanner said. “In fact, this data will be available to any researcher wanting to study legged locomotion in real-world conditions. Generating meaningful data these days in itself can contribute to the body of knowledge.”
Those models, ultimately, will capture critical aspects of the robot’s physical behavior, Poulakakis said. And the better the models, the easier it is to enhance the robot’s predictive ability, and the more opportunities there will be for modern-day robots to assist us in the riskiest or most challenging tasks (and sure, we’ll include picking up your own pizza in that definition, too).
“One day, we hope they can become commonplace consumer electronics,” said Huang. “I think that’s still a long way away. But this is a great project for the robotics program we’re building here, and will be a chance to showcase some of our research results.”