Underwater robots are taking on the task of inspecting the nation’s aging piers, bridges, pipelines and dams at a time when people’s relationship with water is changing.
Waves, winds, currents, wakes from passing boats and eddies swirling around structures make water one of the most complex environments even for experienced boat captains, let alone robots. Now, researchers at Stevens Institute of Technology are developing algorithms that teach robots to adapt to the constantly changing dynamics of the sea to address one of our nation’s greatest concerns: protecting and preserving our aging underwater infrastructure, such as piers, pipelines, bridges and dams.
The work, led by Brendan Englot, a professor of mechanical engineering at Stevens, grapples with an ongoing problem: these underwater structures cannot be checked as often as they should be. There are far more of them than there are divers to inspect them, and divers must sometimes descend to extreme, dangerous depths from which recovery can take weeks. Englot is training robots to take on such tasks – but it’s not easy.
“There are so many difficult disturbances pushing the robot around, and there is often very poor visibility, making it hard to give a vehicle underwater the same situational awareness that a person would have just walking around on the ground or being up in the air,” says Englot.
Englot is up for the challenge.
His research group employs a type of artificial intelligence known as reinforcement learning, which does not rely on an exact mathematical model of the environment; instead, its goal-oriented algorithms teach robots to carry out a complex objective by performing actions and observing the results. As the robot collects data, it updates its “policy” to figure out optimal ways to maneuver and navigate underwater.
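To make the idea concrete, here is a minimal sketch of tabular Q-learning, one of the simplest reinforcement learning methods, applied to a toy grid world where a random “current” occasionally pushes the robot off course. The grid size, disturbance probability and reward values are illustrative assumptions, not details of Englot’s actual system.

```python
import random

# Toy grid world: the robot starts at (0, 0) and must reach a goal at (4, 4).
# A random "current" sometimes overrides its chosen move, standing in for the
# disturbances described in the article. All numbers here are illustrative.
SIZE, GOAL = 5, (4, 4)
ACTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # moves in the four grid directions

def step(state, action):
    """Apply an action; with 20% probability a 'current' pushes the robot randomly."""
    if random.random() < 0.2:
        action = random.choice(ACTIONS)
    x = min(max(state[0] + action[0], 0), SIZE - 1)
    y = min(max(state[1] + action[1], 0), SIZE - 1)
    new_state = (x, y)
    reward = 10.0 if new_state == GOAL else -1.0  # small cost per move
    return new_state, reward, new_state == GOAL

# Q-table maps (state, action index) -> estimated long-term value of that action.
Q = {((x, y), a): 0.0 for x in range(SIZE) for y in range(SIZE) for a in range(len(ACTIONS))}
alpha, gamma, epsilon = 0.1, 0.9, 0.1  # learning rate, discount factor, exploration rate

for episode in range(2000):
    state, done = (0, 0), False
    while not done:
        # Epsilon-greedy: mostly exploit the current policy, sometimes explore.
        if random.random() < epsilon:
            a = random.randrange(len(ACTIONS))
        else:
            a = max(range(len(ACTIONS)), key=lambda i: Q[(state, i)])
        new_state, reward, done = step(state, ACTIONS[a])
        # Core update: nudge the estimate toward reward + discounted future value.
        best_next = max(Q[(new_state, i)] for i in range(len(ACTIONS)))
        Q[(state, a)] += alpha * (reward + gamma * best_next - Q[(state, a)])
        state = new_state

print("Learned value of the first action from the start cell:", Q[((0, 0), 0)])
```

Despite the random pushes, repeated trial and error converges on a policy that reliably reaches the goal – the same learn-by-acting loop, scaled up, that lets a vehicle adapt to disturbances no hand-built model fully captures.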
The data they collect comes from sonar, the most reliable tool for navigating undersea. Like a dolphin using echolocation, Englot’s robots send out high-frequency chirps and measure how long the sound takes to return after bouncing off surrounding structures – collecting data and gaining situational awareness all while being knocked around by any number of forces.
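The underlying geometry is simple: the chirp travels out and back, so the one-way range is half the round-trip time multiplied by the speed of sound in seawater – roughly 1,500 meters per second, though it varies with temperature, salinity and depth. A minimal sketch of that calculation:

```python
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s; varies with temperature, salinity and depth

def range_from_echo(round_trip_time_s: float) -> float:
    """Estimate the distance to a reflecting surface from a sonar echo's round-trip time."""
    return SPEED_OF_SOUND_SEAWATER * round_trip_time_s / 2.0

# Example: an echo returning after 40 milliseconds implies a surface about 30 m away.
print(f"{range_from_echo(0.040):.1f} m")
```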
Englot recently sent a robot on an autonomous mission to map a Manhattan pier. “We didn’t have a prior model of that pier,” says Englot. “We were able to just send our robot down and it was able to come back and successfully locate itself throughout the whole mission.” Guided by algorithms created in the Englot lab, the robot moved independently, gathering information to produce a 3D map showing the location of the pier’s pilings.
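The article does not detail the lab’s mapping algorithm, but a common way to turn range measurements like these into a map is an occupancy grid: each sonar return marks the cell where the echo originated as likely occupied and the cells the beam passed through as likely free. A simplified 2D sketch under those assumptions, with illustrative grid dimensions and tuning values:

```python
import math

GRID, CELL = 100, 0.5  # 100x100 grid of 0.5 m cells; values are log-odds of occupancy
grid = [[0.0] * GRID for _ in range(GRID)]
L_OCC, L_FREE = 0.9, -0.4  # log-odds increments; illustrative tuning values

def integrate_return(rx: float, ry: float, bearing: float, rng: float) -> None:
    """Fuse one sonar return (robot position, beam bearing, measured range) into the grid."""
    steps = int(rng / CELL)
    for k in range(steps + 1):
        d = k * CELL
        cx = int((rx + d * math.cos(bearing)) / CELL)
        cy = int((ry + d * math.sin(bearing)) / CELL)
        if 0 <= cx < GRID and 0 <= cy < GRID:
            # The cell at the measured range reflected the chirp: likely occupied.
            # Cells before it let the sound pass through: likely free.
            grid[cy][cx] += L_OCC if k == steps else L_FREE

# Example: from (10 m, 10 m), a beam at 45 degrees echoes off something 8 m away,
# perhaps one of the pier's pilings.
integrate_return(10.0, 10.0, math.radians(45), 8.0)
```

Accumulating thousands of such returns as the robot moves, while simultaneously estimating its own position from them, is what lets it “locate itself throughout the whole mission” and emerge with a 3D picture of the pilings.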
These first steps are encouraging, but Englot is working to expand his robots’ capabilities. He foresees routine robotic inspections of everything from ship hulls to offshore oil platforms. In addition, robots could map the Earth’s vast underwater terrain.
However, achieving these goals means addressing sonar’s limitations. “Imagine walking through a building and navigating the hallways with the same gray-scale, grainy visual resolution as a medical ultrasound,” says Englot.
Once a structure has been mapped, an autonomous robot could plan a second pass: a higher-resolution inspection of critical areas using a camera. Englot further imagines eel-like robots that can weave through crevices and narrow spaces, perhaps even assisting in rescues. “To really take advantage of those kinds of designs, first we need to be able to navigate with confidence,” he says. Englot continues to tweak his algorithms to provide that confidence.
Englot is also advancing underwater technology beyond the current patchwork maps tediously created by joystick-controlled robots, like a rover on a faraway planet. “Some of the toughest challenges in robot autonomy are underwater,” he says. There is a long way to go, but overcoming challenges drew Englot to the field of robotics in the first place.
Editor’s Note: This article was republished with permission from Stevens Institute of Technology. The original article can be found HERE.