Robotics developer Kabilan KB is bringing autonomous navigation to wheelchairs, aiming to improve mobility for people with disabilities.
KB is an undergraduate at the Karunya Institute of Technology and Sciences in Coimbatore, India. For this project, he aimed to create a device that could be helpful for his cousin, who has a mobility disorder, and other people with disabilities who might not be able to control a manual or motorized wheelchair.
“Sometimes, people don’t have the money to buy an electric wheelchair,” KB said. “In India, only upper- and middle-class people can afford them, so I decided to use the most basic type of motorized wheelchair available and connect it to the Jetson to make it autonomous.”
KB connected a basic motorized wheelchair’s motor hub to depth and lidar sensors, along with USB cameras, so the chair could perceive its environment, and used the NVIDIA Jetson platform to plan an obstacle-free path toward a user’s destination.
He also trained the autonomous wheelchair’s AI algorithms using YOLO object detection on the Jetson Nano, together with the Robot Operating System (ROS). The wheelchair uses these algorithms to perceive and map its environment and to plan a collision-free path.
“A person using the motorized wheelchair could provide the location they need to move to, which would already be programmed in the autonomous navigation system or path-planned with assigned numerical values,” KB said. “For example, they could press ‘one’ for the kitchen or ‘two’ for the bedroom, and the autonomous wheelchair will take them there.”
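The preset scheme KB describes amounts to a simple lookup from a keypad digit to a pre-programmed goal on the map. A minimal sketch is below; the room names and (x, y) coordinates are illustrative placeholders, not values from KB's system.

```python
# Hypothetical preset-destination table: each keypad digit maps to a
# pre-programmed goal pose on the wheelchair's 2D map. The rooms and
# coordinates here are invented for illustration.
PRESET_DESTINATIONS = {
    "1": ("kitchen", (4.0, 2.5)),
    "2": ("bedroom", (1.5, 6.0)),
}

def resolve_destination(key: str):
    """Translate a keypad press into a named navigation goal."""
    if key not in PRESET_DESTINATIONS:
        raise ValueError(f"No destination assigned to key {key!r}")
    return PRESET_DESTINATIONS[key]

name, goal = resolve_destination("1")  # ("kitchen", (4.0, 2.5))
```

In a ROS-based system, the resolved goal would then typically be published to the navigation stack, which handles the actual path planning and motor commands.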
An NVIDIA Jetson Nano Developer Kit processes the data from the cameras and sensors in real time. The system then uses deep learning-based computer vision models to detect obstacles in the environment.
The Developer Kit essentially acts as the brain of the autonomous system. It generates a 2D map of its surroundings to plan a collision-free path to the user’s destination, and sends updated signals to the motorized wheelchair to help ensure safe navigation along the way.
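The article doesn't say which planner the system uses, but planning a collision-free route on a 2D map is commonly done by searching an occupancy grid. As a rough illustration only, here is a minimal breadth-first search over a toy grid (the map, start, and goal are invented for the example):

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of rows, where 0 = free cell and 1 = obstacle.
    Returns the list of (row, col) cells from start to goal,
    or None if no collision-free path exists.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}  # cell -> cell it was reached from
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent links back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in parents):
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

# Toy map: the row of 1s forms a wall the planner must route around.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
path = plan_path(grid, (0, 0), (2, 0))
```

A real navigation stack would add costmaps, smoothing, and continuous replanning as the sensors report new obstacles, but the core idea of searching a grid for an obstacle-free route is the same.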
Looking forward, KB imagines that the project can be expanded to allow a user to control the wheelchair using brain signals from electroencephalograms, or EEGs, that are connected to machine-learning algorithms.
The project was funded by the Program in Global Surgery and Social Change, based at Boston Children’s Hospital and Harvard Medical School.