Researchers on Meta’s Fundamental AI Research (FAIR) team have announced three tools that make robotics research easier, faster, and more affordable to conduct: a simulator, a synthetic scenes dataset, and an affordable technology stack that encompasses both hardware and software.
FAIR has been working for years on creating generally intelligent embodied AI agents that can perceive and interact with their environment, share that environment safely with human partners, and communicate with and assist those partners in both the digital and physical worlds. Its latest advancements aim to enable embodied AI agents that can cooperate with and assist humans in their daily lives.
Habitat 3.0
The first tool announced by the FAIR team is the updated Habitat 3.0, a high-quality simulator that supports both robots and humanoid avatars and enables human-robot collaboration in home-like environments.
Habitat 3.0 builds on the advances made in previous Habitat versions by opening up new avenues for research on human-robot collaboration in diverse, realistic, and visually and semantically rich tasks. By supporting human avatars with a realistic appearance, natural gait, and actions, Habitat 3.0 can model realistic low- and high-level interactions.
These human avatars can be controlled both by learned policies and by real humans using a human-in-the-loop interface, operated with a keyboard and mouse or through a VR headset. Combining humans and robots in the simulated environment allows FAIR to train robot policies on everyday tasks in home-like environments, in the presence of humanoid avatars.
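In practice, interacting with a Habitat environment follows a familiar gym-style loop of resetting an episode and stepping through it with actions chosen by a policy (or a human operator). The sketch below is a minimal illustration using habitat-lab’s Env interface; the benchmark config path and the random action selection are assumptions for illustration, and exact names vary between Habitat releases.

```python
# Minimal sketch of a Habitat interaction loop (gym-style).
# The config path below is illustrative; actual benchmark configs and
# action formats vary between habitat-lab releases.
import habitat

config = habitat.get_config("benchmark/rearrange/rearrange_easy.yaml")  # hypothetical path
env = habitat.Env(config=config)

observations = env.reset()
while not env.episode_over:
    # A trained policy (or a human-in-the-loop controller driving an avatar)
    # would pick the action here; we sample randomly for illustration.
    action = env.action_space.sample()
    observations = env.step(action)

env.close()
```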
Training in simulation matters for a few reasons. Reinforcement learning algorithms require millions of iterations to learn something meaningful, which means experiments in the physical world could take years. In simulation, the same number of iterations can be completed in just a few days.
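To get a rough sense of that gap, the back-of-the-envelope calculation below compares wall-clock training time at a physical robot’s pace against a simulator running orders of magnitude faster; every number in it is an illustrative assumption, not a figure reported by FAIR.

```python
# Back-of-the-envelope comparison of physical-world vs. simulated training time.
# All numbers are illustrative assumptions, not measurements from FAIR.
total_steps = 100_000_000   # a typical reinforcement-learning training budget
real_world_hz = 1           # roughly one robot action per second in the physical world
sim_hz = 1_000              # aggregate steps per second across parallel simulated environments

seconds_per_year = 60 * 60 * 24 * 365
seconds_per_day = 60 * 60 * 24

real_world_years = total_steps / real_world_hz / seconds_per_year
sim_days = total_steps / sim_hz / seconds_per_day

print(f"Physical world: ~{real_world_years:.1f} years")  # ~3.2 years
print(f"Simulation:     ~{sim_days:.1f} days")           # ~1.2 days
```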
When it comes to household robots, it’s impractical to collect data in different houses in the physical world. Researchers would have to transport robots, set up each environment, and repeat the process for every setting they wanted to test in. With simulation, FAIR researchers can change the environment in a fraction of a second and start experimenting.
Additionally, training in the real world can be expensive and dangerous. If the model isn’t trained well, then there’s a risk that the robot could damage its environment or harm people around it. In simulation, you don’t have to worry about these issues.
Along with Habitat 3.0, FAIR presents two tasks and a suite of baselines to establish benchmarks in the field of socially embodied AI. The first is social rearrangement, which is when a robot and a human work together to tidy up a space, like cleaning up a house. The second is social navigation, which involves a robot locating and following a person while maintaining a safe distance.
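To make the social navigation objective concrete, the snippet below sketches one simple way to score whether a robot keeps following a person while never violating a minimum distance. The thresholds and the success criterion are illustrative assumptions, not Habitat 3.0’s official benchmark metrics.

```python
import math

# Illustrative "follow at a safe distance" scoring for a social navigation
# episode. Thresholds are assumptions, not Habitat 3.0's benchmark values.
SAFE_MIN_DIST = 1.0    # meters: never come closer than this
FOLLOW_MAX_DIST = 3.0  # meters: stay within this range to count as "following"

def step_flags(robot_xy, human_xy):
    """Return (too_close, following) flags for one simulation step."""
    dist = math.dist(robot_xy, human_xy)
    return dist < SAFE_MIN_DIST, SAFE_MIN_DIST <= dist <= FOLLOW_MAX_DIST

def episode_success(trajectory):
    """trajectory: list of (robot_xy, human_xy) position pairs over an episode."""
    flags = [step_flags(robot, human) for robot, human in trajectory]
    never_too_close = not any(too_close for too_close, _ in flags)
    follow_ratio = sum(following for _, following in flags) / len(flags)
    # Count the episode as a success if the robot never violates the safe
    # distance and tracks the person for at least 80% of the steps.
    return never_too_close and follow_ratio >= 0.8
```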
Habitat Synthetic Scenes Dataset (HSSD-200)
In addition to Habitat 3.0, FAIR released its Habitat Synthetic Scenes Dataset (HSSD-200). HSSD-200 is an artist-authored 3D dataset of over 18,000 objects across 466 semantic categories and 211 scenes. Navigation agents trained on it generalize to physical-world 3D-reconstructed scenes as well as or better than agents trained on prior datasets, despite using two orders of magnitude fewer scenes.
HSSD-200 offers high-quality, fully human-authored 3D interiors with fine-grained semantic categorization that corresponds to the WordNet ontology. The scenes were designed using the Floorplanner web interior design interface, and the layouts are mostly recreations of actual houses. Individual items within the scenes were created by professional 3D artists and, in most cases, match specific brands of actual furniture and appliances.
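As an example of how a semantically annotated scene might be inspected, the snippet below counts objects per category in a scene description file. The file name, JSON layout, and field names are hypothetical stand-ins for illustration, not HSSD-200’s actual on-disk format.

```python
import json
from collections import Counter

# Hypothetical inspection of a scene description file. The file name and
# JSON structure are assumptions for illustration; HSSD-200's actual
# on-disk format may differ.
with open("scene_0001.json") as f:
    scene = json.load(f)

category_counts = Counter(
    obj["semantic_category"]  # e.g. a WordNet-aligned label such as "armchair"
    for obj in scene["objects"]
)

# Print the ten most common object categories in the scene.
for category, count in category_counts.most_common(10):
    print(f"{category:20s} {count}")
```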
While HSSD-200 is smaller than comparable datasets, its high-quality scenes mean that robots trained on it perform comparably to robots trained on much larger datasets.
HomeRobot
FAIR’s final release is the HomeRobot library, an affordable hardware and software platform for a home robot assistant that can perform open-vocabulary tasks in both simulated and physical-world environments. The platform is intended to make reproducible robotics research easier.
The library implements navigation and manipulation capabilities for Hello Robot’s Stretch. The platform includes two components: a simulation component, and a physical-world component with a software stack for Hello Robot’s Stretch and Boston Dynamics’ Spot, designed to encourage replication of physical-world experiments across labs.
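One way to read that design is as a single control interface with interchangeable simulated and physical backends, so the same experiment code can run in either setting. The sketch below is a generic illustration of that idea; its class and method names are hypothetical and are not HomeRobot’s actual API.

```python
from abc import ABC, abstractmethod

# Generic sketch of a shared control interface with swappable backends.
# Class and method names are hypothetical, not HomeRobot's actual API.
class RobotBackend(ABC):
    @abstractmethod
    def navigate_to(self, xyt):
        """Drive the base to an (x, y, theta) goal in the map frame."""

    @abstractmethod
    def pick(self, object_name: str):
        """Grasp the named object."""

    @abstractmethod
    def place(self, receptacle_name: str):
        """Place the held object on the named receptacle."""

class SimBackend(RobotBackend):
    """Drives a simulated agent, so experiments run fast and safely."""
    def navigate_to(self, xyt): ...
    def pick(self, object_name): ...
    def place(self, receptacle_name): ...

class StretchBackend(RobotBackend):
    """Drives a physical Hello Robot Stretch through the same interface."""
    def navigate_to(self, xyt): ...
    def pick(self, object_name): ...
    def place(self, receptacle_name): ...
```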
FAIR hopes that the platform will provide guiding north-star tasks that can motivate researchers, help shape their work, and allow a variety of methods to be compared on interesting, real-world problems. The platform’s first task is Open-Vocabulary Mobile Manipulation (OVMM): picking up any object in any unseen environment and placing it in a specified location.
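Conceptually, an OVMM episode decomposes into language-conditioned perception, navigation, grasping, and placement. The loop below is a hedged sketch of that decomposition; every helper it calls is hypothetical and stands in for the corresponding stage, rather than reflecting HomeRobot’s real pipeline.

```python
# Hedged sketch of one Open-Vocabulary Mobile Manipulation (OVMM) episode.
# Every helper used here (find_object, navigate_to, pick, place) is
# hypothetical; this only illustrates the task decomposition, not
# HomeRobot's actual pipeline.
def run_ovmm_episode(robot, instruction):
    # e.g. instruction = {"object": "blue mug", "receptacle": "kitchen table"}
    target = instruction["object"]
    receptacle = instruction["receptacle"]

    # 1. Explore until an open-vocabulary detector locates the target object.
    object_pose = robot.find_object(target)

    # 2. Navigate to a pose from which the object can be grasped.
    robot.navigate_to(object_pose)

    # 3. Grasp the (possibly never-before-seen) object.
    robot.pick(target)

    # 4. Locate the goal receptacle, move there, and place the object.
    receptacle_pose = robot.find_object(receptacle)
    robot.navigate_to(receptacle_pose)
    robot.place(receptacle)
```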