Mobility / Navigation Archives - The Robot Report
https://www.therobotreport.com/category/design-development/mobility-navigation/

Electric Sheep wins 2024 RBR50 Startup of the Year
https://www.therobotreport.com/electric-sheep-wins-2024-rbr50-startup-of-the-year/
Thu, 11 Apr 2024 14:46:02 +0000
Electric Sheep has a novel business model and an agile development team that make it the first winner of the RBR50 Startup of the Year.


Electric Sheep is vertically integrating its field operations team with autonomous mowers. | Credit: Electric Sheep

This year, the annual RBR50 Robotics Innovation Awards added new categories: Application of the Year, Startup of the Year, and Robot of the Year. We received numerous submissions from incredible startups innovating in a range of markets. The Robot Report’s team chose autonomous landscaping company Electric Sheep Robotics as the inaugural RBR50 Startup of the Year.

The San Francisco-based company has a novel business plan that is immediately bringing in revenue while it takes its time to evolve the underlying technology. This is different from many robotics businesses, which simply sell or lease systems to integrators and end users.

“We are honored to be recognized by WTWH Media’s Robotics Group with this inaugural award. I want to also acknowledge our dedicated team at Electric Sheep that are passionate about creating the most advanced robotics that can change an often overlooked industry,” stated Nag Murty, co-founder and CEO of Electric Sheep. “We are doing things differently than other robotic companies by using AI and ML at a higher level for localization and high-level control. We are scaling physical agents across the country to care for our outdoor spaces.”

Founded in 2019, Electric Sheep has grown to over 100 employees, and it has raised more than $25 million in funding to date, according to Crunchbase.

You can also learn more about Murty’s entrepreneurial philosophy and Chief Technology Officer Michael Laskey’s design principles on a recent episode of The Robot Report Podcast.

Acquisitions add data for autonomy AI

Electric Sheep develops autonomous robots for outdoor maintenance. Its flagship robot is an autonomous mower backed by the company’s ES1 foundation model.

Based on recent advances in generative AI, ES1 is a learned world model that enables reasoning and planning. It powers both the RAM robot for mowing and now Verdie for edging, trimming lawns and bushes, and blowing leaves.

In addition, Electric Sheep acquired four landscaping companies last year and said that this is a key part of its long-term plan. This strategy isn’t just about revenue. The businesses it acquires can also use ES1 and provide crucial data to make the model more effective.

This information can help improve Electric Sheep’s operations, enabling its robots to start working as soon as they arrive at a job site. 

Since taking this two-pronged approach to development and business, the company has reported eightfold sales growth. Electric Sheep has set itself apart from other startups by keeping revenue coming in from the start and by securing a unique source of operational data.

Meet Electric Sheep at the Robotics Summit & Expo 

This year’s RBR50 award winners will be celebrated at the Robotics Summit & Expo, which will be on May 1 and 2 at the Boston Convention and Exhibition Center. Electric Sheep will demonstrate its newest robot powered by ES1, Verdie, at the RBR50 showcase on the expo floor.

Attendees at the 2024 Robotics Summit and Expo at the Boston Convention and Exhibition Center will have an opportunity to meet members of Electric Sheep’s executive team. Co-founder and CEO Nag Murty will present a session titled “Startup Survival Guide to Lean Times” at 2:30 p.m. EDT on Thursday, May 2.


Murty will be joined by Oliver Mitchell, partner at ff Venture Capital; Fiona O’Donnell McCarthy, principal at True Ventures; and Steve Crowe, executive editor of robotics at WTWH Media. The panel will share tips from experienced investors and robotics companies on what they’re looking for, and attendees will learn how organizations can navigate the challenging path to commercialization.

In addition, tickets are available for the first RBR50 Robotics Innovation Awards Gala, which will be at the end of Day 1 of the event. The Robotics Summit & Expo will be the biggest yet, with keynotes and sessions from leading companies, more than 200 exhibitors, up to 5,000 attendees, a Women in Robotics Breakfast, and a Robotics Engineering Career Fair.

Co-located events include DeviceTalks Boston, which focuses on medical devices, and the inaugural Digital Transformation Forum, which will focus on manufacturing. Registration is now open for the Robotics Summit.


Autopicker wins 2024 RBR50 Application of the Year for Brightpick
https://www.therobotreport.com/autopicker-wins-2024-rbr50-application-of-the-year-for-brightpick/
Wed, 10 Apr 2024 14:50:57 +0000
Autopicker combines AI, vision-guided picking, and a mobile base to be the first winner of the RBR50 Application of the Year.


Two Autopicker mobile manipulators in a warehouse aisle. Source: Brightpick

This year, the annual RBR50 Robotics Innovation Awards added new categories: Application of the Year, Startup of the Year, and Robot of the Year. We received numerous submissions, but the Autopicker system from Brightpick stood out for automating both mobile manipulation and each picking, the retrieval of individual items.

Other robots combining mobility with manipulation have come and gone, from Fetch and Freight to Swift, in part because getting to commercially viable levels of reliability has been challenging. Not only has Autopicker added newer artificial intelligence to the mix, but it has also been deployed in existing customer warehouses.

“On the AI side, this was not possible five to six years ago,” Jan Zizka, co-founder and CEO of Brightpick, told The Robot Report. “Serious breakthroughs enable machine learning to generalize to unseen items.”

Autopicker learns with each pick

Autopicker combines a mobile base, a robotic arm, machine vision, and AI for e-commerce order fulfillment. The system reduces the need for warehouse associates to travel with carts, thanks to its patented design, which enables it to pick items from standard shelving and place them in either of two totes.

Brightpick said Autopicker can pick groceries, cosmetics, electronics, pharmaceuticals, apparel, and more with 99.9% accuracy. Its AI algorithms have been trained on more than 500 million picks to date, and they are improving with each pick, added the company.

Announced in February 2023, the system also supports pallet picking, replenishment, dynamic slotting, buffering, and dispatch. It can store up to 50,000 SKUs, said Brightpick. It also offers a goods-to-person option for heavy or hard-to-pick items, and Autopicker can raise its bins to waist height for ergonomic picking.

In the past year, customers such as Netrush and Rohlik Group began deploying the company’s latest system. Autopicker is available for direct purchase or through a robotics-as-a-service (RaaS) model.

See Brightpick at the Robotics Summit & Expo 

Cincinnati-based Brightpick is a unit of Bratislava, Slovakia-based machine vision provider Photoneo s.r.o. The company said its systems can “enable warehouses of any size to fully automate order picking, consolidation, dispatch, and stock replenishment.”

Brightpick, which has more than 200 employees, claimed that its robots take only weeks to deploy and can reduce labor assigned to picking by 98% and picking costs by half. In January 2023, the company raised $19 million in Series B funding for its U.S. expansion, and it said demand for Autopicker has been strong.

This year’s RBR50 award winners will be celebrated at the Robotics Summit & Expo, which will be on May 1 and 2 at the Boston Convention and Exhibition Center. Brightpick will be part of the RBR50 showcase on the expo floor.

In addition, tickets are available for the first RBR50 Robotics Innovation Awards Gala, which will be at the end of Day 1 of the event. The Robotics Summit & Expo will be the biggest yet, with keynotes and sessions from leading companies, more than 200 exhibitors, up to 5,000 attendees, a Women in Robotics Breakfast, and a Robotics Engineering Career Fair.

Co-located events include DeviceTalks Boston, which focuses on medical devices, and the inaugural Digital Transformation Forum, which will focus on manufacturing. Registration is now open for the Robotics Summit.


Kiwibot acquires AUTO to strengthen delivery robot security
https://www.therobotreport.com/kiwibot-acquires-auto-strengthen-delivery-robot-security/
Thu, 04 Apr 2024 15:00:54 +0000
Kiwibot and AUTO Mobility Solutions say their merger will advance data protection and robotic services globally.


Kiwibot will add intellectual property from AUTO Mobility Solutions to its delivery robot portfolio. Source: Kiwibot

Consolidation among mobile robot providers is not limited to warehouses. Kiwibot today announced that it has acquired AUTO Mobility Solutions Co.

“This strategic collaboration marks a significant milestone in both companies’ journeys towards innovation and safeguarding privacy in the robotics industry, particularly for intelligent robots sourced from China and deployed in the Western markets,” Kiwibot stated.

“The acquisition of AUTO is a game-changer for us, bringing a wealth of technological innovation and a strong patent portfolio that will significantly enhance our cybersecurity measures for AI-powered robotics,” asserted Felipe Chavez, founder and CEO of Kiwibot. “This move not only strengthens our position in the market, but also connects the manufacturing expertise from Asia with the AI development in the West securely.”

Kiwibot develops delivery robots

Berkeley, Calif.-based Kiwibot has developed autonomous robots using artificial intelligence. The company claimed that it is a market leader in robotic deliveries on U.S. college campuses.

Since 2017, Kiwibot said it has successfully deployed robots across the U.S., Dubai, and Saudi Arabia. In 2020, it raised pre-seed funding and was an early guest on The Robot Report Podcast. It raised $10 million for deliveries as a service (DaaS) in December 2023.

“Kiwibot is actively exploring opportunities to expand our robotic delivery services beyond college campuses,” Chavez told The Robot Report. “We will soon announce customers in two different categories.”


AUTO brings cybersecurity expertise

“Becoming a part of Kiwibot opens up new avenues for our technologies and patents,” noted Sming Liao, CEO of AUTO Mobility Solutions. “Together, we are poised to redefine the landscape of autonomous delivery services, ensuring greater security and efficiency.”

The Taipei, Taiwan-based company was incubated by ALi Corp. and develops integrated circuit (IC) chips for AI, self-driving vehicles, robotics, the Internet of Things (IoT), and cybersecurity. Its systems feature advanced path planning, positioning, and obstacle-avoidance technology.


AUTO Mobility Solutions has built a patent portfolio in AI, IoT, and cybersecurity in Taipei. Source: Kiwibot

AUTO said its team will add more than 100 licensed patents to Kiwibot’s offerings.

“Our decision to join forces was solidified after recognizing the complementary nature of our technologies and the potential for a synergistic relationship,” said Chavez. “One of our investors from Taiwan introduced us, and we started the relationship as a customer for a custom cybersecurity chip.”

“The acquisition strategically positions us to bolster our cybersecurity infrastructure, especially considering the rising interest in AI and its associated vulnerabilities,” he added. “Together, Kiwibot and AUTO are looking to develop enhanced capabilities in autonomous navigation, AI-powered decision making, and advanced cybersecurity measures.”

Acquisition to expand global presence

The companies also said the acquisition will help the merged entity deliver leading systems globally and meet the evolving needs of both businesses and consumers.

“AUTO’s established presence in Taiwan and Shenzhen [China] will play a crucial role in helping Kiwibot navigate geopolitical and supply chain challenges,” explained Chavez. “Their expertise and strategic locations will aid in diversifying our supply chain and providing stability in our manufacturing and development processes, ensuring Kiwibot’s continued growth and scalability.”


Felipe Chavez, CEO of Kiwibot (left), and Sming Liao, CEO of AUTO (right). Source: Kiwibot

Kiwibot is still evaluating consolidation and rebranding, he told The Robot Report.

“The Taipei team will maintain a high degree of autonomy to leverage their specialized expertise and local knowledge effectively. While we are unified in our mission and strategy, we recognize the importance of fostering innovation through autonomous operations,” Chavez said. “We are currently evaluating how best to integrate our brands to reflect our unified strength while honoring the established identity and contributions of AUTO’s team.”

What are Kiwibot’s plans for the near future?

“Looking forward, Kiwibot’s roadmap includes the continuous improvement of our autonomous delivery robots, the expansion of our service areas, and the integration of AUTO’s technological advancement,” Chavez replied. “We are committed to pioneering the future of robotic services and ensuring a seamless and secure experience for our users. Stay tuned for exciting updates as we progress on this journey.”

OmniOn looks to power, network next-gen delivery robots
https://www.therobotreport.com/omnion-power-power-network-next-gen-delivery-robots/
Mon, 01 Apr 2024 14:13:19 +0000
OmniOn Power says that mobile robots, AI, and self-driving vehicles need more power and networking innovation.


OmniOn supports multiple technologies, including robotics. Source: OmniOn Power

As delivery robots and autonomous vehicles spread, much of the design and development attention has focused on safe navigation and obstacle detection, according to OmniOn Power Inc. However, they will also require reliable charging and communications infrastructure, it said.

“We’ve mainly seen mobile robots indoors in factories, warehouses, or even restaurants,” said Gopal Mitra, global segment leader for industrials at OmniOn. “2023 was a big year for cost optimization for robotics companies. They tried to address space challenges and labor shortages in e-commerce, and power supply for delivery robots outdoors is another real challenge.”

“We look at three basic technologies: cloud and edge computing, which need to be supported by 5G, and power,” he told The Robot Report. “OmniOn Power addresses high-voltage DC, outdoor installations, and products for onboard robotics, including mounted power that should be able to work with fluctuating voltages as batteries deplete.”

OmniOn spun out of ABB

AcBel Polytech Inc. acquired the division, formerly known as ABB Power Conversion, in July 2023 and renamed it OmniOn Power Inc. in October.

The Plano, Texas-based company gained telecommunications experience as a part of Bell Labs and was part of General Electric Co. and ABB Ltd. OmniOn claimed that its “reliable products, industry expertise, and partnerships are helping customers realize the full potential of 5G, supporting expansive data center demands, [and] powering Industry 4.0.”

“Our business has grown in the robotics space, partly because of the lack of innovation as a lot of folks focused on scaling up rather than introducing new designs,” Mitra said. “Channels are trying to adopt the right robots for ‘order online, pick up at store,’ direct fulfillment, and warehouses. The increasing amount of returns is also a big concern, and we’re addressing a $500 million portion of the total addressable market by optimizing for the cost of development and implementation.”


Power innovations to enable autonomy

Batteries add weight to robots and drones, and they can be affected by extreme cold. OmniOn said that more innovation is needed.

“There are two schools of thought for batteries — they could be long-lasting, or you can go with capacitors,” said Mitra. “As for the environment, there’s the harmonics on the grid and temperature, which can be up to 120 to 130 degrees [Fahrenheit; 48.8 to 54.4 Celsius] in places like Dallas.”

“Cold is more of an issue on the battery side than the internals, where the 2% heat generated is usually enough to keep power electronics warm,” he added. “We’re looking at the optimal time to charge, as well as discharge and weight.”

“There have been a number of innovations in batteries,” Mitra noted. “Lithium-ion is very popular in robotics and electric vehicles, and sodium-ion and other polymers are being explored. How U.S. investment in the semiconductor industry responds to China’s prevalence will also affect innovation in the next 10 years. Some are now looking at vertical stacking for denser chips.”

“OmniOn already has engineers working on providing power supplies to telecom and 5G networks,” he said. “We’re enablers of autonomy.”


OmniOn is working on providing power and connectivity to delivery and warehouse robots. Source: Adobe Stock

Other considerations for robotics

Ways to increase robot uptime include hot-swappable batteries, software that directs opportunistic recharging, and persistent wired or wireless charging on embedded grids, mostly indoors.

“Cost is a big deal — wireless charging is usually near-field using inductive charging, which is very attractive for many robots but can be expensive,” said Mitra. “With contact-based charging, you don’t need a converter circuit onboard the robot.”

By contrast, farming equipment or robotic lawnmowers can have wireless docking, eliminating the risk of clippings getting into contacts, he said. Wireless charging pads throughout a warehouse or factory have a high installation cost but can reduce the weight of batteries and operational costs. All of these options require industry consensus to become more widespread, Mitra observed.
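
As a rough illustration of how software-directed opportunistic recharging can work (this is a generic sketch, not OmniOn software), a fleet manager typically applies a simple rule: send a robot to a charger when it is idle and its state of charge falls below a threshold, and send it back to work once it has recovered or a task arrives. All thresholds and the FleetRobot fields below are assumptions for illustration.

from dataclasses import dataclass

# Illustrative thresholds, not vendor defaults.
LOW_SOC = 0.35      # start opportunistic charging below 35% state of charge
RESUME_SOC = 0.80   # return to work once charged back above 80%
CRITICAL_SOC = 0.15 # always charge below 15%, even with pending work

@dataclass
class FleetRobot:
    robot_id: str
    state_of_charge: float  # 0.0 to 1.0
    has_pending_task: bool
    is_charging: bool

def next_action(robot: FleetRobot) -> str:
    """Decide whether a robot should work, charge, or idle."""
    if robot.state_of_charge <= CRITICAL_SOC:
        return "charge"  # safety floor: never run the pack flat
    if robot.is_charging:
        if robot.state_of_charge >= RESUME_SOC:
            return "work" if robot.has_pending_task else "idle"
        # Leave the charger early only if work is waiting and charge is adequate.
        if robot.has_pending_task and robot.state_of_charge > LOW_SOC:
            return "work"
        return "charge"
    # Not charging: top up opportunistically whenever idle and low.
    if not robot.has_pending_task and robot.state_of_charge < LOW_SOC:
        return "charge"
    return "work" if robot.has_pending_task else "idle"

print(next_action(FleetRobot("amr-01", 0.28, has_pending_task=False, is_charging=False)))  # charge

In practice, the thresholds would be tuned per site based on battery chemistry, task mix, and charger availability.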

How much can fleet management software help with power?

“It depends on the type of fleet,” replied Mitra. “We’re maturing simple routing within the constraints of restaurants, but delivery robots and vehicles have variable package loads. On the software side, we’ll see the impact of artificial intelligence on warehouse management, from machine vision to order processing.” 

Mitra also said that distributed power generation from photovoltaic cells could change the cost of energy.

“There are lots of opportunities to improve overall efficiency, but it’s a chicken-and-egg problem — first, the application has to come,” he said. “In hardware, non-isolated board-mounted products are emerging.”

5G to play a role as edge/cloud computing shifts

“For delivery robots, most of the compute is onboard, with nearby 5G hubs enabling mesh networks,” Mitra explained. “Edge computing needs to be supported by a 5G backbone, and peer-to-peer networks can manage the load.”

While robots and autonomous vehicles (AVs) need onboard processing for a spatial understanding of their environments and to navigate complex surroundings, the delivery function and reporting would benefit from 5G, he said.

“Look at certain regions in San Francisco — AVs are limited to certain areas, where the routes are largely pre-programmed,” said Mitra. “Once we see a prevalence of 5G and edge computing, machine learning for transport will be more scalable.”

“We have an engagement with a robotics company working with a major retailer on managing inventory and goods-to-person materials handling in the warehouse. Multi-tenant warehouses are coming,” Mitra said. “In addition to automated storage and retrieval systems [ASRS], we’re looking at multi-robot scenarios in the parking lot for groceries.”


The BPS 48V stackable power system is designed for 5G systems. Source: OmniOn

AI, humanoids could create new demands

Growing interest in applying generative AI to robotics will also affect networking and power demands.

“They’re not talked about yet in the context of on-premise or edge computing, but it will be interesting to see if delivery robots get these capabilities,” Mitra said. “AI has helped industry understand the need for high-performance computing, which has put a lot of pressure on power-supply manufacturers for smaller, more efficient systems.”

Similarly, interest in mobile manipulation and the humanoid form factor will also intensify pressure on compute and power management.

“Even if you just put an articulated robot arm on a mobile base, stepper motors require eight times the current to start, just to change from static to movement,” said Mitra.

“We still don’t have a good solution for batteries that can support humanoids for the long term,” he asserted. “They’ll also need a power train that can handle a wide range of discharge, from walking to the necessary strength for lifting boxes.”

OmniOn said it expects the demand for delivery robots, automated warehouses, and connected infrastructure to grow at 12% to 14%. Power management may not be standardized, depending on the size of a robot and its number of sensors, and edge/cloud computing and different charging approaches will continue to evolve, said Mitra. 

“We’re excited to see how wireless charging affects the robotics space,” he said. “While the cost has led to different adoption than initially expected, in the long term, the cost of infrastructure could be lower, and it could be more easily managed.”

Zoox gets ready to launch robotaxi service in Las Vegas
https://www.therobotreport.com/zoox-gets-ready-launch-robotaxi-service-las-vegas/
Sat, 30 Mar 2024 10:06:21 +0000
Zoox is expanding its area of robotaxi operations in Las Vegas as it prepares to launch a public service later this year.


Zoox is expanding the geofence for its operations in Las Vegas. | Credit: Zoox

Over the past year, Zoox Inc. has made significant progress on its autonomous robotaxi service roadmap.

In February, the Amazon.com subsidiary announced that it completed a key milestone: deploying its robotaxi on open public roads with passengers.

In the shadow of the disappointing news from competitor Cruise, which lost its autonomous operating permit from the California Department of Motor Vehicles (DMV), closed its San Francisco service, and laid off 900 employees, Zoox completed rigorous testing on private roads. It received approval from the California DMV to operate its robotaxi on the state’s public roads.

Along the way, Zoox has invested heavily in the simulation tools needed to train its robot drivers to handle any on-road situation. Simulation is the key to safely training AI models and logging thousands of hours of drive time without endangering anyone.

“One [key to our success] is obviously through our test vehicle logged miles,” said Qi Hommes, senior director of system design and mission assurance at Zoox. “We drive our test vehicles with safety drivers quite a bit in our launch-intent areas. And anytime we encounter something unexpected, those are inputs into the development of those simulation scenarios.”

Zoox begins service for employees in California

Zoox claimed that it is the only purpose-built robotaxi permitted on California public roads that is self-certified to the Federal Motor Vehicle Safety Standards (FMVSS). The company recently deployed its employee shuttle service in its headquarters in Foster City, Calif. Zoox will offer the shuttle service exclusively to all full-time employees.

“Becoming the first company to operate a purpose-built robotaxi with passengers on open public roads in California is a significant milestone in not only Zoox’s journey, but [also] for the autonomous vehicle industry at large,” stated Aicha Evans, CEO of Zoox, after the DMV approval. “With the announcement of the maiden run of our autonomous employee shuttle, we are adding to the progress this industry has seen over the last year and bringing Zoox one step closer to a commercialized purpose-built robotaxi service for the general public.”

Unlike robotaxi competitors relying on car chassis, Zoox said it has designed its platform from the ground up for autonomous passenger movement. Every design decision was made with the goal of providing a comfortable, interactive experience.

The most obvious difference between Zoox and competitors like Waymo and Cruise is that the Zoox vehicle has no steering wheel. It has large doors on both sides of the vehicle and seats up to four passengers, with the riders facing one another.


The Zoox autonomous robo-taxi vehicle is omnidirectional and uses four-corner steering. There is no onboard safety driver. | Credit: Zoox

A rider’s view of the robotaxi

At CES 2024, I interviewed Chris Stoffle, director of industrial and creative design at Zoox, and got a tour of the vehicle on the show floor.

The first thing that I noticed was how quiet it was inside the vehicle. CES is a noisy place. The cacophony of tens of thousands of people talking can be overwhelming, and inside the Zoox vehicle, it was quiet and comfortable. 

Stoffle described the Zoox rider experience: “Right now, we’re inside a robotaxi designed from the ground up to provide the best rider experience. The outside of the vehicle has a smaller footprint than a BMW i3. But inside we have this large space where passengers can sit comfortably across from each other. There’s no bad seat in the vehicle. Each rider can see the map, and adjust the temperature for their seat. In addition, there are USB power ports, drink holders, task lights, and an emergency button to contact help immediately.”


Inside the Zoox robotaxi, each rider has their own comfort controls while facing one another for the ride. | Credit: Zoox

Las Vegas operations expand to five-mile radius

Since its February announcement, Zoox has expanded the geofence for its Las Vegas fleet of robotaxis, widening the operational area in which it moves employees as it tests the service. The new geofence is a five-mile radius around the company’s Las Vegas headquarters.

The new service area is more complex and includes three-lane roads, harder lane changes, unprotected right turns onto high-speed roadways, and double-right and left-hand turn lanes. 

The robotaxis are now handling more difficult operating situations as the engineering team validates the features and safety of the vehicle operation. In addition, the autonomous vehicles are now driving at speeds of up to 45 mph, in light rain, and at night.

The fact that the vehicles have no steering wheel and no onboard safety driver means that the Zoox team has to monitor each vehicle in real-time. These “human-in-the-loop” operators do not teledrive the vehicles, but they do monitor each ride in real time, looking at the environment and the vehicle’s intentions as it decides to turn, stop, and move through intersections.

If there is an emergency situation, the remote operator can direct the vehicle how to respond and where to go to safely resolve the situation.

“[Zoox’s vehicle] was approved to drive on public roads last year and now is fully homologated in response to emergency vehicles being able to detect them,” explained Stoffle. “Being able to interact with humans outside the vehicle in safe ways is really important. And so we’ve been able to not only update our sensor pod to improve self-driving in inclement weather with some of our sensor cleaning, but also to bring in a better microphone designed on the exterior so we can detect sirens and first-responder vehicles earlier.”

“The door interface module allows us to interact with those outside the vehicle with the human in the loop, whether it be a rider, someone from the public, or even a first responder,” he added. “We believe that being able to have that human in the loop is the right approach for those off-nominal situations that we’re going to be seeing more and more as we expand on public roads.”

To hear about the development of the perception engine and sensor stack used on the Zoox vehicle, listen to the podcast interview with two of the company’s technology leaders: RJ He and Ryan McMichael.


Zoox is incrementally increasing the parameters for operation of its fleet of robotaxis in Las Vegas. | Credit: Zoox

Northeastern University Mars Rover Team wins Winter Canadian International Challenge
https://www.therobotreport.com/northeastern-university-mars-rover-team-wins-winter-canadian-international-challenge/
Wed, 27 Mar 2024 19:58:50 +0000
Northeastern University students won a contest in which four teams’ rovers completed tasks in simulated Martian environments.


Brooke Chalmers, who studies computer science, and Jason Kobrin, who studies mechanical engineering, work on the Mars Rover in the Richards Hall Makerspace. Credit: Matthew Modoono/Northeastern University

When the student leaders of the Northeastern University Mars Rover Team decided they were going to participate in the inaugural Winter Canadian International Rover Challenge, they thought it would be good practice more than anything else.

They didn’t expect to win the competition. Yet, that’s exactly what happened.

The Northeastern team took home the gold last month, beating McMaster University for the top spot with a score of 237.71 points to McMaster’s 137.13.

“It was pretty huge for us in terms of team morale,” said Brooke Chalmers, a third-year student at Northeastern and the integration lead and software co-lead for the Mars rover group. “It really felt like all the hours that we put in during the prior weeks paid off in a way.”

It’s the first competition win for the six-year-old club, which is composed of students studying computer science, engineering, and life science.

The university team of about 50 students had been hard at work developing and iterating on its latest robotic rover: the Watney, Mark V. 

Coming in at 50 kg (110 lb.), the rover features a 5052 aluminum alloy chassis, six 3D-printed nylon wheels, a robotic arm with end-of-arm tooling (EOAT), a life-detection module for sample collection, and 14 onboard cameras. 

The Canadian competition was broken into four challenges designed to put students’ rovers through simulated environments and tasks similar to those a rover might have to complete on Mars’ surface. Each challenge was scored on a 100-point scale.

In the Arm Dexterity Challenge, for example, students were tasked with controlling the rover’s robotic arm to restore power to a campsite. The challenge involved navigating the robot to four control panels, where the robotic arm had to press buttons and flip switches, explained Jason Kobrin, a fourth-year student at Northeastern and a mechanical operations co-lead for the Mars rover group.


The robotic arm on the Northeastern team’s Mars Rover. Credit: Matthew Modoono/Northeastern University

Northeastern team redesigns robot arm for strength

Of the four teams taking part, Northeastern scored the highest in the challenge, with 49.49 points.

Kobrin said the team has spent the past year completely redesigning the robot’s arm, which used to be one of the rover’s weak points during previous competitions. It’s now one of the rover’s biggest strengths. The robot arm has six degrees of freedom and can carry loads up to about 10 kg (22 lb.).

“In order to improve that, we redesigned our arm this year to use better motors and to be easier to control overall,” he said. 

By taking part in these competitions and through regular testing, the team was able to zero in on the rover’s shortcomings and improve its capabilities, Kobrin said. By working on the rover, students are also getting the opportunity to improve their own skills.

“Every week, it’s continuous improvement,” he noted. “Whether it’s adding a new portion of software code [or] whether designing a new mount for our cameras, every little improvement makes a huge difference.” 

“For everybody to be able to design and build this robot to function well but also to be able to control it in high-pressure situations and to reach the goals we were seeking to reach, is just really impressive,” added Kobrin. 

The team thought the two-day event hosted in Niagara Falls, Ontario, would be a great primer to test out the capabilities of the machine before taking part in the upcoming annual University Rover Challenge (URC). The URC is the Mars Society’s premier student Mars rover competition, held at the Mars Desert Research Station outside Hanksville, Utah.

The URC is old hat for the group, which participated in the challenge in 2019, 2022, and 2023. The competition was canceled in 2020 and 2021 because of the pandemic.

“We went into this competition thinking, ‘OK, we’re going to use this as an opportunity to prepare for URC. We’re going to test stuff to make sure it all works,’” Chalmers said.


The students will be competing at the University Rover Challenge this spring. Photos by Matthew Modoono/Northeastern University

Difficult terrain and team excitement

The team had its best showing during the Winter Transversal Challenge, with a finishing score of 84.72 points. For the challenge, the rover had to roll through treacherous and uneven terrain while avoiding obstacles.

“All the challenges involved some degree of the rover driving around and moving over difficult terrain, but this challenge was focused entirely on that,” said Chalmers. 

With the overall win, Chalmers said she’s hopeful that new members will be excited to join. 

“Most people on the team have been talking about this with their friends and family and talking about what we are doing, which is really cool,” she said. “I know a few of my friends have expressed interest in joining the team since. It’s very exciting to have something to talk about and have something to show for all the effort we put in.”


About the author

Cesareo Contreras is a Northeastern Global News reporter and has covered robotics extensively. This article is reposted with permission.

Opteran to bring natural intelligence to SAFELOG mobile robots
https://www.therobotreport.com/opteran-to-bring-natural-intelligence-to-safelog-mobile-robots/
Fri, 22 Mar 2024 18:00:45 +0000
By working with Opteran, SAFELOG says it is developing a new generation of mobile robots with robust and efficient navigation.


SAFELOG’s mobile robots can operate in a range of warehouse and factory settings using Opteran Mind. | Source: SAFELOG

Opteran Technologies this week announced at LogiMAT a partnership with SAFELOG GmbH, a manufacturer of order-picking and transportation robots for warehouses and factories. SAFELOG will integrate its mobile robots with Opteran Mind, a general-purpose autonomy product.

“We are delighted to announce our partnership with SAFELOG, as this is another significant milestone on our path to commercializing Opteran Mind,” stated David Rajan, co-founder and CEO of Opteran Technologies.

“We are seeing a rapid take up of our technology across the U.S., Japan, and Europe, so today’s agreement with SAFELOG underlines why our technology is best in class for localization and mapping for mobile robots,” he added. “It also shows that while ‘natural intelligence’ is unique in the market, our inputs and outputs are standard, making Opteran Mind a simple and attractive solution to integrate with existing mobile robots.”

The companies said the multi-year agreement will enable SAFELOG to address the urgent need for greater productivity from autonomous mobile robots (AMRs) operating in hazardous and dusty environments. Opteran claimed that its technology can enable AMRs to handle dynamic lighting and ever-changing obstacles without GPS.


SAFELOG aims to reduce failure rates

Markt Schwaben, Germany-based SAFELOG said it is developing a new generation of mobile robots that combine robustness and efficiency. A key objective of its project with Opteran is to reduce robot failure rates because of localization errors with existing 2D and 3D lidar, as well as with visual simultaneous localization and mapping (vSLAM).

Another challenge to productivity arises when hundreds of automated guided vehicles (AGVs) operate together in a warehouse, because each installation requires an infrastructure of magnetic tracks and QR code reflectors. This can increase commissioning time and operating costs.

Opteran said its localization software enables new projects to be activated quickly and efficiently without additional infrastructure. 

“There are a lot of challenges for existing autonomy solutions to overcome in the complex conditions of a warehouse, so we have been amazed by what Opteran Mind can achieve,” said Michael Reicheicher, managing director of SAFELOG, in a release. “Opteran’s technology performs significantly better in our mobile robots, which will be hugely beneficial for our customers. Natural Intelligence, their approach to AI, offers a robust technology that we are confident will differentiate our AMRs in the global market.”

Opteran Mind promises navigation breakthrough

Opteran Mind is based on 10 years of research into insect brains. The company, which has facilities in London and Sheffield in the U.K. and Boston in the U.S., said it reverse-engineered natural brain algorithms. 

“Fundamentally, nature does navigation more efficiently than robots,” said Opteran Technologies. By replicating nature’s approach in a model that the company calls “natural intelligence,” it said it has delivered a “dramatic breakthrough.”

Opteran estimated that its system could cost less than $160, running on Sony IMX219 cameras and ARM-based RK4566 chips. In comparison, current systems can range in cost from $8,400 for a 2D lidar setup to $27,000 for a 3D lidar setup, it said.

Opteran and SAFELOG demonstrated their collaboration at LogiMAT in Stuttgart, Germany. They showed a SAFELOG mobile robot using Opteran Mind, which they said could increase adaptability and minimize downtime.

The partners said Opteran Mind can be embedded in ground-based robots and aerial drones for a wide variety of applications, from logistics and warehouse distribution to oil and gas inspection, mining, and autonomous vehicles.

Slamcore Aware provides visual spatial intelligence for intralogistics fleets
https://www.therobotreport.com/slamcore-aware-provides-visual-spatial-intelligence-for-intralogistics-fleets/
Mon, 11 Mar 2024 12:01:31 +0000
Slamcore Aware combines the Slamcore SDK with industrial-grade hardware to provide robot-like localization for manually driven vehicles.

Slamcore Aware identifies people and other vehicles for enhanced safety and efficiency. Source: Slamcore

Slamcore Aware is designed to be simple and quick to commission. Source: Slamcore

Just as advanced driver-assist systems, or ADAS, mark progress toward autonomous vehicles, so too can spatial intelligence assist manually driven vehicles in factories and warehouses. At MODEX today, Slamcore Ltd. launched Slamcore Aware, which it said can improve the accuracy, robustness, and scalability of 3D localization data for tracking intralogistics vehicles.

“Prospective customers tell us that they are looking for a fast-to-deploy and scalable method that will provide the location data they desperately need to optimize warehouse and factory intralogistics for speed and safety,” stated Owen Nicholson, CEO of Slamcore. “Slamcore Aware marks a significant leap forward in intralogistics management bringing the power of visual spatial awareness to almost any vehicle in a way that is scalable and can cope with the highly dynamic and complex environments inside today’s factories and warehouses.”

Robots and autonomous machines need to efficiently locate themselves, plus map and understand their surroundings in real time, according to Slamcore. The London-based company said its hardware and software can help developers and manufacturers with simultaneous localization and mapping (SLAM).

Slamcore asserted that its spatial intelligence software is accurate, robust, and computationally efficient. It works “out of the box” with standard sensors and can be tuned for a wide range of custom sensors or compute, accelerating time to market, said the company.

Slamcore Aware brings AMR accuracy to vehicles

Slamcore Aware collects and processes visual data to provide rich, real-time information on the exact position and orientation of manually driven vehicles, said Slamcore. Unlike existing systems, the new product can scale easily across large, complex, and ever-changing industrial sites, the company claimed.

Slamcore Aware combines the Slamcore software development kit (SDK) with industrial-grade hardware, providing a unified approach for fast installation on intralogistics vehicles and integration with new and existing Real Time Location Systems (RTLS).

It incorporates AI to perceive and classify people and other vehicles, said Slamcore. RTLS applications can use this enhanced data to significantly improve efficiency and safety of operations, it noted.

The new product brings SLAM technologies developed for autonomous mobile robots (AMRs) to manual vehicles, providing estimation of location and orientation of important assets with centimeter-scale precision, said the company.

With stereo cameras and advanced algorithms, the Slamcore Aware module can automatically calculate the location of the vehicle it is fitted to and then create a map of a facility as the vehicle moves around. It can note changes to layout and the position of vehicles, goods, and people, even in highly dynamic environments, Slamcore said.


‘Inside-out’ approach offers scalability

Existing navigation systems require the installation of receiver antennas across facilities to provide “line-of-sight” connectivity, said Slamcore. However, they become more expensive as facilities scale, with large or complex sites needing hundreds of antennas to track even a handful of vehicles.

Even with this expensive infrastructure, coverage is often unreliable, reducing the effectiveness of RTLS and warehouse robots, Slamcore said. The company said Slamcore Aware addresses these industry pain points.

The system takes an “inside-out” approach that scales in line with the number of vehicles deployed, regardless of the areas they must cover or the complexity of internal layouts. As new vehicles are added to the fleet, an additional module can be simply fitted to each one so that every vehicle automatically and continuously determines its location wherever it is across the whole site, said Slamcore in a release.

Visual spatial intelligence data is processed at the edge, onboard the vehicle, explained the company. Position and orientation data is shared via a lightweight and flexible application programming interface (API) for use in nearly any route-planning, analytics, and optimization platform without compromising performance, it said.
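
To make that concrete, the sketch below shows how an RTLS or analytics platform might consume one such pose update. The message fields, units, and vehicle name here are hypothetical examples for illustration only; they are not Slamcore's published API.

import json
import math

# Hypothetical pose message from an on-vehicle module; field names are assumptions.
SAMPLE_MESSAGE = json.dumps({
    "vehicle_id": "forklift-07",
    "timestamp": 1710158491.25,
    "position_m": {"x": 41.82, "y": 12.07, "z": 0.0},
    "orientation_quat": {"w": 0.924, "x": 0.0, "y": 0.0, "z": 0.383},
})

def yaw_from_quaternion(w: float, x: float, y: float, z: float) -> float:
    """Heading (rotation about the vertical axis) in radians."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def handle_pose_update(raw: str) -> dict:
    """Flatten one pose update into the record an RTLS dashboard might store."""
    msg = json.loads(raw)
    q = msg["orientation_quat"]
    return {
        "vehicle_id": msg["vehicle_id"],
        "timestamp": msg["timestamp"],
        "x_m": msg["position_m"]["x"],
        "y_m": msg["position_m"]["y"],
        "heading_deg": math.degrees(yaw_from_quaternion(q["w"], q["x"], q["y"], q["z"])),
    }

print(handle_pose_update(SAMPLE_MESSAGE))

Converting the quaternion to a heading is the kind of lightweight processing a dashboard might do before plotting a vehicle on a facility map.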

Slamcore is offering Slamcore Aware to facility operators, fleet management and intralogistics specialists, systems integrators, and other RTLS specialists. The company is exhibiting at MODEX in Atlanta for the first time this week at Booth A13918. It will also be at LogiMAT in Stuttgart, Germany.

Afara launches autonomous picker to clean up after cotton harvest
https://www.therobotreport.com/afara-launches-autonomous-picker-to-clean-up-after-cotton-harvest/
Sat, 09 Mar 2024 14:05:38 +0000
AFARA-COTTON uses a variety of sensors to autonomously detect and pick up cotton dropped during mechanical harvest.


Afara Agricultural Technologies Inc. has developed AFARA-COTTON, an autonomous mobile robot designed to collect cotton spilled on the ground after the mechanical harvesting process. The Turkish company has also developed automation for seeding, irrigation, disinfestation, and weeding.

According to Afara, 5% to 20% of annual cotton yields are unpicked by mechanical harvesters or are dropped to the ground during harvest. This valuable resource is currently either wasted or must be gathered by hand, it said.

The company said AFARA-COTTON is an all-electric, self-driving platform to address this waste. It is currently selling its systems only in Turkey and select European countries.


AFARA-COTTON cleans two rows at once

Computer engineer Ömer Muratlı, the son of a farmer, invented the Afara Agricultural Robot in 2019. His family, which harvested cotton, had asked him to build a robot to collect remaining cotton. He patented the agricultural platform and brought it to the working prototype stage.

Afara Agricultural Technologies was established in 2023 with investment from the crowdfunding platform Fonlabüyüsün. The company is continuing work on a mass-production model.

The video animation above shows that the robot has four cameras, two lidar sensors, and ultrasonic sensors. Cameras scan the ground looking for the stray seed cotton.

An AI-based perception model identifies the seed cotton and deploys a suction cup to vacuum it up and collect it in a central baling area in the middle of the machine. The startup said it is targeting a 90% efficiency rate.

AFARA-COTTON autonomously traverses the field, avoiding obstacles as it cleans up two rows at a time. Currently, as the robot completes a row, the operator needs to manually realign it onto the next row and reinitiate the collection process.

The robot can accumulate up to 200 kg (440 lb.) before it needs to be emptied. While collecting the cotton, the picker drives up to 3.2 kph (2 mph), and it can operate for up to six hours on a single charge. The current two-row model will be available for €120,000 to €130,000 ($131,275 to $142,231 U.S.).
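
A back-of-the-envelope check of those specifications suggests roughly how much ground one charge could cover. The row spacing below is an assumption, since it is not stated above, and the estimate ignores turns and stops to empty the hopper.

# Rough coverage estimate from the published figures; row spacing is an assumption.
speed_kph = 3.2
runtime_h = 6.0
rows_per_pass = 2
row_spacing_m = 0.96  # assumed; cotton rows are commonly planted about 0.76 to 1.0 m apart

path_length_m = speed_kph * 1000 * runtime_h       # about 19,200 m of row travel per charge
swath_width_m = rows_per_pass * row_spacing_m      # about 1.9 m covered per pass
area_ha = path_length_m * swath_width_m / 10_000   # roughly 3.7 hectares per charge

print(f"~{path_length_m / 1000:.1f} km of rows, ~{area_ha:.1f} ha per charge")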


AFARA-COTTON is designed to clean up cotton wastage from the field after the harvest. | Credit: AFARA

Electric Sheep Verdie robot uses large world models for autonomous landscaping
https://www.therobotreport.com/meet-electric-sheep-latest-autonomous-lawn-robot-verdie/
Wed, 28 Feb 2024 19:30:41 +0000
Electric Sheep Robotics said its latest robot, Verdie, uses AI and large world models to work alongside landscaping crews.


Verdie uses AI to edge and trim lawns and bushes and to blow leaves. | Source: Electric Sheep

Electric Sheep Robotics Inc. today launched Verdie, a new robot using its proprietary artificial intelligence and software. It said it aims to be the first large-scale outdoor maintenance company powered by AI and robotics.

The San Francisco-based company said Verdie can autonomously edge and trim lawns and bushes, as well as blow leaves. Electric Sheep added that its AI agent, ES1, enables Verdie and its RAM lawnmowing robot “to operate in any outdoor setting with zero teaching.”

“The debut of our Verdie robot is the first AI robot for tasks like trimming and edging in the world of landscaping, and it’s exciting to see our ES1 technology power multiple robots that can work alongside a crew without an engineer on-site setting a specific path for them,” said Nag Murty, co-founder and CEO of Electric Sheep, in a release. “We will be rolling out the Verdie to our customer sites throughout 2024 and continuing to build out this fleet of robots as autonomous agents trained on outdoor services.”

“Verdie is inspired by robots such as WALL-E, R2-D2 and BB-8 — moderately complex, non-humanoid agents that perform meaningful work with embodied AI,” said the company in a blog post.

ES1 operates like ChatGPT but for spatial reasoning

Verdie and RAM use AI to understand the lawns around them and efficiently care for them, said Electric Sheep. Based on recent advances in generative AI, ES1 is a learned-world model that enables reasoning and planning for both robots.

To execute tasks like mowing and inventory management, ES1 needs to understand the semantics of the world, create a map that can be used for coverage planning, and highlight the edges of the workable area, the company explained. ES1 can do all these things through dense prediction of a world state with a single model, claimed Electric Sheep.
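
Electric Sheep has not published ES1’s architecture. As a rough sketch of what dense prediction of a world state with a single model can look like in general, one shared backbone can feed several per-pixel heads, for example one for terrain semantics and one for workable-area edges, so a single forward pass produces the inputs a coverage planner needs. The layer sizes and class count below are illustrative assumptions.

import torch
import torch.nn as nn

class DenseWorldModelSketch(nn.Module):
    """Illustrative multi-head dense-prediction network; not Electric Sheep's ES1."""

    def __init__(self, num_semantic_classes: int = 6):
        super().__init__()
        # Shared convolutional backbone over camera frames.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Per-pixel heads: terrain semantics and workable-area edges.
        self.semantic_head = nn.Conv2d(64, num_semantic_classes, 1)
        self.edge_head = nn.Conv2d(64, 1, 1)

    def forward(self, frames: torch.Tensor):
        feats = self.backbone(frames)
        semantics = self.semantic_head(feats)         # class logits per grid cell
        edges = torch.sigmoid(self.edge_head(feats))  # probability of a work boundary
        return semantics, edges

model = DenseWorldModelSketch()
sem, edge = model(torch.randn(1, 3, 256, 256))
print(sem.shape, edge.shape)  # (1, 6, 64, 64) and (1, 1, 64, 64)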

ES1 is similar to ChatGPT, but it works with spatial AI instead of language, it said. In developing its robots, Electric Sheep said it needed to give them complex, real-time reasoning that can generalize across changes in the environment.

“To tackle this, we utilized two techniques: our foundation world model ES-1 and reinforcement learning (RL) in simulation,” it explained. “ES-1 provides an agent with concepts needed for outdoor work; and has been trained from our robot mowing fleet on thousands of diverse properties.”

“Given this robust representation, we can then perform RL, which teaches the agent to solve a specific new task,” said Electric Sheep. “We found that RL on top of our world model can 1) enable simulation-based training of new policies such as string trimming and 2) be generalized to work on a diverse number of sites across the country.”


ES1’s model for reinforcement learning and robot training. Source: Electric Sheep

Verdie, RAM learn to handle tools 

Electric Sheep also needed to enable Verdie to handle tools, so it turned to simulation.

“For a specific example of this, we consider the task of string trimming, and train a small fully-connected policy on top of the learned embeddings using RL,” said the company. “The policy takes as input time-series data via ES-1’s projected embedding space, then outputs velocity commands for the motors to track at 10 hz.”

“We used NVIDIA’s ISAAC simulation for all training,” it added. “The reward function was set to follow the perimeter of the property with the tool tip of the trimmer. Since our models are already designed to run on Jetson platforms, we can train the entire policy on a single desktop GPU.”
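
The setup quoted above, a small fully connected policy that reads ES-1’s projected embeddings and outputs motor velocity commands at 10 Hz, corresponds to a very compact network. The embedding size, history length, layer widths, and velocity limits in this sketch are assumptions for illustration, not Electric Sheep’s actual values.

import torch
import torch.nn as nn

EMBEDDING_DIM = 256   # assumed size of the projected embedding
HISTORY_STEPS = 8     # assumed length of the time-series window
CONTROL_HZ = 10       # commands tracked at 10 Hz, as described above

class TrimmerPolicySketch(nn.Module):
    """Tiny MLP policy: embedding history in, (linear, angular) velocity out."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(EMBEDDING_DIM * HISTORY_STEPS, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2), nn.Tanh(),  # squash outputs to [-1, 1]
        )
        # Assumed physical limits: 1.5 m/s forward, 1.0 rad/s turn rate.
        self.limits = torch.tensor([1.5, 1.0])

    def forward(self, embedding_history: torch.Tensor) -> torch.Tensor:
        flat = embedding_history.flatten(start_dim=1)
        return self.net(flat) * self.limits

policy = TrimmerPolicySketch()
obs = torch.randn(1, HISTORY_STEPS, EMBEDDING_DIM)  # one window of embeddings
v, w = policy(obs)[0]
print(f"command at {CONTROL_HZ} Hz: v={v.item():.2f} m/s, w={w.item():.2f} rad/s")

As described in the quote, such a policy would be trained in simulation with a reward for keeping the trimmer tip on the property perimeter before being deployed to the robot.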

Electric Sheep said it designed both of its robots to start working once they’re on a property and turned on. As a result, Verdie can start working “out of the box,” said the company.

Because the robots don’t need on-site engineers to operate, they can simply be shipped to a campus, homeowners association (HOA), or park and begin tasks alongside the crew. The company credited its full-stack data channel and the large volume of data that the robots are continually trained on. 

Electric Sheep is currently running the ES1 agent on a fleet of 40 RAM robots in hundreds of yards across North America. It said it plans to deploy Verdie with customers in the second quarter of 2024. 

Electric Sheep touts business model as differentiator 

Electric Sheep said it has acquired traditional outdoor service providers to progressively transform operations by deploying its software and robots. Electric Sheep acquired two of these landscaping companies in October 2023, and then acquired two more just a few months later in December. 

The acquisitions are also a way to acquire data that the company can use in a reinforcement learning operational sandbox. In that sandbox, Electric Sheep can build on its foundational model and apply it to its Verdie and RAM robots, according to Murty.

“We are building an RL reinforcement learning factory to train autonomous AI agents to do sustainable outdoor work,” he said.

The acquisitions also allow Electric Sheep to start making money from Day 1, as it operates existing businesses with existing customers. The company, which has funding from Tiger Global and Foundation Capital, has grown its revenue by eight times since implementing this model.

Electric Sheep asserted that its growing pipeline of interested businesses will enable it to grow by 10 times in 2024.




The post Electric Sheep Verdie robot uses large world models for autonomous landscaping appeared first on The Robot Report.

Locus Lock promises to protect autonomous systems from GPS spoofing https://www.therobotreport.com/locus-lock-promises-protect-autonomous-systems-gps-spoofing/ https://www.therobotreport.com/locus-lock-promises-protect-autonomous-systems-gps-spoofing/#respond Mon, 26 Feb 2024 15:21:58 +0000 https://www.therobotreport.com/?p=577991 Locus Lock has developed software-defined radio to overcome GPS spoofing for more secure autonomous navigation.

The post Locus Lock promises to protect autonomous systems from GPS spoofing appeared first on The Robot Report.


Locus Lock is designing RF systems to provide navigational security. Source: Locus Lock

Flying back from Miami last week, I put my life in the hands of two strangers, just because they wore gold epaulets. These commercial pilots, in turn, trusted their onboard computers to safely navigate the passengers home. The computers accessed satellite data from the Global Positioning System to set the course.

This chain of command is very fragile. The International Air Transport Association (IATA) reported last month an increased level of GPS spoofing and signal jamming since the outbreak of the wars in Ukraine and Israel. This poses the threat of catastrophe to aviators everywhere.

For example, last September, OPS Group reported that a European flight en route to Dubai almost entered Iranian airspace without clearance. In 2020, Iran mistakenly shot down a passenger aircraft in its airspace. This has sent major airlines, avionics manufacturers, and NATO militaries and governments scrambling to find solutions.


Navigational errors can be very dangerous for commercial aircraft. Source: OPS Group

Locus Lock founder came out of drone R&D

At ff Venture Capital, we recognize that GPS spoofing and jamming are fundamental obstacles holding back aerial, terrestrial, and marine autonomous systems. This investment thesis is grounded in a simple belief: the deployment of cost-effective uncrewed systems requires the trust of human operators who can’t afford to question the data.

When machines go awry, so does the industry. Just ask Cruise! This conviction led us to invest in Locus Lock. The company said it is taking an innovative software approach to GNSS signal processing using radio frequency, at a fraction of the cost of comparable hardware sold by military contractors.

Last week, I sat down with Locus Lock founder Hailey Nichols, a former University of Texas researcher in the school’s Radionavigation Laboratory. UT’s Lab is best known for its work with SpaceX and Starlink.

Nichols explained her transition from academic to founder: “I was always enthralled with the idea of aerospace and studied at MIT, where I was obsessed with the control and robotic side of aerospace. After I graduated, I worked at Aurora Flight Sciences, which is a subsidiary of Boeing, and I was a UAV software engineer.”

At Aurora, Nichols focused on integrating suites of sensors such as lidar, GPS, radar, and computer vision for uncrewed aerial vehicles (UAVs). However, she quickly became frustrated with the costs and quality of the sensors.

“They were physically heavy [and] power-intensive, and it made it quite hard for engineers to integrate,” she recalled. “This problem frustrated me so much that I went back to grad school to study it further, and I joined a lab down at the University of Texas.”

In Austin, the roboticist saw a different approach to sensor data, using software for signal processing.

“The radio navigation lab was very highly specialized in signal processing, specifically bringing in advanced software algorithms and robust estimation techniques forward to sensor technology,” explained Nichols. “This enabled more precise, secure, and reliable data, like positioning, navigation, and timing.”

Her epiphany came when she saw the market demand for the lab’s GNSS receiver from the U.S. Department of Defense and commercial partners after the lab published research on autonomous vehicles accurately navigating urban canyons.


Navigating urban canyons is a challenge for conventional satellite-based systems. Source: Quora

Reliable navigation needed for dual-use applications

Today, Locus Lock is ready to market its product more widely for dual-use applications across the spectrum of autonomy for commercial and defense use cases.

“Current GPS receivers often fail in what’s called ‘urban multipath,’” said Nichols. “This is where building interference and shrouding of the sky can cause positioning errors. This can be problematic for autonomous cars, drones, and outdoor robotics that need access to centimeter-level positioning to make safe and informed decisions about where they are on the road or in the sky.”

The RF engineer continued: “Our other applicable industry is defense tech. With the rise of the Ukraine conflict and the Israel conflict in the Middle East, we’ve seen a massive amount of deliberate interference. So bad actors are either spoofing or jamming, causing major outages or disruptions in GPS positioning.”

Locus Lock addresses this problem by delivering its GPS processing suite as software which, unlike dedicated hardware, is affordable and extremely flexible.

“The ability to be backward-compatible and future-proof where we can constantly update and evolve our GPS processing suite to evolving attack vectors ensures that our customers are given the most cutting-edge and up-to-date processing techniques to enable centimeter-level positioning globally,” added Nichols.

“So our GNSS receivers are software-defined radio [SDR] with a specialized variant of inertially aided RTK [real-time kinematics],” she said, claiming that it provides a differentiator from competing products. “What that means is we’re doing some advanced sensor-fusion techniques with GNSS signals in addition to inertial navigation to ensure that, even in these pockets of urban canyons where you may not have access to GNSS signals … the GPS receiver [will] still provide centimeter-level positioning.”
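Locus Lock has not published the internals of its inertially aided RTK receiver, so the following is only a toy illustration of the general idea Nichols describes: keep an inertial estimate alive between GNSS fixes and pull it back toward the satellite solution whenever one arrives. The 2D state, blending gain, and noise-free measurements are all simplifying assumptions.

```python
import numpy as np

def fuse_step(pos, vel, accel, dt, gnss_pos=None, gain=0.2):
    """One loosely coupled fusion step (toy 2D example, not Locus Lock's method).

    pos, vel:  current 2D position (m) and velocity (m/s) estimates
    accel:     acceleration already rotated into the navigation frame (m/s^2)
    gnss_pos:  2D GNSS fix if one arrived this step, else None (e.g. urban canyon)
    gain:      blend factor pulling the inertial estimate toward the GNSS fix
    """
    # Inertial propagation keeps the estimate alive between (or without) fixes.
    vel = vel + accel * dt
    pos = pos + vel * dt

    # When a GNSS fix is available, nudge the drifting inertial solution toward it.
    if gnss_pos is not None:
        pos = (1.0 - gain) * pos + gain * gnss_pos

    return pos, vel

# Toy usage: constant acceleration, GNSS fix available only every 10th step.
pos, vel = np.zeros(2), np.zeros(2)
for k in range(100):
    fix = np.array([0.005 * (k * 0.1) ** 2, 0.0]) if k % 10 == 0 else None
    pos, vel = fuse_step(pos, vel, np.array([0.01, 0.0]), dt=0.1, gnss_pos=fix)
print(pos)
```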

As Nichols boasted, Locus Lock is an enabler of “next generation autonomous mobility.”

Locus Lock looks to affordable centimeter-level accuracy

While traditional GPS components cost around $40,000, Locus Lock said its proprietary software and a 2-in. board cost around $2,000. Today, centimeter accuracy is inaccessible to most robot companies because most suppliers of robust hardware are military contractors, including L3Harris Technologies, BAE Systems, Northrop Grumman, and Elbit Systems.

“We’ve specifically made sure to cater our solution towards more low-cost environments that can proliferate mass-market autonomy and robotics into the ecosystem,” stated Nichols.


Locus Lock puts its software on a 2-in. board. Source: Oliver Mitchell

Nichols added that Locus Lock’s GNSS receiver is able to pull in data from global and regional satellite constellations.

“[This gives] us more access to any signals in the sky at any given time,” said the startup founder. “Diversity is also increasingly important in next-generation GPS receivers because it allows the device to evade jammed or afflicted channels.”

Grand View Research estimated that the SDR market will climb to nearly $50 billion by 2030. As uncrewed systems proliferate, Locus Lock’s price point should also come down, asserted Nichols.

“And while there are some companies that have progressed their autonomy stacks to be quite high, they haven’t gotten their prices down to make sense in a mass-market scenario,” she said. “And so it’s crucial to enable this next generation of autonomous mobility at large to not compromise on performance but to be able to provide this at an affordable price. Locus Lock is providing high-end performance at a much lower price point.”

Nichols even predicted that the company could eventually bring the product below $1,000 with more adoption.

Global software defined radio market, research by Grand View Research

Source: Grand View Research

Tesla Optimus takes steps toward more mobile systems

Yesterday, Tesla published on X the latest video of its Optimus humanoid walking with a remarkably fluid gait for a robot. PitchBook recently predicted that this could be a breakout period for humanoids, with 84 leading companies now having raised over $4.6 billion.

At the same time, the prospect of such advanced machines being hijacked via GPS spoofing into the service of terrorists, cybercriminals, or hostile governments is very real and horrifying. Thankfully, Nichols and her team are working with the Army Futures Command.

“A lot of this work has been done in spoofing and jamming — not only detection, but also mitigation,” she said. “We detect the type of RF environment that we are operating in to mitigate it and inform that end user with the situational awareness that is needed to assess ongoing attacks.”

“In addition, we can iterate much faster and bring in world-class experts on security and encryption to ensure that we protect secure military signals as much as possible,” said Nichols. “Our software can find assured reception that is demanded by these increasingly expensive and important assets that the military needs to protect.”

In ffVC’s view, our newest portfolio company is mission-critical to operating drones, robots, and other autonomous vessels safely, affordably, and securely in an increasingly dangerous world.


RoboGuide robot dog uses AI to assist the visually impaired https://www.therobotreport.com/roboguide-robot-dog-uses-ai-assist-visually-impaired/ https://www.therobotreport.com/roboguide-robot-dog-uses-ai-assist-visually-impaired/#respond Mon, 19 Feb 2024 13:15:28 +0000 https://www.therobotreport.com/?p=577925 Technologies such as RoboGuide could enable people with visual impairments to more fully navigate and interact with the world.

The post RoboGuide robot dog uses AI to assist the visually impaired appeared first on The Robot Report.

RoboGuide robot dog in action

RoboGuide uses sophisticated sensors and AI to help visually impaired users interact with the world around them. Credit: Royal National Institute of Blind People

Visually impaired people may soon be able to use AI-powered robotic service dogs to navigate the world around them. Researchers from the University of Glasgow, along with industry and charity partners, have unveiled RoboGuide.

The robot uses artificial intelligence not only to help its users move independently, but also to speak to them about what’s going on around them. RoboGuide employs a variety of cutting-edge technologies mounted onto an off-the-shelf robot body, according to the University of Glasgow.

RoboGuide relies on computer vision to navigate

RoboGuide isn’t the first robotic assistant developed for visually impaired people. Another example is Glide from MassRobotics resident startup and Pitchfire winner Glidance Inc.

The University of Glasgow researchers said RoboGuide does address two significant challenges for such robots. The first is that the technology these robots use to navigate their surroundings can limit their usefulness as guides.

“Robots which use GPS to navigate, for example, can perform well outdoors, but often struggle in indoor settings, where signal coverage can weaken,” said Dr. Olaoluwa Popoola, the project’s principal investigator at the University of Glasgow’s James Watt School of Engineering. “Others, which use cameras to see, are limited by line of sight, which makes it harder for them to safely guide people around objects or around bends.”

RoboGuide is equipped with a series of sophisticated sensors to accurately map its environment.

“We use computer vision and 3D technology, where it scans the whole environment and it understands where each object, each pillar, each obstacle is,” said Dr. Wasim Ahmad, project co-investigator.

The team has developed software that assesses the mapping data and uses simultaneous localization and mapping (SLAM) algorithms to determine optimal routes from one location to another. The software interprets sensor data in real time, enabling the robot to track and avoid moving obstacles.
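The Glasgow team has not released this software, but route planning on a SLAM-built map is commonly done with graph search over an occupancy grid. The sketch below is a generic A* planner on a toy grid, offered only as an illustration of that class of technique, not RoboGuide's implementation.

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 2D occupancy grid (1 = obstacle, 0 = free)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])  # Manhattan heuristic
    open_set = [(h(start, goal), start)]
    came_from = {start: None}
    g_cost = {start: 0}

    while open_set:
        _, node = heapq.heappop(open_set)
        if node == goal:
            path = []
            while node is not None:          # walk parents back to the start
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[node] + 1
                if ng < g_cost.get(nb, float("inf")):
                    g_cost[nb] = ng
                    came_from[nb] = node
                    heapq.heappush(open_set, (ng + h(nb, goal), nb))
    return None  # no route found

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```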




University of Glasgow adds interactivity

In addition to improved navigation, RoboGuide includes the ability to talk to its user. The robot uses a large language model (LLM) — similar to the technology that powers ChatGPT — to understand and respond to questions from people around it.

LLMs are deep learning algorithms that process natural language inputs, such as spoken questions, and predict responses based on those inputs. RoboGuide’s AI is trained on massive data sets of language to be able to predict and generate context-appropriate responses to questions and commands.
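The researchers have not said which model or prompting scheme RoboGuide uses, so the following is purely a hypothetical sketch of how a spoken question could be combined with on-robot context before being passed to a language model; query_llm() is a placeholder stand-in, not a real API.

```python
def build_guide_prompt(question: str, exhibit_context: str, route_context: str) -> str:
    """Compose a grounded prompt for a museum-guide LLM (hypothetical sketch)."""
    return (
        "You are a guide robot assisting a visually impaired visitor.\n"
        f"Current exhibit: {exhibit_context}\n"
        f"Navigation status: {route_context}\n"
        "Answer briefly and descriptively, as the visitor cannot see the exhibit.\n"
        f"Visitor question: {question}"
    )

# query_llm() is a placeholder for whatever model endpoint is actually used.
prompt = build_guide_prompt(
    question="What am I standing in front of?",
    exhibit_context="Exhibit 3: fossil collection, first floor",
    route_context="stopped at waypoint 3 of 6",
)
# response = query_llm(prompt)
print(prompt)
```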

RoboGuide recently reached a significant milestone in its development. In December 2023, volunteers with visual impairments tested it for the first time.

The quadruped robot guided them through the Hunterian Museum in Glasgow. RoboGuide helped the volunteers tour exhibits on the first floor and provided interactive spoken information on six exhibits.

The results were promising. “One hundred percent I would use this in the future,” said volunteer Kyle Somerville (see video below). “As well, there are a lot of people I know that would definitely either want to try this or would definitely use it.” 

More refinements to come

The research team plans to use data from this demonstration to further refine the RoboGuide platform, with the aim of bringing a more complete version to market.

“Ultimately, our aim is to develop a complete system which can be adapted for use with robots of all shapes and sizes to help blind and partially sighted people in a wide range of indoor situations,” said Ahmad. “We hope that we can create a robust commercial product which can support the visually impaired wherever they might want extra help.”

The World Health Organization has estimated that 2.2 billion people worldwide live with impaired vision. Assistive devices such as RoboGuide could make it significantly easier for users to live more independently and could enhance their quality of life.


About the author

Matt Greenwood is an experienced writer with more than 20 years of experience in public-sector communications.


Stretch 3 from Hello Robot designed for open-source mobile manipulation https://www.therobotreport.com/stretch-3-mobile-manipulator-hello-robot-designed-open-source-development/ https://www.therobotreport.com/stretch-3-mobile-manipulator-hello-robot-designed-open-source-development/#respond Thu, 15 Feb 2024 16:14:28 +0000 https://www.therobotreport.com/?p=577891 Stretch 3 is a portable and lightweight platform for robotics developers and could lead to household applications.

The post Stretch 3 from Hello Robot designed for open-source mobile manipulation appeared first on The Robot Report.


Hello Robot today launched the third edition of its Stretch mobile manipulator robot. The company described Stretch 3 as a refinement over the previous edition, which was popular as a research platform. Hello Robot said it has improved the manufacturability and the usability of the robot.

New features in Stretch 3 include a rotating 3D camera at the top of the mast, designed for perception functions and observing the environment around the robot. Another key feature is a more robust DexWrist 3 gripper, which now includes a built-in 3D camera to enable visual servoing of the gripper fingers.

The wrist is equipped with a quick-change feature that enables the gripper to be quickly swapped out for specialized end effectors or even an iPad (as seen in the video above).

view of the mobile manipulator and wrist of the Hello Robot.

Stretch 3 includes several updates, including a quick-change wrist, a wrist-mounted camera, and strengthened materials. | Credit: Hello Robot

Hello Robot serves growing open-source community

Stretch 3 empowers a growing community of developers to create a future in which friendly robots fold laundry, feed pets, support older adults, and enhance life in new ways, according to Hello Robot. If there is a “secret sauce” to the go-to market plan for Stretch, it has to be the vibrant research community that has grown to support the platform.

“With Stretch 3, we are taking a real step towards a future with home robots,” said Dr. Aaron Edsinger, co-founder and CEO of the company. “We designed Stretch 3 to help our community leverage recent advances in AI.”

Charlie Kemp, co-founder and chief technology officer of Hello Robot, was a professor at the Georgia Institute of Technology and brought the research credibility and connections that fueled the initial development of Stretch. His robotics laboratory at Georgia Tech deployed the first few versions of the robot and put it through its paces as a research platform.

Stretch 3 specs

  • Payload: 2 kg (4.4 lb.)
  • Weight: 24.5 kg (54 lb.)
  • Size: 33 x 34 x 141 cm (13 x 13.4 x 55.5 in.)
  • Runtime: Two to five hours
  • Software development kit: ROS 2 and Python
top of mast view, including a new rotating 3D camera.

Stretch 3 includes a rotating 3D camera at the top of its mast, enabling perception of the surrounding area and AI-based motion. | Credit: Hello Robot
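Hello Robot lists ROS 2 and Python as the software development kit. As a generic taste of what commanding a ROS 2 mobile base from Python looks like, here is a minimal node that publishes velocity commands; the /cmd_vel topic name and speed values are common ROS conventions assumed for illustration, not taken from Stretch’s documentation.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class BaseNudger(Node):
    """Publish a slow forward velocity — a generic ROS 2 example.

    The '/cmd_vel' topic name is a common convention, assumed here rather
    than taken from Hello Robot's documentation.
    """

    def __init__(self):
        super().__init__("base_nudger")
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)
        self.timer = self.create_timer(0.1, self.tick)  # 10 Hz command loop

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.05   # creep forward at 5 cm/s
        msg.angular.z = 0.0
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = BaseNudger()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```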

‘App store’ approach could extend mobile manipulation

These robots will need applications for versatile uses. Hello Robot’s open platform has attracted innovators from across the world, including Fortune 500 companies, top-tier research labs, and universities in over 14 countries. Members of its developer community regularly release open code, data, models, publications, and educational materials, accelerating progress toward a future with household robots.

Edsinger told The Robot Report that he envisions an online “app store” for Stretch where the community can share new skills that users can download and install onto the robot. Each user could then customize the robot with the desired capabilities for their unique needs.

“Thanks to advances in AI, robots like Stretch are developing faster than expected,” said Edsinger. “A robot autonomously doing laundry was once considered a long-term ‘grand challenge’ but is now within reach.”

Stretch 3 could help with household chores

During a recent visit to Hello Robot headquarters in Martinez, Calif., I had the opportunity to observe V Nguyen, an occupational therapist at Hello Robot, as she demonstrated the Stretch 3 to an end user with disabilities. I asked the individual how he might envision using the robot.

The most important goal of this end user is to regain some agency and independence with some of the most basic in-home tasks. They include retrieving a pair of pants from the floor or even helping to dress in the morning.

The user cited other tasks like opening and closing a deadbolt on the front door, or removing a hot dish from the microwave. Stretch offers the potential of improving the daily lives of numerous people while enabling them to maintain their independence.

In January 2023, Hello Robot earned a $2.5 million grant from the National Institutes of Health to help commercialize its mobile manipulator technology.

Stretch 3 is priced at $24,950 and is available now on Hello Robot’s website for researchers, educators, developers, and enthusiasts.

hello robot stretch3.

Stretch 3 is portable, lightweight, and designed from the ground up to work around people. | Credit: Hello Robot


RoboXchange examines real-world operational challenges and new brains for robots https://www.therobotreport.com/roboxchange-examines-real-world-operational-challenges-new-brains-for-robots/ https://www.therobotreport.com/roboxchange-examines-real-world-operational-challenges-new-brains-for-robots/#respond Fri, 02 Feb 2024 15:18:26 +0000 https://www.therobotreport.com/?p=577730 At the first MassRobotics RoboXchange, Locus Robotics shared tips on warehouse challenges, and Opteran showed a new navigational approach.

The post RoboXchange examines real-world operational challenges and new brains for robots appeared first on The Robot Report.

MassRobotics hosts RoboXchange

MassRobotics Executive Director Tom Ryden introduces Locus Robotics at RoboXchange. Credit: Eugene Demaitre

BOSTON — MassRobotics last night hosted the first in its RoboXchange series of events intended to encourage collaboration among academia, startups, industry, and global partners around robotics and artificial intelligence. It included presentations by Opteran and Locus Robotics spanning research and development to use cases for mobile robots and software.

MassRobotics started RoboXchange in accordance with its mission to help the Massachusetts and global robotics ecosystems through informative and networking events, said Tom Ryden, executive director of the nonprofit. With its sponsors and partners, MassRobotics also provides workspace and other resources for startups, as well as support for STEM (science, technology, engineering, and mathematics) educational programs.

Editor’s note: MassRobotics is a strategic partner of WTWH Media, which produces The Robot Report and the Robotics Summit & Expo.




Warehouses should expect the unexpected, says Locus

While warehouses are increasingly adopting autonomous mobile robots (AMRs), unexpected challenges can still derail deployments, noted Steve Branch, vice president of sales engineering at Locus Robotics. The Wilmington, Mass.-based company has produced over 13,000 robots that have collaboratively picked more than 2 billion units at more than 280 facilities in 18 countries.

Ten percent of Locus’ customer sites have more than 100 robots, its largest has an area of 1.2 million sq. ft. (111,000 sq. m), and the biggest deployment included more than 800 robots at peak, Branch said. Customers include third-party logistics providers (3PLs), retailers and e-commerce companies, and healthcare firms, he added.

Not everyone welcomes robots at first, Branch acknowledged. There have been instances of vandalism, “mostly from the uninformed or the older generation,” he said. “But we’ve been surprised at how quickly the user base adopts the technology once it realizes how robots can make their jobs easier.”

“Robots are cool,” said Branch. Since the COVID-19 pandemic, Locus’ clients have become its best promoters, and warehouse associates increasingly realize that AMRs can help extend their careers and offer new opportunities as “robot wranglers,” he said.

Surprises can include environmental challenges, such as changing lighting conditions or dust covering sensors. They can prompt redesigns.

“People do dumb things, like blocking banks of chargers overnight,” said Branch. In addition, users might overload AMRs, add unapproved accessories such as bungee cords, or change throughput for “buy one, get one free” promotions without taking systems into account, he said.

“You can’t control everything, and warehouses already struggle with changing numbers of operators,” Branch observed.

Locus Robotics at RoboXchange

Locus Robotics’ Steve Branch discusses automation challenges at RoboXchange. Credit: Eugene Demaitre

Locus shares tips at RoboXchange

Branch explained that there are ways to prepare for unpredictable problems, including setting expectations, adopting change management best practices like go-live readiness teams, and increasing visibility into processes. Locus Robotics‘ robots-as-a-service (RaaS) model also helps with maintenance and support, according to Branch. It is based on monthly fees, with typical contracts covering three years, he said.

Robotics developers and vendors should focus on demonstrating value at scale to chief financial officers, Branch advised. “We want customers to reach their goals in 60 days,” he said. “Locus has a savings calculator to help before big deployments.”

By visiting clients and soliciting feedback, Locus got pulled into the European market and added the heavy-duty Vector from its Waypoint acquisition to its Origin AMR lineup, he recalled.

In response to questions at RoboXchange, Branch said that Locus supports efforts to make AMRs work across brands and with other equipment. They include the MassRobotics AMR Interoperability Standard. “We’re getting more requests to work with robot arms and workcells for picking and autonomous loading/unloading,” he said.

“We also work with warehouse management systems,” Branch said. “We see ourselves as a software or total solution company, not just a robotics company.”

Opteran looks to insects for new AI model for robots

Conventional machine learning models are based on decades-old understandings of the human visual cortex, asserted Prof. James Marshall. The researcher at the University of Sheffield and chief scientific officer of Opteran presented on “How to Build a Robot Mind” at RoboXchange.

“An estimated $75 billion was spent on autonomous vehicles by 2022, and the Apollo program cost $260 billion, adjusted for inflation,” he told attendees. “Imagine if that had been spent on really understanding how intelligence works.”

“Current AI algorithms copy the brain’s spiky processing, but that’s likely a product of how our wetware conserves energy,” Marshall said. “Current navigation such as SLAM, or simultaneous localization and mapping, is data-intensive, compute-heavy, and dependent on network connections.”

Animal brains evolved to solve for motion first, not points in space, he said. Marshall cited examples including the sea squirt, an invertebrate that digests its own brain once it’s no longer needed for its mobile phase of life.

“The fruit fly has only 100,000 neurons, and its brain’s ellipsoid body can still solve navigation and motion,” explained Marshall. “A honeybee with less than 1 million neurons can fly out miles, return to its hive, and communicate where it found flowers as well as conduct other behaviors.”

Other models, such as those based on the neural networks of the nematode C. elegans or the fruit fly, don’t provide the “navigational richness” that Opteran is aiming for, he added. Marshall co-authored a peer-reviewed research paper on “Insect-inspired AI for autonomous robots.”

Opteran's Prof. James Marshall at RoboXchange

Opteran’s Prof. James Marshall discusses a new bio-inspired approach to AI at RoboXchange. Credit: Eugene Demaitre

Opteran shows new navigation approach at RoboXchange

Opteran said it has reverse-engineered information-processing principles for a small development kit that works with inexpensive sensors. The U.K. company provides vision and perception that it claimed is “orders of magnitude” more robust than more deterministic AI. In fact, it doesn’t require path planning and can dynamically avoid objects, said Marshall.

“We provide I/O for collision prediction and robust output for panoramic, stabilized 3D visualization,” he said.
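Opteran’s algorithms are proprietary, so the sketch below is not its technology. It only illustrates the textbook insect-inspired idea that Marshall’s research draws on: honeybees appear to balance optic flow between their left and right visual fields to center themselves in a corridor, which can be imitated crudely with a dense optical-flow estimate from OpenCV.

```python
import cv2
import numpy as np

def flow_balance_steering(prev_gray, curr_gray, gain=2.0):
    """Insect-inspired corridor centering from optic flow (illustrative only).

    Honeybees are thought to balance image motion between their left and right
    visual fields; this toy version does the same with dense Farneback flow.
    It is a generic sketch, not Opteran's proprietary algorithm.
    """
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0
    )
    mag = np.linalg.norm(flow, axis=2)          # per-pixel flow magnitude
    mid = mag.shape[1] // 2
    left, right = mag[:, :mid].mean(), mag[:, mid:].mean()

    # More flow on one side usually means that side is closer: steer away from it.
    turn_rate = gain * (left - right) / (left + right + 1e-6)
    return turn_rate  # positive = turn toward the side with less flow
```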

“To share the technology now, we’ve productized Opteran Mind 4.1,” said Jack Pearson, commercial director at Opteran. “It solves for localization at lower cost than SLAM.”

In a live demonstration at RoboXchange, Opteran’s team showed how a robot could use its technology to autonomously navigate a course lined with mirrors and changing lights — normally a problem for SLAM.

“We’re working toward safety certifications and cognizant engines and are on a journey to build decision engines,” said Marshall.


The role of ToF sensors in mobile robots https://www.therobotreport.com/the-role-of-tof-sensors-in-mobile-robots/ https://www.therobotreport.com/the-role-of-tof-sensors-in-mobile-robots/#respond Tue, 23 Jan 2024 17:52:25 +0000 https://www.therobotreport.com/?p=568708 Time-of-flight or ToF sensors provide mobile robots with precise navigation, low-light performance, and high frame rates for a range of applications.

The post The role of ToF sensors in mobile robots appeared first on The Robot Report.


ToF sensors provide 3D information about the world around a mobile robot, feeding important data to the robot’s perception algorithms. | Credit: E-con Systems

In the ever-evolving world of robotics, the seamless integration of technologies promises to revolutionize how humans interact with machines. An example of transformative innovation, the emergence of time-of-flight or ToF sensors is crucial in enabling mobile robots to better perceive the world around them.

ToF sensors are similar to lidar in that both measure the travel time of emitted light to create depth maps. However, the key distinction is that ToF cameras produce depth images that can be processed faster and can be built into compact systems for a wide range of applications.

This maximizes the utility of ToF technology in robotics. It has the potential to benefit industries reliant on precise navigation and interaction.

Why mobile robots need 3D vision

Historically, RGB cameras were the primary sensor for industrial robots, capturing 2D images based on color information in a scene. These 2D cameras have been used for decades in industrial settings to guide robot arms in pick-and-pack applications.

Such 2D RGB cameras always require a camera-to-arm calibration sequence to map scene data to the robot’s world coordinate system. Without that calibration, a 2D camera cannot relate what it sees to real-world distances, making it unusable as a sensor for obstacle avoidance and guidance.
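To make the role of that calibration concrete, here is a small, generic example of the mapping it produces: a planar homography that converts an image pixel into robot work-plane coordinates. The matrix values are made up for illustration and are not tied to any particular camera or calibration tool.

```python
import numpy as np

def pixel_to_robot_xy(u, v, H):
    """Map an image pixel onto the robot's (known, flat) work plane.

    H is the 3x3 homography recovered during camera-to-arm calibration, which
    is exactly the mapping a 2D camera cannot provide on its own. Values here
    are illustrative, not from any particular vendor's calibration routine.
    """
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]          # normalize homogeneous coordinates -> (x, y) in meters

# A made-up homography for a camera looking straight down at a table:
H = np.array([[0.001, 0.0,   -0.32],
              [0.0,   0.001, -0.24],
              [0.0,   0.0,    1.0]])
print(pixel_to_robot_xy(640, 480, H))   # pixel (640, 480) -> approx (0.32, 0.24) m
```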

Autonomous mobile robots (AMRs) must accurately perceive the changing world around them to avoid obstacles and build a world map while remaining localized within that map. Time-of-flight sensors have been in existence since the late 1970s and have evolved to become one of the leading technologies for extracting depth data. It was natural to adopt ToF sensors to guide AMRs around their environments.

Lidar was adopted as one of the early types of ToF sensors to enable AMRs to sense the world around them. Lidar bounces a laser light pulse off of surfaces and measures the distance from the sensor to the surface.

However, the first lidar sensors could only perceive a slice of the world around the robot using the flight path of a single laser line. These lidar units were typically positioned between 4 and 12 in. above the ground, and they could only see things that broke through that plane of light.

The next generation of AMRs began to employ 3D stereo RGB cameras that provide depth information. These sensors use two stereo-mounted RGB cameras and a “light dot projector” that enables the camera array to accurately view the projected light on the scene in front of the camera.

Companies such as Photoneo and Intel RealSense were two of the early 3D RGB camera developers in this market. These cameras initially enabled industrial applications such as identifying and picking individual items from bins.

Until the advent of these sensors, bin picking was known as a “holy grail” application, one which the vision guidance community knew would be difficult to solve.

The camera landscape evolves

A salient feature of today’s ToF cameras is strong low-light performance achieved with eye-safe illumination. The 6 m (19.6 ft.) range in far mode facilitates optimal people and object detection, while the close-range mode excels at volume measurement and quality inspection.

The cameras return the data in the form of a “point cloud.” On-camera processing mitigates computational overhead, making these cameras potentially useful for applications like warehouse robots, service robots, robotic arms, autonomous guided vehicles (AGVs), people-counting systems, 3D face recognition for anti-spoofing, and patient care and monitoring.

Time-of-flight technology is significantly more affordable than other 3D-depth range-scanning technologies like structured-light camera/projector systems.

For instance, ToF sensors facilitate the autonomous movement of outdoor delivery robots by precisely measuring depth in real time. This versatile application of ToF cameras in robotics promises to serve industries reliant on precise navigation and interaction.

How ToF sensors take perception a step further

A fundamental difference between time-of-flight and RGB cameras is their ability to perceive depth. RGB cameras capture images based on color information, whereas ToF cameras measure the time taken for light to bounce off an object and return, yielding a depth value for every pixel.
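The underlying arithmetic is simple: distance is half the round-trip path of the emitted light. Many ToF cameras actually infer the round-trip time indirectly from the phase shift of modulated light, but the depth relation is the same. A minimal sketch with made-up timing values:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_to_depth(round_trip_s):
    """Convert per-pixel round-trip times (seconds) into a depth map (meters).

    The light travels out and back, so the distance to the surface is half
    the round-trip path.
    """
    return C * np.asarray(round_trip_s) / 2.0

# A 2x2 'sensor' with round-trip times of ~6.7 ns and ~13.3 ns:
times = np.array([[6.67e-9, 6.67e-9],
                  [1.33e-8, 1.33e-8]])
print(tof_to_depth(times))   # roughly 1 m and 2 m
```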

From these per-pixel depth values, ToF sensors generate detailed 3D maps of their surroundings, giving mobile robots an added dimension of depth perception.

Furthermore, stereo vision technology has also evolved. Using an IR pattern projector, it illuminates the scene and compares disparities of stereo images from two 2D sensors – ensuring superior low-light performance.

In comparison, ToF cameras use a sensor, a lighting unit, and a depth-processing unit. This allows AMRs to have full depth-perception capabilities out of the box without further calibration.

One key advantage of ToF cameras is that they extract 3D images at high frame rates, allowing rapid separation of foreground from background. They can also function in both bright and dark conditions through the use of active illumination.

In summary, compared with RGB cameras, ToF cameras can operate in low-light applications and without the need for calibration. ToF camera units can also be more affordable than stereo RGB cameras or most lidar units.

One downside for ToF cameras is that they must be used in isolation, as their emitters can confuse nearby cameras. ToF cameras also cannot be used in overly bright environments because the ambient light can wash out the emitted light source.

what is a tof camera illustration.

A ToF sensor measures depth and distance using the time of flight of emitted light. | Credit: E-con Systems

Applications of ToF sensors

ToF cameras are enabling multiple AMR and AGV applications in warehouses. These cameras give warehouse operations the depth-perception intelligence that lets robots see the world around them and make critical decisions with accuracy, convenience, and speed. Key functionalities include:

  • Localization: identifies the robot’s position by scanning the surroundings, creating a map, and matching the collected information against known data
  • Mapping: builds a map from the transit time of light reflected off target objects, combined with SLAM (simultaneous localization and mapping) algorithms
  • Navigation: moves the robot from Point A to Point B on a known map

With ToF technology, AMRs can understand their environment in 3D before deciding the path to be taken to avoid obstacles. 

Finally, there’s odometry: estimating the change in a mobile robot’s position over time by analyzing data from motion sensors such as wheel encoders and IMUs. ToF data can be fused with these sensors to improve the accuracy of AMRs.
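As a concrete example of the dead-reckoning step that such fusion corrects, here is the textbook differential-drive odometry update from wheel encoder increments; the wheel base and distances below are arbitrary illustration values.

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive odometry update from wheel encoder increments.

    d_left / d_right are the distances each wheel rolled since the last update
    (meters); wheel_base is the distance between the wheels. This is the
    dead-reckoning step that other sensors are fused with to correct drift.
    """
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Roll forward 10 cm with a slight right-wheel advantage (gentle left turn):
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = update_pose(*pose, d_left=0.009, d_right=0.011, wheel_base=0.4)
print(pose)
```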

About the author

Maharajan Veerabahu has more than two decades of experience in embedded software and product development, and he is a co-founder and vice president of product development services at e-con Systems, a prominent OEM camera product and design services company. Veerabahu is also a co-founder of VisAi Labs, a computer vision and AI R&D unit that provides vision AI-based solutions for their camera customers.

