Artificial Intelligence / Cognition Archives - The Robot Report
https://www.therobotreport.com/category/design-development/ai-cognition/

Boston Dynamics debuts electric version of Atlas humanoid robot
https://www.therobotreport.com/boston-dynamics-debuts-electric-version-of-atlas-humanoid-robot/ | April 17, 2024
Boston Dynamics has retired the hydraulic version of its Atlas and will begin testing an all-electric humanoid robot in the coming year.

Goodbye to the hydraulic version of Atlas and hello to the electric model designed for commercialization. That’s the message from Boston Dynamics Inc., which yesterday retired the older version of its humanoid robot after 15 years of development and today showed a preview of its successor.

“The next generation of the Atlas program builds on decades of research and furthers our commitment to delivering the most capable, useful mobile robots solving the toughest challenges in the industry today: with Spot, with Stretch, and now with Atlas,” said the company in a blog post. Spot is a quadruped used in facilities inspection and other tasks, and Stretch is designed to unload trucks.

Boston Dynamics began with humanoids by sawing one of its pneumatically powered quadrupeds in half back in 2009. By 2016, the Waltham, Mass.-based company showed that its robot could walk, open a door, and maintain its balance while being shoved by a person holding a hockey stick, all without a tether.

Roboticists continued to improve Atlas, giving it a smaller form factor and more sensors, training its artificial intelligence, and enabling it to do increasingly impressive feats. They ranged from parkour and dancing to taking tools through a mock construction site.

In fact, it was that demonstration of Atlas manipulating a plank, picking up a bag of tools, and taking it to a worker that earned Boston Dynamics an RBR50 Robotics Innovation Award. The company will be exhibiting at the RBR50 Showcase at the Robotics Summit & Expo on May 1 and 2.


Boston Dynamics evolves with the times

As capable as the YouTube darling was, the older version of Atlas still had limitations, both in range of motion and in terms of size and power usage. Boston Dynamics noted that it designed its legged robots to operate in unstructured environments, and it acknowledged that Atlas was initially a research and development project rather than a commercial product.

In the meantime, the company itself changed owners, from Google in 2013 to SoftBank in 2017 and most recently to Hyundai in 2020. Along with those changes came an increasing focus on robots such as Spot and Stretch serving industrial needs. To continue pure research, Hyundai founded the Boston Dynamics AI Institute in 2022.

“The AI Institute recently launched a new version of Spot with an API [application programming interface] designed for researchers,” said Robert Playter, CEO of Boston Dynamics. “We’re talking about how to jointly solve some big challenges — the diversity of manipulation tasks we need to do with this robot [Atlas] is huge, and AI is essential to enabling that generality.”

Playter told The Robot Report that Boston Dynamics needs results within two to three years, while the AI Institute has more of a five-year timeframe.

Robot lessons apply to fleets, new Atlas

“It takes a solid year from a clean sheet to a new robot,” said Playter. “We wanted to know that we could solve essential dexterous manipulation problems before releasing the product.”

Boston Dynamics learned numerous lessons from commercializing Spot and Stretch, he said. It has improved control policies, upgraded actuation, and minimized joint complexity. The new Atlas has three-fingered grippers.

The Orbit fleet management software, which initially applies to indoor deployments of Spot, could also help supervise Stretch and Atlas.

Atlas gets ready for mobile manipulation in industrial settings. Source: Boston Dynamics

“Everything we understood, from the time of launching Spot as a prototype to it being a reliable product deployed in fleets, is going into the new Atlas,” Playter said. “We’re confident AI and Orbit will help enhance behaviors. For instance, by minimizing slipping on surfaces at Anheuser-Busch, we proved that we can develop algorithms and make it reliable.”

“Now, 1,500 robots in our fleet have them running,” he added. “It’s essential for customers like Purina to monitor and manage fleets as a vehicle for collecting data. As we develop and download new capabilities, Orbit becomes a hub for an ecosystem of different robots.”

Safety and autonomy are basic building blocks

Boston Dynamics has considered safe collaboration in its development of the new Atlas. ASTM International is developing safety standards for legged robots.

“We recognized early on that Atlas is going to work in spaces that have people in them,” said Playter. “This sets the bar much higher than lidar with AMRs [autonomous mobile robots].”

“We started thinking about functionally safe 3D vision,” he recalled. “We started with Stretch inside a container, but ultimately, we want it going everywhere in a warehouse. Advanced, functionally safe, remote vision and onboard systems are essential to solving safety.”

While Spot and Atlas are often teleoperated, Playter said this is a necessary step toward greater levels of autonomy.

“Making the robots knowledgeable about different types of objects and how to grasp them, teleoperation is just a tool for providing examples and data to the robot,” he explained. “It’s not a useful way of building intuition, but it’s easier if you can operate robots at a higher and higher level. Like you don’t need to tell Spot where to plant its feet, you don’t want to tell Atlas where to grasp.”
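Teleoperation as a data source is essentially imitation learning: the operator's commands become training examples for a policy. The minimal sketch below shows that idea in its simplest form, behavior cloning with a linear model on synthetic data; the observation format, dimensions, and linear regression are illustrative assumptions, not Boston Dynamics' actual pipeline.

```python
# Illustrative sketch only: logging teleoperated demonstrations and fitting a simple
# behavior-cloning policy. The data format, dimensions, and linear model are
# assumptions for the example, not Boston Dynamics' actual pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Pretend teleoperation log: each row pairs a robot observation (e.g., joint angles,
# gripper pose) with the action the human operator commanded at that moment.
obs_dim, act_dim, n_demos = 12, 6, 500
observations = rng.normal(size=(n_demos, obs_dim))
true_mapping = rng.normal(size=(obs_dim, act_dim))
actions = observations @ true_mapping + 0.01 * rng.normal(size=(n_demos, act_dim))

# Behavior cloning in its simplest form: regress actions on observations.
# A real system would use a neural network and far richer observations.
weights, *_ = np.linalg.lstsq(observations, actions, rcond=None)

def policy(obs):
    """Predict an action for a new observation using the cloned policy."""
    return obs @ weights

print("mean action error:", float(np.abs(observations @ weights - actions).mean()))
```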

In the new video below, the previous version of Atlas handles automotive parts and real products weighing up to 25 lb. (11.3 kg).

Atlas ready for rivals in the humanoid race

Over the past two years, the number of humanoid robots in development has grown rapidly. The field now includes Agility Robotics' Digit, Tesla's Optimus, and Figure AI's Figure 01. In the past two weeks alone, Rainbow Robotics, Sanctuary AI, and Mentee Robotics have all made announcements.

Investment has also been flowing to humanoid companies, with 1X Technologies raising $100 million in January, Figure AI raising $675 million in February, and Accenture investing in Sanctuary AI in March.

Humanoid robots have advanced in parallel with generative AI, and Playter said he welcomes the competition.

“There were three seminal events: Boston Dynamics got acquired for $1 billion, interest in Tesla’s robot validated what we’ve done for a long time, and the emergence of new AI holds the promise of generalization of tasks,” he said. “They’ve inspired lots of new players, but having new tech isn’t all you need to have a commercial product. You need to focus on a use case, build a reliable machine, and manufacture it in a way to build a business. We want to avoid a ‘humanoid winter,’ so rollouts have to be real.”

Playter added that practical design and proper implementation of AI, rather than making robots more human-like, will help differentiate them. The new version of Atlas demonstrated that point in how it stood up in the video at the top of this article.

“It’s not talking to a robot that moves the needle, but whether you can build a robot that eventually does 500 tasks,” he said. “Anthropomorphism blows things out of perspective. We did not want a human-shaped head for Atlas. We want people to remember it’s a machine and that it can move in ways humans can’t.”

The financial stability of the businesses involved will also be relevant for commercial success, said Playter. 

“It takes sustained investment; these are expensive products to launch,” he noted. “Having products already out helps build momentum.”

Atlas is humanoid — to a point. Source: Boston Dynamics

When will we see the new robot in the wild?

Boston Dynamics will begin testing the all-electric version of Atlas with parent company Hyundai and select partners next year, said Playter.

“We’re beginning in their factory,” he told The Robot Report. “In addition to the target application of a lot of parts movement — a special kind of logistics in automotive production — I think that will evolve as the dexterity of the robots improves over time.”

“We see robots in the workplace as an evolution, a continuum from Spot to Atlas,” asserted Playter. “Each product in the series informs the launch of the next.”

“Industries will have to figure out how to adapt and incorporate humanoids into their facilities,” he said. “We’ll actually see robots in the wild in factories beginning next year. We want a diversity of tasks.”

Electric Sheep wins 2024 RBR50 Startup of the Year
https://www.therobotreport.com/electric-sheep-wins-2024-rbr50-startup-of-the-year/ | April 11, 2024
Electric Sheep has a novel business model and an agile development team that make it the first winner of the RBR50 Startup of the Year.

Electric Sheep is vertically integrating its field operations team with autonomous mowers. | Credit: Electric Sheep

This year, the annual RBR50 Robotics Innovation Awards added new categories: Application of the Year, Startup of the Year, and Robot of the Year. We received numerous submissions from startups innovating across a range of markets. The Robot Report's team chose autonomous landscaping company Electric Sheep Robotics as the inaugural RBR50 Startup of the Year.

The San Francisco-based company has a novel business plan that is immediately bringing in revenue while it takes its time to evolve the underlying technology. This is different from many robotics businesses, which simply sell or lease systems to integrators and end users.

“We are honored to be recognized by WTWH Media’s Robotics Group with this inaugural award. I want to also acknowledge our dedicated team at Electric Sheep that are passionate about creating the most advanced robotics that can change an often overlooked industry,” stated Nag Murty, co-founder and CEO of Electric Sheep. “We are doing things differently than other robotic companies by using AI and ML at a higher level for localization and high-level control. We are scaling physical agents across the country to care for our outdoor spaces.”

Founded in 2019, Electric Sheep has grown to over 100 employees, and it has raised more than $25 million in funding to date, according to Crunchbase.

You can also learn more about Murty’s entrepreneurial philosophy and Chief Technology Officer Michael Laskey’s design principles on a recent episode of The Robot Report Podcast.

Acquisitions add data for autonomy AI

Electric Sheep develops autonomous robots for outdoor maintenance. Its flagship robot is an autonomous mower backed by the company’s ES1 foundation model.

Based on recent advances in generative AI, ES1 is a learned world model that enables reasoning and planning for the company's robots. It powers both the RAM robot for mowing and now Verdie for edging, trimming lawns and bushes, and blowing leaves.
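A learned world model is typically used for planning by rolling candidate action sequences through the model and scoring the predicted outcomes. The sketch below shows that pattern, random-shooting model-predictive control, in a minimal form; the dynamics, cost function, and dimensions are placeholders, not details of ES1.

```python
# Illustrative sketch of planning with a learned world model (random-shooting MPC).
# The dynamics model, cost function, and dimensions are placeholders; this is not
# Electric Sheep's ES1, only the general pattern such models enable.
import numpy as np

rng = np.random.default_rng(1)

def world_model(state, action):
    """Stand-in for a learned next-state predictor."""
    return 0.95 * state + 0.1 * action

def cost(state):
    """Stand-in task cost, e.g., distance from the target mowing line."""
    return float(np.sum(state ** 2))

def plan(state, horizon=10, n_candidates=256, act_dim=2):
    """Sample candidate action sequences, roll them through the model, keep the best."""
    best_seq, best_cost = None, np.inf
    for _ in range(n_candidates):
        seq = rng.uniform(-1.0, 1.0, size=(horizon, act_dim))
        s, total = state.copy(), 0.0
        for a in seq:
            s = world_model(s, a)
            total += cost(s)
        if total < best_cost:
            best_seq, best_cost = seq, total
    return best_seq[0]  # execute only the first action, then replan

print("first planned action:", plan(np.array([1.0, -0.5])))
```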

In addition, Electric Sheep acquired four landscaping companies last year and said that this is a key part of its long-term plan. This strategy isn’t just about revenue. The businesses it acquires can also use ES1 and provide crucial data to make the model more effective.

This information can help improve Electric Sheep’s operations, enabling its robots to start working as soon as they arrive at a job site. 

Since taking this two-pronged approach to development and business, the company reported that its sales have grown eightfold. Electric Sheep has set itself apart from other startups by keeping revenue coming in from the start and by securing a unique source of operational data about its business.

Meet Electric Sheep at the Robotics Summit & Expo 

This year's RBR50 award winners will be celebrated at the Robotics Summit & Expo, which will be on May 1 and 2 at the Boston Convention and Exhibition Center. Electric Sheep will demonstrate Verdie, its newest robot powered by ES1, at the RBR50 showcase on the expo floor.

Attendees at the 2024 Robotics Summit and Expo at the Boston Convention and Exhibition Center will have an opportunity to meet members of Electric Sheep’s executive team. Co-founder and CEO Nag Murty will present a session titled “Startup Survival Guide to Lean Times” at 2:30 p.m. EDT on Thursday, May 2.

Murty will be joined by Oliver Mitchell, partner at ff Venture Capital; Fiona O'Donnell McCarthy, principal at True Ventures; and Steve Crowe, executive editor of robotics at WTWH Media. The panel will share tips from experienced investors and robotics companies on what they're looking for, and attendees will learn how organizations can navigate the challenging path to commercialization.

In addition, tickets are available for the first RBR50 Robotics Innovation Awards Gala, which will be at the end of Day 1 of the event. The Robotics Summit & Expo will be the biggest yet, with keynotes and sessions from leading companies, more than 200 exhibitors, up to 5,000 attendees, a Women in Robotics Breakfast, and a Robotics Engineering Career Fair.

Co-located events include DeviceTalks Boston, which focuses on medical devices, and the inaugural Digital Transformation Forum, which will focus on manufacturing. Registration is now open for the Robotics Summit.


Autopicker wins 2024 RBR50 Application of the Year for Brightpick
https://www.therobotreport.com/autopicker-wins-2024-rbr50-application-of-the-year-for-brightpick/ | April 10, 2024
Autopicker combines AI, vision-guided picking, and a mobile base to be the first winner of the RBR50 Application of the Year.

Two Autopicker mobile manipulators in a warehouse aisle. Source: Brightpick

This year, the annual RBR50 Robotics Innovation Awards added new categories: Application of the Year, Startup of the Year, and Robot of the Year. We received numerous submissions, but the Autopicker system from Brightpick stood out for automating both mobile manipulation and each picking, i.e., picking individual units.

Other robots combining mobility with manipulation have come and gone, from Fetch and Freight to Swift, in part because getting to commercially viable levels of reliability has been challenging. Not only has Autopicker added newer artificial intelligence to the mix, but it has also been deployed in existing customer warehouses.

“On the AI side, this was not possible five to six years ago,” Jan Zizka, co-founder and CEO of Brightpick, told The Robot Report. “Serious breakthroughs enable machine learning to generalize to unseen items.”

Autopicker learns with each pick

Autopicker combines a mobile base, a robotic arm, machine vision, and AI for e-commerce order fulfillment. The system reduces the need for warehouse associates to travel with carts, thanks to its patented design, which enables it to pick items from standard shelving and place them in either of two totes.

Brightpick said Autopicker can pick groceries, cosmetics, electronics, pharmaceuticals, apparel, and more with 99.9% accuracy. Its AI algorithms have been trained on more than 500 million picks to date, and they are improving with each pick, added the company.

Announced in February 2023, the system also supports pallet picking, replenishment, dynamic slotting, buffering, and dispatch. It can store up to 50,000 SKUs, said Brightpick. It also offers a goods-to-person option for heavy or hard-to-pick items, and Autopicker can raise its bins to waist height for ergonomic picking.

In the past year, customers such as Netrush and Rohlik Group began deploying the company’s latest system. Autopicker is available for direct purchase or through a robotics-as-a-service (RaaS) model.

See Brightpick at the Robotics Summit & Expo 

Cincinnati-based Brightpick is a unit of Bratislava, Slovakia-based machine vision provider Photoneo s.r.o. The company said its systems can “enable warehouses of any size to fully automate order picking, consolidation, dispatch, and stock replenishment.”

Brightpick, which has more than 200 employees, claimed that its robots take only weeks to deploy and can reduce labor assigned to picking by 98% and picking costs by half. In January 2023, the company raised $19 million in Series B funding for its U.S. expansion, and it said demand for Autopicker has been strong.

This year’s RBR50 award winners will be celebrated at the Robotics Summit & Expo, which will be on May 1 and 2 at the Boston Convention and Exhibition Center. Brightpick will be part of the RBR50 showcase on the expo floor.

In addition, tickets are available for the first RBR50 Robotics Innovation Awards Gala, which will be at the end of Day 1 of the event. The Robotics Summit & Expo will be the biggest yet, with keynotes and sessions from leading companies, more than 200 exhibitors, up to 5,000 attendees, a Women in Robotics Breakfast, and a Robotics Engineering Career Fair.

Co-located events include DeviceTalks Boston, which focuses on medical devices, and the inaugural Digital Transformation Forum, which will focus on manufacturing. Registration is now open for the Robotics Summit.


AMD releases Versal Gen 2 to improve support for embedded AI, edge processing
https://www.therobotreport.com/amd-releases-versal-gen-2-to-support-ai-edge-processing/ | April 9, 2024
The first devices in the AMD Versal Gen 2 series feature high-efficiency AI Engines, and Subaru is one of the first customers.

The AMD Versal AI Edge and Prime Gen 2 are next-gen SoCs. Source: Advanced Micro Devices

To enable more artificial intelligence on edge devices such as robots, hardware vendors are adding to their processor portfolios. Advanced Micro Devices Inc. today announced the expansion of its adaptive system on chip, or SoC, line with the new AMD Versal AI Edge Series Gen 2 and Versal Prime Series Gen 2.

“The demand for AI-enabled embedded applications is exploding and driving the need for solutions that bring together multiple compute engines on a single chip for the most efficient end-to-end acceleration within the power and area constraints of embedded systems,” stated Salil Raje, senior vice president and general manager of the Adaptive and Embedded Computing Group at AMD.

“Based on over 40 years of adaptive computing leadership in high-security, high-reliability, long-lifecycle, and safety-critical applications, these latest-generation Versal devices offer high compute efficiency and performance on a single architecture that scales from the low end to high end,” he added.

For more than 50 years, AMD said it has been a leading innovator in high-performance computing (HPC), graphics, and visualization technologies. The Santa Clara, Calif.-based company noted that billions of people, Fortune 500 businesses, and scientific research institutions worldwide rely on its technology daily.

Versal Gen 2 addresses three phases of accelerated AI

Advanced Micro Devices said the Gen 2 systems put preprocessing, AI inference, and postprocessing on a single device to deliver accelerated AI. This provides the optimal mix of accelerated AI to meet the complex processing needs of real-world embedded systems, it asserted.

  • Preprocessing: The new systems include FPGA (field-programmable gate array) logic fabric for real-time preprocessing; flexible connections to a wide range of sensors; and implementation of high-throughput, low-latency data-processing pipelines.
  • AI inference: AMD said it provides an array of vector processors in the form of next-generation AI Engines for efficient inference.
  • Postprocessing: Arm CPU cores provide the power needed for complex decision-making and control for safety-critical applications, said AMD.

“This single-chip intelligence can eliminate the need to build multi-chip processing solutions, resulting in smaller, more efficient embedded AI systems with the potential for shorter time to market,” the company said.
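To make the division of labor concrete, the minimal sketch below mirrors the three-stage flow as ordinary Python functions. On Versal Gen 2 hardware the stages would map to FPGA fabric, AI Engines, and Arm cores respectively; the normalization, linear "model," and labels here are stand-ins rather than anything from AMD's toolchain.

```python
# Conceptual sketch of the three-stage flow described above (preprocess -> infer ->
# postprocess). On a Versal Gen 2 device these stages would map to FPGA fabric, AI
# Engines, and Arm cores; here they are plain Python functions, and the normalization,
# linear "model," and labels are stand-ins rather than anything from AMD's toolchain.
import numpy as np

def preprocess(raw_frame):
    """Sensor conditioning of the kind FPGA fabric handles (here, simple normalization)."""
    frame = raw_frame.astype(np.float32)
    return (frame - frame.mean()) / (frame.std() + 1e-6)

def infer(features):
    """Stand-in for an AI Engine inference call; here just a fixed linear layer."""
    weights = np.ones((features.size, 3)) / features.size
    return features.reshape(1, -1) @ weights

def postprocess(scores):
    """Decision logic a CPU core would run, e.g., for a safety-related action."""
    labels = ["clear", "caution", "stop"]
    return labels[int(np.argmax(scores))]

frame = np.random.default_rng(2).integers(0, 255, size=(8, 8))
print(postprocess(infer(preprocess(frame))))
```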


AMD builds to maximize power and compute

AMD said its latest systems offer up to 10x more scalar compute compared with the first generation, so the devices can more efficiently handle sensor processing and complex scalar workloads. The Versal Prime Gen 2 devices include new hard IP for high-throughput video processing, including up to 8K multi-channel workflows.

This makes the scalable portfolio suitable for applications such as ultra-high-definition (UHD) video streaming and recording, industrial PCs, and flight computers, according to the company.

In addition, the new SoCs include new AI Engines that AMD claimed will deliver three times the TOPS (trillions of operations per second) per watt of the first-generation Versal AI Edge Series devices.

“Balancing performance, power, [and] area, together with advanced functional safety and security, Versal Series Gen 2 devices deliver new capabilities and features,” said AMD. It added that they “enable the design of high-performance, edge-optimized products for the automotive, aerospace and defense, industrial, vision, healthcare, broadcast, and pro AV [professional audio-visual] markets.”

“Single-chip intelligence for embedded systems will enable pervasive AI, including robotics … smart city, cloud and AI, and the digital home,” said Manuel Uhm, director for Versal marketing at AMD, in a press briefing. “All will need to be accelerated.”

The Versal Prime Gen 2 is designed for high-throughput applications such as video processing. Source: AMD

Versal powers Subaru’s ADAS vision system

Subaru Corp. is using AMD’s adaptive SoC technology in current vehicles equipped with its EyeSight advanced driver-assistance system (ADAS). EyeSight is integrated into certain car models to enable advanced safety features including adaptive cruise control, lane-keep assist, and pre-collision braking.

“Subaru has selected Versal AI Edge Series Gen 2 to deliver the next generation of automotive AI performance and safety for future EyeSight-equipped vehicles,” said Satoshi Katahira. He is general manager of the Advanced Integration System Department and ADAS Development Department, Engineering Division, at Subaru.

“Versal AI Edge Gen 2 devices are designed to provide the AI inference performance, ultra-low latency, and functional safety capabilities required to put cutting-edge AI-based safety features in the hands of drivers,” he added.

Vivado and Vitis part of developer toolkits

AMD said its Vivado Design Suite tools and libraries can help boost productivity and streamline hardware design cycles, offering fast compile times and enhanced-quality results. The company said the Vitis Unified Software Platform “enables embedded software, signal processing, and AI design development at users’ preferred levels of abstraction, with no FPGA experience needed.”

Earlier this year, AMD released the Embedded+ architecture for accelerated edge AI, as well as the Spartan UltraScale+ FPGA family for edge processing.

Early-access documentation for Versal Series Gen 2 is now available, along with first-generation Versal evaluation kits and design tools. AMD said it expects Gen 2 silicon samples to be available in the first half of 2025, followed by evaluation kits and system-on-modules samples in mid-2025, and production silicon in late 2025.

NEURA and Omron Robotics partner to offer cognitive factory automation
https://www.therobotreport.com/neura-omron-robotics-partner-offer-cognitive-factory-automation/ | April 4, 2024
NEURA Robotics and Omron Robotics and Safety Technologies say their strategic alliance will make cognitive systems 'plug and play.'

NEURA has developed cognitive robots in a variety of form factors. Source: NEURA Robotics

Talk about combining robotics and artificial intelligence is all the rage, but some convergence is already maturing. NEURA Robotics GmbH and Omron Robotics and Safety Technologies Inc. today announced a strategic partnership to introduce “cognitive robotics” into manufacturing.

“By pooling our sensor and AI technologies and expertise into an ultimate platform approach, we will significantly shape the future of the manufacturing industry and set new standards,” stated David Reger, founder and CEO of NEURA Robotics.

Reger founded the company in 2019 with the intention of combining sensors and AI with robotics components for a platform for app development similar to that of smartphones. The “NEURAverse” offers flexibility and cost efficiency in automation, according to the company.

“Unlike traditional industrial robots, cognitive robots have the ability to learn from their environment, make decisions autonomously, and adapt to dynamic production scenarios,” said Metzingen, Germany-based NEURA. “This opens new application possibilities including intricate assembly tasks, detailed quality inspections, and adaptive material handling processes.”

Omron has sensor, channel expertise

“We see NEURA’s cognitive technologies as a compelling growth opportunity for industrial robotics,” added Olivier Welker, president and CEO of Omron Robotics and Safety Technologies. “By combining NEURA’s innovative solutions with Omron’s global reach and automation portfolio, we will provide customers new ways to increase safety, productivity, and flexibility in their operations.”

Pleasanton, Calif.-based Omron Robotics is a subsidiary of OMRON Corp. focusing on automation and safety sensing. It designs and manufactures industrial, collaborative, and mobile robots for various industries.

“We’ve known Omron for quite some time, and even before I started NEURA, we had talked about collaborating,” Reger told The Robot Report. “They’ve tested our products, and we’ve worked together on how to benefit both sides.”

“We have the cognitive platform, and they’re one of the biggest sensor, controllers, and safety systems providers,” he added. “This collaboration will integrate our cognitive abilities and NEURAverse with their sensors for a plug-and-play solution, which everyone is working toward.”

Omron Robotics’ Olivier Welker and NEURA’s David Reger celebrate their partnership. Source: NEURA

Collaboration has ‘no limits’

When asked whether NEURA and Omron Robotics’ partnership is mainly focused on market access, Reger replied, “It’s not just the sales channel … there are no really big limits. From both sides, there will be add-ons.”

Rather than see each other as competitors, NEURA and Omron Robotics are working to make robots easier to use, he explained.

“As a billion-dollar company, it could have told our startup what it wanted, but Omron is different,” said Reger. “I felt we got a lot of respect from Olivier and everyone in that organization. It won’t be a one-sided thing; it will be just ‘Let’s help each other do something great.’ That’s what we’re feeling every day since we’ve been working together. Now we can start talking about it.”

NEURA has also been looking at mobile manipulation and humanoid robots, but adding capabilities to industrial automation is the “low-hanging fruit, where small changes can have a huge effect,” said Reger. “A lot of things for humanoids have not yet been solved.”

“I would love to just work on household robots, but the best way to get there is to use the synergy between industrial robotics and the household market,” he noted. “Our MAiRA, for example, is a cognitive robot able to scan an environment and from an idle state pick any known or unknown objects.”

MAiRA cognitive robot on MAV mobile base. Source: NEURA Robotics

Ease of use drives NEURA strategy

NEURA and Omron Robotics promise to make robots easier to use, helping overall adoption, Reger said.

“A big warehouse company out of the U.S. is claiming that it’s already using more than 1 million robots, but at the same time, I’m sure they’d love to use many more robots,” he said. “It’s also in the transformation from a niche market into a mass market. We see that’s currently only possible if you somehow control the environment.”

“It’s not just putting all the sensors inside the robot, which we were first to do, and saying, ‘OK, now we’re able to interact with a human and also pick objects,'” said Reger. “Imagine there are external sensors, but how do you calibrate them? To make everything plug and play, you need new interfaces, which means collaboration with big players like Omron that provide a lot of sensors for the automation market.”

NEURA has developed its own sensors and explored the balance of putting processing in the cloud versus the edge. To make its platform as popular with developers as that of Apple, however, the company needs the support of partners like Omron, he said.

Reger also mentioned NEURA’s partnership with Kawasaki, announced last year, in which Kawasaki offers the LARA CL series cobot with its portfolio. “Both collaborations are incredibly important for NEURA and will soon make sense to everyone,” he said.

NEURA to be at Robotics Summit & Expo

Reger will be presenting a session on “Developing Cognitive Robotics Systems” at 2:45 p.m. EDT on Wednesday, May 1, Day 1 of the Robotics Summit & Expo. The event will be at the Boston Convention and Exhibition Center, and registration is now open.

“I’ll be talking about making robots cognitive to enable AI to be useful to humanity instead of competing with us,” he said. “AI is making great steps, but if you look at what it’s doing, like drawing pictures or writing stories — these are things that I’d love to do but don’t have the time for. But if I ask, let’s say, AI to take out the garbage or show it a picture of garbage, it can tell me how to do it, but it’s simply not able to do something about it yet.”

NEURA is watching humanoid development but is focusing on integrating cognitive robotics with sensing and wearables as it expands in the U.S., said Reger. The company is planning for facilities in Detroit, Boston, and elsewhere, and it is looking for leadership team members as well as application developers and engineers.

“We don’t just want a sales office, but also production in the U.S.,” he said. “We have 220 people in Germany — I just welcomed 15 new people who joined NEURA — and are starting to build our U.S. team. In the past several months, we’ve gone with only European and American investors, and we’re looking at the Japanese market. The U.S. is now open to innovation, and it’s an exciting time for us to come.”


Top 10 robotics news stories of March 2024
https://www.therobotreport.com/top-10-robotic-stories-of-march-2024/ | April 1, 2024
From events like MODEX and GTC to new product launches, there was no shortage of robotics news to cover in March 2024.

March 2024 was a non-stop month for the robotics industry. From events such as MODEX and GTC to exciting new deployments and product launches, there was no shortage of news to cover. 

Here are the top 10 most popular stories on The Robot Report this past month. Subscribe to The Robot Report Newsletter or listen to The Robot Report Podcast to stay updated on the latest technology developments.


10. Robotics Engineering Career Fair to connect candidates, employers at Robotics Summit

The career fair will draw from the general robotics and artificial intelligence community, as well as from attendees at the Robotics Summit & Expo. Past co-located career fairs have drawn more than 800 candidates, and MassRobotics said it expects even more people at the Boston Convention and Exhibition Center this year. Read More


9. SMC adds grippers for cobots from Universal Robots

SMC recently introduced a series of electric grippers designed to be used with collaborative robot arms from Universal Robots. Available in basic and longitudinal types, SMC said the LEHR series can be adapted to different industrial environments like narrow spaces. Read More


8. Anyware Robotics announces new add-on for Pixmo unloading robots

Anyware Robotics announced in March 2024 an add-on for its Pixmo robot for truck and container unloading. The patent-pending accessory includes a vertical lift with a conveyor belt that is attached to Pixmo between the robot and the boxes to be unloaded. Read More


7. Accenture invests in humanoid maker Sanctuary AI in March 2024

In its Technology Vision 2024 report, Accenture said 95% of the executives it surveyed agreed that “making technology more human will massively expand the opportunities of every industry.” Well, Accenture put its money where its mouth is. Accenture Ventures announced a strategic investment in Sanctuary AI, one of the companies developing humanoid robots. Read More


6. Cambrian Robotics obtains seed funding to provide vision for complex tasks

Machine vision startup Cambrian Robotics Ltd. has raised $3.5 million in seed+ funding. The company said it plans to use the investment to continue developing its AI platform to enable robot arms “to surpass human capabilities in complex vision-based tasks across a variety of industries.” Read More


5. Mobile Industrial Robots launches MiR1200 autonomous pallet jack

Autonomous mobile robots (AMRs) are among the systems benefitting from the latest advances in AI. Mobile Industrial Robots at LogiMAT in March 2024 launched the MiR1200 Pallet Jack, which it said uses 3D vision and AI to identify pallets for pickup and delivery “with unprecedented precision.” Read More


4. Reshape Automation aims to reduce barriers of robotics adoption

Companies in North America bought 31,159 robots in 2023. That's a 30% decrease from 2022. And that's not sitting well with robotics industry veteran Juan Aparicio. After a decade at Siemens and stops at Ready Robotics and Rapid Robotics, Aparicio hopes his new startup, Reshape Automation, can chip away at this problem. Read More


3. Mercedes-Benz testing Apollo humanoid

Apptronik announced that leading automotive brand Mercedes-Benz is testing its Apollo humanoid robot. As part of the agreement, Apptronik and Mercedes-Benz will collaborate on identifying applications for Apollo in automotive settings. Read More


2. NVIDIA announces new robotics products at GTC 2024

The NVIDIA GTC 2024 keynote kicked off like a rock concert in San Jose, Calif. More than 15,000 attendees filled the SAP Center in anticipation of CEO Jensen Huang's annual presentation of the latest product news from NVIDIA. He discussed the new Blackwell platform, improvements in simulation and AI, and all the humanoid robot developers using the company's technology. Read More


1. Schneider Electric unveils new Lexium cobots at MODEX 2024

In Atlanta, Schneider Electric announced the release of two new collaborative robots: the Lexium RL 3 and RL 12, as well as the Lexium RL 18 model coming later this year. From single-axis machines to high-performance, multi-axis cobots, the Lexium line enables high-speed motion and control of up to 130 axes from one processor, said the company. It added that this enables precise positioning to help solve manufacturers' production, flexibility, and sustainability challenges. Read More

 

Team and TAM: the keys to investing in robotics
https://www.therobotreport.com/team-and-tam-the-keys-to-investing-in-robotics/ | March 29, 2024
Jamie Lee, managing partner at Tamarack Global, is our guest this week to discuss the recent investment in Figure AI.

Our featured guest on the show this week is Jamie Lee, managing partner at Tamarack Global. Tamarack Global emerged on our radar last month as one of the investors who participated in the recent Series B funding round for Figure AI.

After speaking with Lee while researching those news stories, we invited him onto the podcast to share his investment thesis and the reasons Tamarack is so bullish about humanoids, Figure AI, and especially founder and CEO Brett Adcock.

Tamarack's investment philosophy centers on backing strong leaders who hire strong teams and build solutions for very large markets. You'll also learn about Jamie's pragmatic approach to evaluating proposals and some of the danger signals he looks for when weighing a potential investment.

News from the week

Viam raised $45 million in Series B funding

Viam has been quiet since all the news last year, but it is building a modular, interoperable, and open-source software platform that works across all hardware and any fleet of machines. Viam stated that the funding will enable it to accelerate partnerships, drive commercial innovation, and further develop its platform.

Accenture announced an investment in Sanctuary AI

Sanctuary is building humanoids with embodied intelligence, and it has always focused tightly on hand-eye coordination and manipulation over the bipedal walking aspects of humanoid robots. The investment in Sanctuary is the latest move by Accenture to build out a robotics strategy.

In January 2024, Accenture and Mujin created a joint venture to help bring robotics to the manufacturing and logistics industries. Called Accenture Alpha Automation, the new venture is owned 70% by Accenture and 30% by Mujin. It combines Mujin's industrial robotics expertise with Accenture's digital engineering and manufacturing service, Industry X.

Sanctuary has published a series of videos of its robots “doing stuff” on YouTube. These videos illustrate the development path of the two-armed humanoid as well as the AI behind the robots’ decision-making.

NYC takes steps to allow robotaxis

New York took its first steps towards allowing robotaxis this week and announced new safety requirements and permitting guidelines for companies looking to test their self-driving cars on public roads. 

Even with the city’s newfound interest in testing, autonomous vehicle (AV) commercialization in New York is difficult. It is one of the hardest cities for AVs to navigate due to its pedestrian-filled streets, unpredictable vehicle traffic, and sensor-disrupting bright lights.

Separate fact from fiction about AI in the warehouse at the Robotics Summit
https://www.therobotreport.com/separate-fact-from-fiction-about-ai-in-the-warehouse-robotics-summit/ | March 28, 2024
AI in the warehouse could be a game-changer, but a Locus 3PL expert will help Robotics Summit attendees see through the hype.

3PL expert Sean Pineau will share his insights into automation and AI in the warehouse. Source: Locus Robotics

Artificial intelligence promises to revolutionize robotics and industries including supply chain and logistics. For all of the hype around generative AI, robotics developers, integrators, and warehouse operators need to separate the facts from fiction. At the 2024 Robotics Summit & Expo, Locus Robotics will offer some help in demystifying AI in the warehouse.

Sean Pineau, head of third-party logistics (3PL) segments at Locus Robotics, will present a session on “AI in the Warehouse: What You Really Need to Know” at 1:45 p.m. ET on Wednesday, May 1. He will discuss the considerations and potential benefits and impacts of implementing AI in the warehouse.

Pineau will also explain what “embodied AI” is, what is and is not AI, and how warehouse managers can optimize their operations with AI and robotics.

Sean Pineau, Locus Robotics

Speaker to discuss robots and AI in the warehouse

Pineau has a decade’s experience in leadership roles in the materials handling industry. He said his time at Dematic and Crown Equipment Corp. provided a deep understanding of automation.

In 2021, Pineau became an account executive focusing on the retail vertical market at Locus Robotics. The Wilmington, Mass.-based company is a leading provider of autonomous mobile robots (AMRs).

In recognition of his results-driven approach, relentless dedication, and strategic acumen, Locus recently appointed Pineau as head of 3PL segments.

About the Robotics Summit & Expo

The 2024 Robotics Summit & Expo will be the largest ever, according to WTWH Media, which also produces Mobile Robot Guide and The Robot Report. The event will be at the Boston Convention and Exhibition Center on May 1 and 2.

It will include up to 5,000 attendees, more than 200 exhibitors, various networking opportunities, a Women in Robotics breakfast, a career fair, an engineering theater, a startup showcase, and more!

New to the summit is the RBR50 Robotics Innovation Awards Gala. It will include a cocktail hour, a plated dinner, photo opportunities, and the chance to hear from the Robot of the Year, Startup of the Year, and Application of the Year winners.

Each RBR50 winner will receive two complimentary tickets to the Robotics Summit and RBR50 gala. A limited number of tickets is available to attendees, but they’re selling fast!

The Robotics Summit will be co-located with DeviceTalks, an event focused on medical devices, and the inaugural Digital Transformation Forum. Registration is now open for the event.


Viam brings in $45M to accelerate enterprise partnerships
https://www.therobotreport.com/viam-brings-in-45m-to-accelerate-enterprise-partnerships/ | March 27, 2024
Viam says the funding will enable it to accelerate partnerships, drive commercial innovation, and further develop its platform.

Viam is a tool for developing software that comes with the cloud services necessary to prototype and build robots quickly. | Source: Viam

Viam, which offers a software platform for smart machines, yesterday announced that it has raised $45 million in Series B funding. This latest round brings the company’s total funding to date to $87 million.

Eliot Horowitz, co-founder and former chief technology officer of MongoDB, founded Viam in 2020. The New York-based company said the latest investment will enable it to accelerate enterprise partnerships, drive commercial innovation, and further develop its open-source platform.

“This investment affirms Viam’s commitment to innovation and strengthens our vision to empower developers with intuitive, powerful, and flexible tools that help transform the way software powers hardware,” stated Eliot Horowitz, founder and CEO, Viam. “Whether you’re in IoT [Internet of Things], robotics, smart home, or industrial automation, we’re empowering the next generation of startups, developers, and enterprises to move quickly and build better.”


Viam provides platform for scalable development

Despite recent advances in artificial intelligence and machine learning and the prevalence of cloud and edge computing, software made to manage hardware has only made moderate gains, according to Viam.

This kind of software has been stymied by proprietary systems specific to individual machines, sensors, and equipment, the company asserted. This has led to complex constraints that frustrate developers, impede growth, and stifle progress, it said. 

Horowitz has experience in enterprise cloud computing from MongoDB, which is a source-available, cross-platform, secure, high-availability cloud database solution. That company said it is integrated into a number of enterprise applications used every day by millions of people.

Horowitz said he established Viam with the intent of establishing a similar platform for robotics development. He foresaw the need for software based on safe and performant cloud-computing principles as robotics progressed from point solutions to connected swarms of mobile robots and cloud-monitored systems.

Viam is a modular, interoperable, and open-source software platform that works across all hardware and any fleet of machines. In addition, its open architecture can remove costly and complex barriers to working with physical devices, the company claimed.
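One way to picture such a hardware-agnostic platform is a shared component interface that vendor-specific drivers implement underneath, so application code never changes when the hardware does. The sketch below is a hypothetical illustration of that pattern, not Viam's actual SDK or API.

```python
# Hypothetical sketch of the hardware-abstraction idea behind platforms like Viam:
# application code targets a common component interface, and vendor-specific drivers
# plug in underneath. This is NOT Viam's actual SDK, just the general pattern.
from abc import ABC, abstractmethod

class Motor(ABC):
    """Minimal motor interface an application can rely on across hardware."""

    @abstractmethod
    def set_power(self, power: float) -> None:
        ...

class SimulatedMotor(Motor):
    def set_power(self, power: float) -> None:
        print(f"[sim] motor power set to {power:.2f}")

class VendorXMotor(Motor):
    """A vendor-specific driver would translate the call into its own protocol."""

    def set_power(self, power: float) -> None:
        print(f"[vendor-x] sending fieldbus command for power {power:.2f}")

def drive_forward(motor: Motor) -> None:
    """Application code stays the same no matter which driver is configured."""
    motor.set_power(0.5)

drive_forward(SimulatedMotor())
drive_forward(VendorXMotor())
```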

All of these features can speed up developer velocity and democratize access to open data. Viam added that this data can be used to inform AI and accelerate innovation in critical sectors such as industrial manufacturing, energy, and climate. 

 

Startup to invest in partnerships

Viam’s platform became generally available in May 2023. Since then, the company said it has been working with global enterprises and startups of all sizes to substantially accelerate time to market, decrease risks, increase developer velocity, improve operational efficiency, and craft modern end-user experiences.

Viam said it can help improve customer satisfaction and deliver increased revenue. Its Series B round included participation from previous investors Union Square Ventures and Battery Ventures. 

“Viam’s open architecture represents a paradigm shift that will bring the promises of robotics to the devices we use every day,” said Albert Wenger, partner at Union Square Ventures, in a release. “We’re thrilled to continue partnering with Viam in this exciting next chapter.”

This funding followed Viam’s announcement that it is working with the Whale and Vessel Safety Taskforce (WAVS). The partners will establish an open-source data-collection program and AI system for North Atlantic Right Whale conservation efforts.

They said the project showcases how the platform can be used for open data to drive collaboration and transparency. It will also demonstrate how Viam brings AI and actuation to the edge.

In addition, the company works with industrial, automation, and innovation teams to keep machines running smoothly on the edge. Viam said it’s working to enable device-to-cloud data pipelines to help manufacturers with real-time data monitoring, predictive maintenance, and remote diagnostics. 
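A device-to-cloud pipeline of this kind generally boils down to reading sensors on the device, applying a simple health check, and posting the result to a cloud endpoint. The sketch below shows that loop using only Python's standard library; the endpoint URL, payload shape, and anomaly threshold are hypothetical, not Viam's implementation.

```python
# Illustrative sketch of a device-to-cloud telemetry loop with a naive
# predictive-maintenance check, using only the standard library. The endpoint URL,
# payload shape, and threshold are hypothetical; this is not Viam's pipeline.
import json
import statistics
import urllib.request

CLOUD_ENDPOINT = "https://example.com/telemetry"  # placeholder URL

def read_motor_temperature():
    """Stand-in for an on-device sensor read."""
    return 61.5

def is_anomalous(history, reading, z_threshold=3.0):
    """Flag a reading far outside the recent history (very naive health check)."""
    if len(history) < 10:
        return False
    mean, stdev = statistics.mean(history), statistics.pstdev(history)
    return stdev > 0 and abs(reading - mean) / stdev > z_threshold

def push_to_cloud(payload):
    """POST a JSON payload to the (placeholder) cloud endpoint."""
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # real code would add retries and error handling

history = []
reading = read_motor_temperature()
payload = {"sensor": "motor_temp", "value": reading,
           "alert": is_anomalous(history, reading)}
history.append(reading)
# push_to_cloud(payload)  # disabled here because the endpoint is a placeholder
print(payload)
```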

GTC 2024 and R-24 recap
https://www.therobotreport.com/gtc-2024-and-r-24-recap/ | March 25, 2024
In this episode of our podcast, our editorial team reviews its attendance at R-24 in Denmark and NVIDIA GTC 2024.

The Robot Report editorial director Eugene Demaitre recently returned from the R-24 international robotics event in Odense, Denmark. From this trip, he immediately headed out to San Jose, Calif., to attend NVIDIA GTC 2024 with senior editor Mike Oitzman.

In this episode, Gene and Mike talk about what Gene saw and learned during his latest visit to Odense. From there, the co-hosts discuss their experiences at the GTC event, and all of the interesting sessions on artificial intelligence and robotics, NVIDIA’s product announcements for robotics, and the demonstrations by vendors that exhibited on the busy show floor.

R-24: Robots, Automation, and Drones

  • Odense Robotics is one of the largest robotics clusters in the world, with 350 members across Denmark, about half of which are in the Odense area.
  • It employs about 18,000 people, with plans to double that over the next decade. Local leaders attributed that to a culture of collaboration.
  • Among the interesting things the international delegations saw around R-24 was Odense Port, which is now building giant wind turbines in addition to maintaining container ships.
  • They also visited the drone test center at the Hans Christian Andersen Airport; the Danish Technological Institute, which hosts the Odense Robotics Startup Fund; and the Maersk-McKinney Moller Institute at the University of Southern Denmark, as well as Universal Robots headquarters.
  • Odense is also hosting ROSCon later this year.

Highlights from NVIDIA GTC 2024

In addition to CEO Jensen Huang’s keynote, here are some highlights from NVIDIA‘s latest GPU Technology Conference:

New foundation for humanoid robotics

The big news from the robotics side of the house was that NVIDIA launched a new general-purpose foundation model for humanoid robots called Project GR00T. This new model is designed to bring robotics and embodied AI together while enabling the robots to understand natural language and emulate movements by observing human actions.

GR00T uses the new Jetson Thor

As part of its robotics announcements, NVIDIA unveiled Jetson Thor for humanoid robots, based on the NVIDIA Thor system-on-a-chip (SoC). Significant upgrades to the NVIDIA Isaac robotics platform include generative AI foundation models and tools for simulation and AI workflow infrastructure.

The Thor SoC includes a next-generation GPU based on NVIDIA Blackwell architecture with a transformer engine delivering 800 teraflops of 8-bit floating-point AI performance. With an integrated functional safety processor, a high-performance CPU cluster, and 100 Gbps of Ethernet bandwidth, it can simplify design and integration efforts, claimed the company.

NVIDIA updates Isaac simulation platform

The Isaac tools that GR00T uses are capable of creating new foundation models for any robot embodiment in any environment, according to NVIDIA. Among these tools are Isaac Lab for reinforcement learning, and OSMO, a compute orchestration service.
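Whatever the framework, simulation-based robot learning follows a simulate-evaluate-improve loop. The sketch below shows that loop with a toy one-dimensional environment and random search standing in for a real RL algorithm; it is a generic illustration, not the Isaac Lab API.

```python
# Generic sketch of the simulate -> evaluate -> improve loop that tools such as Isaac
# Lab accelerate with GPU-parallel environments. The toy environment, one-parameter
# policy, and random search "trainer" are placeholders, not the Isaac Lab API.
import numpy as np

rng = np.random.default_rng(3)

def rollout(gain, steps=20):
    """Simulate one episode of a 1-D system; reward favors keeping the state near zero."""
    state, total = 1.0, 0.0
    for _ in range(steps):
        state += 0.1 * gain * state      # toy dynamics with action = gain * state
        total += -abs(state)
    return total

best_gain, best_return = 0.0, rollout(0.0)
for _ in range(500):                     # simple random search standing in for RL
    candidate = best_gain + rng.normal(scale=0.5)
    ret = rollout(candidate)
    if ret > best_return:
        best_gain, best_return = candidate, ret

print(f"best gain {best_gain:.2f}, return {best_return:.2f}")  # gain trends toward -10
```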

NVIDIA DRIVE Thor for robotaxis

The company also announced NVIDIA DRIVE Thor, which now supersedes NVIDIA DRIVE Orin as a SoC for autonomous driving applications.

Other notable sessions (worth watching the replays):

  • Geordie Rose, CEO of Sanctuary: “Using Omniverse to generate first-person experiential data for humanoid robots”
  • Aaron Saunders, chief technology officer of Boston Dynamics: “Deploying AI in real-world robots”
  • Vincent Vanhoucke, senior director of robotics at Google DeepMind: “Robotics in the age of GenAI”

Interesting robots seen at GTC24:

  • Agility DIGIT (static)
  • Apptronik Apollo (static)
  • Unitree H1
  • 1X Eve
  • Fourier Intelligence GR-1
  • Disney BD-X droids
  • ANYbotics ANYmal
  • Enchanted Tools Mirokai
  • Richtech Robotics ADAM

Apptronik to integrate Apollo humanoid with NVIDIA general-purpose foundation model
https://www.therobotreport.com/apptronik-integrates-apollo-humanoid-nvidia-project-gr00t/ | March 23, 2024
Apptronik is working with NVIDIA's Project GR00T to enable general-purpose humanoid robots to learn complex tasks.

NVIDIA CEO Jensen Huang (left) with the Apollo humanoid robot. Source: Apptronik

SAN JOSE, Calif. — Among the highlights of GTC this week was the convergence of artificial intelligence and humanoid robotics. Apptronik Inc. announced that it is working to integrate its Apollo humanoid robot with Project GR00T, NVIDIA Corp.'s new general-purpose foundation model for robot learning.

“The world is designed for humans — so it makes sense that humanoid robots are the type of robot best equipped to navigate, adapt, and interact with it,” said the Austin, Texas-based company. “Furthermore, humanoid robots comprise the ideal hardware for learning general-purpose skills by observing human demonstrations.”

“The combination of Apollo and Project GR00T will enable developers to take text, video, and human demonstrations as task prompts, learn generalizable skills like coordination and dexterity, and generate actions as output on the robot hardware,” Apptronik asserted. “Instead of simply repeating the actions in the training data, Apollo will recognize the environment and predict what to do next to achieve its goal.”
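To make that pipeline concrete, here is a minimal, hypothetical sketch in Python of how a multimodal task prompt could drive a closed control loop. The names used here, such as TaskPrompt, FoundationPolicy, and robot.get_observation(), are illustrative placeholders and assumptions, not Apptronik or NVIDIA APIs.

    # Hypothetical sketch only: placeholder classes standing in for a
    # GR00T-style multimodal policy wired to robot hardware.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class TaskPrompt:
        text: Optional[str] = None                # e.g. "pour a glass of juice"
        video_path: Optional[str] = None          # reference clip of a human demo
        demonstrations: Optional[list] = None     # recorded teleoperation trajectories

    class FoundationPolicy:
        """Stand-in for a multimodal foundation-model policy (assumed interface)."""
        def __init__(self, checkpoint: str):
            self.checkpoint = checkpoint          # pretrained weights, assumed to exist

        def plan(self, prompt: TaskPrompt, observation: dict) -> List[dict]:
            # A real model would fuse the prompt with camera and joint-state
            # observations and predict a short horizon of actions; this placeholder
            # simply holds position so the sketch stays self-contained.
            return [{"joint_targets": observation.get("joint_positions", []), "gripper": "hold"}]

    def control_loop(policy: FoundationPolicy, robot, prompt: TaskPrompt) -> None:
        """Closed loop: observe, predict the next actions, execute, repeat."""
        while not robot.task_done():              # robot API is assumed, not real
            obs = robot.get_observation()         # images plus joint states
            for action in policy.plan(prompt, obs):
                robot.apply_action(action)        # predicted, not replayed, motion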

Apptronik said it is working to make possible a future where humanoid robots improve how people live and work. In his GTC keynote, NVIDIA CEO Jensen Huang included an example of this type of learning. He showed how Apollo, through its integration with the GR00T foundation models, learned how to autonomously operate a juicer and serve juice.




Apptronik iterates humanoid robot design

Apptronik spun out of the Human Centered Robotics Lab at the University of Texas at Austin in 2016. The company's stated mission is to use innovative technology for the betterment of society. It added that Apollo is the culmination of the design and development of 10 robots built to work in environments made for people.

Those robots ranged from industrial robot arms and exoskeletons to bipedal mobility platforms, as well as extensive work on NASA's Valkyrie. In August 2023, Apptronik launched Apollo, which it designed for friendly interaction, mass manufacturability, and safety.

The company claimed that its design provides scalability, cost-effectiveness, power efficiency, manufacturability, and supply chain resilience. Apollo uses linear actuators rather than rotary ones, mimicking how human muscles work.

The robot’s force-control architecture is intended to maintain safe arm movement around people, similar to a collaborative robot in comparison with an industrial robot arm. Apollo’s modular design allows its humanoid upper body to be deployed on legs, a wheeled base, or a pedestal so it can operate in whichever form is best suited to a given task, said Apptronik.

Apollo's hot-swappable batteries, each with a four-hour runtime, eliminate the need for a plug-in charge, the company added. This gives it more operational time than other humanoid robots, it asserted.

“Apollo represents a novel approach to humanoid robot design and is purpose-built to break through the technological and performance ceilings that have prevented past generations of humanoid robots from making a significant impact,” said Apptronik. “This allows for a new level of function, efficiency, and scale, along with a force-control architecture that makes it possible for Apollo to operate side by side with people and perform useful tasks.”

NVIDIA Project GR00T gives Apollo agility

Apollo’s main computing system includes onboard NVIDIA Jetson AGX Orin and Jetson Orin NX modules. They enable the AI-powered robot to efficiently use models such as the new GR00T foundation models to perform a wide variety of tasks, while its humanoid form allows it to learn from human demonstrations, said Apptronik and NVIDIA.

The integration of NVIDIA's hardware and software with Apollo will help Apptronik accelerate skill development for general-purpose robots, the companies noted. Apptronik plans to rely on NVIDIA's AI portfolio for humanoid robots, including GR00T, Isaac Lab, and the OSMO compute orchestration service announced at GTC.

“Generative AI is currently used to generate text, images, and video,” stated Jeff Cardenas, co-founder and CEO of Apptronik. “The next frontier is to leverage these AI tools to generate intelligent humanoid robot behavior.”

“In addition to our internal efforts, our collaboration with NVIDIA will combine the superior hardware design of our Apollo humanoid robot with NVIDIA-powered multimodal learning from demonstration – particularly video demonstration,” he explained. “We believe this combination has the potential to change the world and benefit all of humanity.”

Last week, Apptronik announced that Mercedes-Benz was testing Apollo for automotive manufacturing applications.

The post Apptronik to integrate Apollo humanoid with NVIDIA general-purpose foundation model appeared first on The Robot Report.

Stealthy startup Mendaera is developing a fist-sized medical robot with Dr. Fred Moll’s support https://www.therobotreport.com/mendaera-developing-fist-sized-medical-robot-with-dr-fred-moll-support/ https://www.therobotreport.com/mendaera-developing-fist-sized-medical-robot-with-dr-fred-moll-support/#respond Fri, 22 Mar 2024 14:43:38 +0000 https://www.therobotreport.com/?p=578241 Mendaera is working on medical technology that combines robotics, AI, and real-time imaging in a compact device.

Mendaera logo.

Editor’s Note: This article was syndicated from The Robot Report’s sister site Medical Design & Outsourcing.

The veil is starting to lift on medical robotics startup Mendaera Inc. as it exits stealth mode and heads toward regulatory submission with a design freeze on its first system and verification and validation imminent.

Two former Auris Health leaders co-founded the San Mateo, Calif.-based company. Mendaera also has financial support from Dr. Fred Moll, the Auris and Intuitive Surgical co-founder who is known as “the father of robotic surgery.”

“Among the innovators in the field, Mendaera’s efforts to make robotics commonplace earlier in the healthcare continuum are unique and can potentially change the future of care delivery,” stated Moll in a release.

But Mendaera isn’t a surgical robotics developer. Instead, it said it is working on technology that combines robotics, artificial intelligence, and real-time imaging in a compact device “no bigger than your fist” for procedures including percutaneous instruments.

Mendaera co-founder and CEO Josh DeFonzo. | Source: Mendaera

Josh DeFonzo, co-founder and CEO of Mendaera, offered new details about his startup’s technology and goals in an exclusive interview, as he announced the acquisition of operating room telepresence technology that Avail Medsystems developed.

Avail, which shut down last year, was founded by former Intuitive Surgical and Shockwave Medical leader Daniel Hawkins, who's now CEO at MRI automation software startup Vista.ai.

“We’re a very different form factor of robot that focuses on what I’ll describe as gateway procedures,” DeFonzo said. “It’s a different category of robots that we don’t believe the market has seen before [as] we’re designing and developing it.”

Those procedures include vascular access for delivery of devices or therapeutic agents; access to organs for surgical or diagnostics purposes; and pain management procedures such as regional anesthesia, neuraxial blocks, and chronic pain management. DeFonzo declined to go into much detail about specific procedures because the product is still in the development stage.

“The procedures that we are going after are those procedures that involve essentially a needle or a needle-like device and real-time imaging, and as such, there are specific procedures that we think the technology will perform very well at,” he said. “However, the technology is also designed to be able to address any suite of procedures that use those two common denominators: real-time imaging and a percutaneous instrument.”

“And the reason that’s an important point to make is that oftentimes, when you are a specialist who performs these procedures, you don’t perform just one,” added DeFonzo. “You perform a number of procedures: central venous catheters [CVCs], peripherally inserted central catheter [PICC] lines, regional anesthetic blocks that are in the interscalene area or axial blocks. The technology is really designed to enable specialists — of whom there are many — the ability to perform these procedures more consistently with a dramatically lower learning curve.”




Mendaera marks progress to date

Preclinical testing has shown that the technology improves accuracy and efficiency compared with freehand techniques, regardless of the individual's skill level, asserted DeFonzo. User research spanned around 1,000 healthcare providers, ranging from emergency medicine and interventional radiology specialists to licensed medical doctors, nurse practitioners, and physician assistants.

“It seems to be very stable across user types,” he said. “So whether somebody is a novice, of intermediate skill level, or advanced, the robot is a great leveler in terms of being able to provide consistent outcomes.”

“Whereas when you look at the same techniques performed freehand, the data generally tracks with what you would expect: lesser skilled people are less accurate; more experienced people are more accurate,” DeFonzo noted. “But even in that most skilled category, we do find that the robot makes a fairly remarkable improvement on accuracy and timeliness of intervention.”

Last year, the startup expanded into a production facility to accommodate growth and volume manufacturing for the product’s launch and said its system will be powered by handheld ultrasound developer Butterfly Network’s Ultrasound-on-Chip technology.

Butterfly Network won FDA clearance in 2017 for the Butterfly iQ for iPhone. | Source: Butterfly Network

Mendaera’s aim is to eventually deploy these systems “to the absolute edge of healthcare,” starting with hospitals, ambulatory surgical centers and other procedural settings, said DeFonzo. The company will then push to alternative care sites and primary care clinics as evidence builds to support the technology.

“The entire mission for the company is to ensure essentially that high-quality intervention is afforded to every patient at every care center at every encounter,” he said. “We want to be able to push that as far to the edge of healthcare as possible, and that’s certainly something we aim to do over time, but it’s not our starting point explicitly.”

“As a practical starting point, however, we do see ourselves working in the operating room, in the interventional radiology suite, and likely in cath labs to facilitate these gateway procedures, the access that is afforded adjacent to a larger intervention,” DeFonzo acknowledged.

Mendaera said it expects to submit its system to the U.S. Food and Drug Administration for review through the 510(k) pathway by the end of 2024 with the goal of offering the product clinically in 2025.

“What we really want to do with this technology is make sure that we’re leveraging not just technological trends, but really important forces in the space — robotics, imaging and AI — to dramatically improve access to care,” said DeFonzo. “Whether you’re talking about something as basic as a vascular access procedure or something as complex as transplant surgery or neurosurgery, we need to leverage technology to improve patient experience.”

“We need to leverage technology to help hospitals become more financially sustainable, ultimately improving the healthcare system as we do it,” he said. “So our vision was to utilize technology to provide solutions that aggregate across many millions, if not tens and hundreds of millions, of procedures to make a ubiquitous technology that really helps benefit our healthcare system.”

Mendaera’s research and development group will work with employees from Avail on how to best add the telepresence technology to the mix.

“We see a lot of power in what the Avail team has built,” DeFonzo said. “Bringing that alongside robotic technology, our imaging partnerships and AI, we think that we’ve got a really good opportunity to digitize to a further extent not only expertise in the form of the robot, but [also] clinical judgment, like how do you ensure that the right clinician and his or her input is present ahead of technologies like artificial intelligence that hopefully augment all users in an even more scalable way.”

The post Stealthy startup Mendaera is developing a fist-sized medical robot with Dr. Fred Moll’s support appeared first on The Robot Report.

Teradyne partners with NVIDIA to add AI to cobots https://www.therobotreport.com/teradyne-partners-with-nvidia-to-add-ai-to-cobots/ Wed, 20 Mar 2024 12:55:12 +0000 https://www.therobotreport.com/?p=578204 Teradyne units Universal Robots and Mobile Industrial Robots have incorporated NVIDIA AI for the first time.

Universal Robots cobots are gaining precision thanks to a collaboration with NVIDIA. Source: Teradyne Robotics

SAN JOSE, Calif. — Artificial intelligence is already making robots smarter. Teradyne Robotics announced at GTC 2024 a collaboration with NVIDIA to add new AI capabilities to collaborative and mobile robots.

North Reading, Mass.-based Teradyne owns collaborative robot maker Universal Robots A/S (UR) and autonomous mobile robot (AMR) company Mobile Industrial Robots A/S (MiR), both of which are in Odense, Denmark.

“This is the first of a series of planned AI offerings by Teradyne Robotics,” stated Ujjwal Kumar, group president of Teradyne Robotics. “By adding high-performance compute hardware to our control systems, as well as investing in targeted upgrades to our software stacks, we are investing to establish UR and MiR as the preferred robotics platforms for developing and deploying AI applications.”

“We are working to shape the future of robotics by combining NVIDIA’s state-of-the-art AI platform with Teradyne Robotics’ real-world domain expertise in industrial automation,” he added. “We’re creating the platform for new solutions to previously unsolvable problems.”

Kumar will deliver a keynote at the Robotics Summit & Expo in Boston in May.

Universal Robots integrates accelerated computing

Universal Robots is demonstrating at this week’s GPU Technology Conference (GTC) an autonomous inspection system using its cobot arms and AI. The company has integrated NVIDIA accelerated computing into its cobots for path planning 50 to 80 times faster than today’s applications.

“NVIDIA has been working with Universal Robots for three years,” Kumar explained to The Robot Report. “Its researchers were used to the UR cobots, which are inherently safe and thus good for testing AI.”

“While the Microsofts and Googles of the world may own digital AI, NVIDIA wants to be the market leader in physical AI, as CEO Jensen Huang mentioned in his keynote,” he said. “In digital AI, 90% might be OK for an image or text generated with AI, but that’s not sufficient in the real world. Teradyne has experience with quality and reliability.”

The partners said the application combines NVIDIA's cuMotion motion-planning library, Universal Robots' PolyScope X software platform, and UR cobot hardware to increase efficiency for automation customers.

The combination of cuMotion, PolyScope X, and the UR cobot makes possible a range of applications that were previously not feasible to automate fully, according to the partners. It can also improve existing programming concepts.

Teradyne and NVIDIA cited benefits including ease of programming and lower computation time for planning, optimizing, and executing trajectories. For customers, this technology can simplify the setup of common industrial applications, facilitating robot adoption for high-mix, low-volume scenarios.

Not only can cuMotion allow automatic calculation of path planning for collision-free trajectories, but it also enables path optimization for other criteria such as speed, minimum wear, or energy efficiency, according to Universal Robots.
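As a rough illustration of weighing such criteria, the toy Python scorer below ranks candidate trajectories by duration, wear, and energy proxies. The weights and formulas are assumptions for illustration only and are not part of cuMotion's actual interface.

    # Illustrative only: a toy cost function for comparing candidate trajectories.
    import numpy as np

    def trajectory_cost(waypoints: np.ndarray, dt: float,
                        w_time: float = 1.0, w_wear: float = 0.1,
                        w_energy: float = 0.05) -> float:
        """waypoints: (N, J) joint positions sampled every dt seconds."""
        velocities = np.diff(waypoints, axis=0) / dt
        accelerations = np.diff(velocities, axis=0) / dt
        duration = dt * (len(waypoints) - 1)          # proxy for cycle time
        wear = float(np.sum(np.abs(accelerations)))   # proxy for mechanical wear
        energy = float(np.sum(velocities ** 2) * dt)  # crude kinetic-energy proxy
        return w_time * duration + w_wear * wear + w_energy * energy

    # A planner could generate many collision-free candidates and keep the cheapest:
    # best = min(candidates, key=lambda wp: trajectory_cost(wp, dt=0.01))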

At the GTC demonstration, The Robot Report watched the cobot-mounted camera move to inspect a workpiece that was randomly reoriented. It did so automatically, and a digital twin mirrored its maneuvers.

For the inspection application, users can load CAD files for up to 20 parts with associated test procedures. NVIDIA's technology enables the robot to identify each part and procedure and conduct path planning accordingly, explained Andrew Pether, principal innovation research engineer at Universal Robots. He said the combination of cuMotion on AGX Orin for dynamic positioning and Isaac Sim for digital twins of the current state and planned trajectories can improve inspections for automotive, large electronics, and “white goods” or appliance manufacturers.
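A rough sketch of that workflow is below: look up the detected part, fetch its inspection checks, and plan a viewpoint for each. Every name in it (PART_LIBRARY, detector.identify, planner.viewpoint_for, and so on) is a hypothetical placeholder, not the Universal Robots or NVIDIA API.

    # Hypothetical inspection loop: identify the part, then visit each feature.
    PART_LIBRARY = {
        # part_id -> (CAD file, features to check)
        "bracket_A": ("bracket_A.step", ["hole_1", "weld_seam_2"]),
        "panel_B": ("panel_B.step", ["edge_profile", "fastener_row"]),
    }

    def inspect_next_workpiece(camera, cobot, detector, planner):
        frame = camera.capture()                   # assumed camera API
        part_id, pose = detector.identify(frame)   # which part, and where it sits
        if part_id not in PART_LIBRARY:
            return None
        cad_file, checks = PART_LIBRARY[part_id]
        results = {}
        for feature in checks:
            # Plan a collision-free move to a viewpoint for this feature,
            # accounting for the part's current (randomly reoriented) pose.
            viewpoint = planner.viewpoint_for(cad_file, feature, pose)
            cobot.move_to(viewpoint)
            results[feature] = detector.check_feature(camera.capture(), feature)
        return results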

The Universal Robots inspection demo at GTC 2024. Credit: Eugene Demaitre

Teradyne, NVIDIA expect AI robotics apps to grow

Mobile Industrial Robots also announced the MiR1200 Pallet Jack, which uses the NVIDIA Jetson AGX Orin module for AI-powered pallet detection. This enables it to identify and precisely move objects, navigate autonomously, and operate in complex factory and warehouse environments.

“There are two kinds of workers in factories or warehouses: those in static cells and those who are on the move,” said Kumar. “We have robots to help with both sorts of tasks. Mobile robots and cobots could jointly meet needs in welding, semiconductors, and more.”

Teradyne noted that autonomous inspection and autonomous pallet handling are two use cases with significant potential for scalability. The MiR1200 Pallet Jack and UR's cuMotion demo are the two most recent examples of “physical AI” solutions, with others already available through Teradyne Robotics' ecosystem partners, OEMs, and end users.

“NVIDIA’s Isaac platform is enabling increased autonomy in robotics with rapid advancements in simulation, generative AI, foundation models and optimized edge compute,” said Deepu Talla, vice president of robotics and edge computing at NVIDIA. “This collaboration with Teradyne Robotics will bring the power of AI and accelerated computing to rapidly growing cobot and AMR markets.”

The MiR1200 Pallet Jack has enhanced autonomy thanks to NVIDIA AI. Source: Teradyne Robotics

The post Teradyne partners with NVIDIA to add AI to cobots appeared first on The Robot Report.

NVIDIA announces new robotics products at GTC 2024 https://www.therobotreport.com/nvidia-announces-new-robotics-products-at-gtc-2024/ https://www.therobotreport.com/nvidia-announces-new-robotics-products-at-gtc-2024/#respond Tue, 19 Mar 2024 11:02:34 +0000 https://www.therobotreport.com/?p=578193 NVIDIA CEO Jensen Huang wowed the crowd in San Jose with the company's latest processor, AI, and simulation product announcements.


NVIDIA CEO Jensen Huang ended his GTC 2024 keynote backed by life-size images of the various humanoids in development that are powered by the Jetson Orin computer. | Credit: Eugene Demaitre

SAN JOSE, Calif. — The NVIDIA GTC 2024 keynote kicked off like a rock concert yesterday at the SAP Center. More than 15,000 attendees filled the arena in anticipation of CEO Jensen Huang's annual presentation of the latest product news from NVIDIA.

To build the excitement, the waiting crowd was mesmerized by an interactive, real-time generative art display running live on the main stage screen, driven by prompts from artist Refik Anadol's studio.

New foundation for humanoid robotics

The big news from the robotics side of the house is that NVIDIA launched a new general-purpose foundation model for humanoid robots called Project GR00T. This new model is designed to bring robotics and embodied AI together while enabling the robots to understand natural language and emulate movements by observing human actions.

Project GR00T training model. | Credit: NVIDIA

GR00T stands for “Generalist Robot 00 Technology,” and with the race for humanoid robotics heating up, this new technology is intended to help accelerate development. GR00T is a large multimodal model (LMM) that gives robotics developers a generative AI platform for bringing large language models (LLMs) into their robots.

“Building foundation models for general humanoid robots is one of the most exciting problems to solve in AI today,” said Huang. “The enabling technologies are coming together for leading roboticists around the world to take giant leaps towards artificial general robotics.”

GR00T uses the new Jetson Thor

As part of its robotics announcements, NVIDIA unveiled Jetson Thor for humanoid robots, based on the NVIDIA Thor system-on-a-chip (SoC). Significant upgrades to the NVIDIA Isaac robotics platform include generative AI foundation models and tools for simulation and AI workflow infrastructure.

The Thor SoC includes a next-generation GPU based on NVIDIA Blackwell architecture with a transformer engine delivering 800 teraflops of 8-bit floating-point AI performance. With an integrated functional safety processor, a high-performance CPU cluster, and 100GB of Ethernet bandwidth, it can simplify design and integration efforts, claimed the company.
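As a back-of-the-envelope way to read that figure, the short calculation below estimates a compute ceiling for running a transformer model on board. The model size, utilization factor, and the two-FLOPs-per-parameter rule of thumb are assumptions, not NVIDIA numbers, and real autoregressive inference is usually memory-bandwidth bound, so actual rates would be lower.

    # Back-of-the-envelope only; all inputs below are assumptions.
    PEAK_FLOPS = 800e12            # quoted FP8 peak for the Thor SoC
    PARAMS = 7e9                   # hypothetical 7B-parameter onboard model
    FLOPS_PER_TOKEN = 2 * PARAMS   # ~2 FLOPs per parameter per generated token
    UTILIZATION = 0.4              # real workloads rarely reach peak throughput

    tokens_per_second = PEAK_FLOPS * UTILIZATION / FLOPS_PER_TOKEN
    print(f"compute ceiling: ~{tokens_per_second:,.0f} tokens/s")  # roughly 22,900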

Project GR00T, a general-purpose multimodal foundation model for humanoids, enables robots to learn different skills. | Credit: NVIDIA

NVIDIA showed humanoids in development with its technologies from companies including 1X Technologies, Agility Robotics, Apptronik, Boston Dynamics, Figure AI, Fourier Intelligence, Sanctuary AI, Unitree Robotics, and XPENG Robotics.

“We are at an inflection point in history, with human-centric robots like Digit poised to change labor forever,” said Jonathan Hurst, co-founder and chief robot officer at Agility Robotics. “Modern AI will accelerate development, paving the way for robots like Digit to help people in all aspects of daily life.”

“We’re excited to partner with NVIDIA to invest in the computing, simulation tools, machine learning environments, and other necessary infrastructure to enable the dream of robots being a part of daily life,” he said.

NVIDIA updates Isaac simulation platform

The Isaac tools that GR00T uses are capable of creating new foundation models for any robot embodiment in any environment, according to NVIDIA. Among these tools are Isaac Lab for reinforcement learning, and OSMO, a compute orchestration service.

Embodied AI models require massive amounts of real and synthetic data. The new Isaac Lab is a GPU-accelerated, lightweight, performance-optimized application built on Isaac Sim for running thousands of parallel simulations for robot learning.
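The pattern behind that scale-up is many simulations stepped in lockstep and consumed as batched data. The generic sketch below shows the idea with Gymnasium's CPU vector API and CartPole as a stand-in task; it is not Isaac Lab's own GPU-accelerated interface.

    # Generic parallel-rollout pattern, shown with Gymnasium rather than Isaac Lab.
    import gymnasium as gym

    NUM_ENVS = 8   # Isaac Lab-style pipelines run thousands of these on the GPU

    envs = gym.vector.SyncVectorEnv(
        [lambda: gym.make("CartPole-v1") for _ in range(NUM_ENVS)]
    )

    obs, info = envs.reset(seed=0)
    for _ in range(100):
        actions = envs.action_space.sample()   # a learned policy would act here
        obs, rewards, terminated, truncated, info = envs.step(actions)
        # Batched transitions like these feed the reinforcement-learning update.
    envs.close()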

NVIDIA's Omniverse, Metropolis, Isaac, and cuOpt software combine to create an “AI gym” where robots and AI agents can work out and be evaluated in complex industrial spaces. | Credit: NVIDIA

To scale robot development workloads across heterogeneous compute, OSMO coordinates the data generation, model training, and software/hardware-in-the-loop workflows across distributed environments.

NVIDIA also announced Isaac Manipulator and Isaac Perceptor — a collection of robotics-pretrained models, libraries and reference hardware.

Isaac Manipulator offers dexterity and modular AI capabilities for robotic arms, with a robust collection of foundation models and GPU-accelerated libraries. It can accelerate path planning by up to 80x, and zero-shot perception increases efficiency and throughput, enabling developers to automate a greater number of new robotic tasks, said NVIDIA.

Among early ecosystem partners are Franka Robotics, PickNik Robotics, READY Robotics, Solomon, Universal Robots, a Teradyne company, and Yaskawa.

Isaac Perceptor provides multi-camera, 3D surround-vision capabilities, which are increasingly being used in autonomous mobile robots (AMRs) adopted in manufacturing and fulfillment operations to improve efficiency and worker safety. NVIDIA listed companies such as ArcBest, BYD, and KION Group as partners.




‘Simulation first’ is the new mantra for NVIDIA

A simulation-first approach is ushering in the next phase of automation. Real-time AI is now a reality in manufacturing, factory logistics, and robotics. These environments are complex, often involving hundreds or thousands of moving parts. Until now, it was a monumental task to simulate all of these moving parts.

NVIDIA has combined software such as Omniverse, Metropolis, Isaac, and cuOpt to create an “AI gym” where robots and AI agents can work out and be evaluated in complex industrial spaces.

Huang demonstrated a digital twin of a 100,000-sq.-ft. warehouse, built using the NVIDIA Omniverse platform for developing and connecting OpenUSD applications, operating as a simulation environment for dozens of digital workers and multiple AMRs, vision AI agents, and sensors.

Each mobile robot, running the NVIDIA Isaac Perceptor multi-sensor stack, can process visual information from six sensors, all simulated in the digital twin.

An AMR and a manipulator working together to enable AI-based automation in a warehouse powered by NVIDIA Isaac. | Credit: NVIDIA

At the same time, the NVIDIA Metropolis platform for vision AI can create a single centralized map of worker activity across the entire warehouse, fusing data from 100 simulated ceiling-mounted camera streams with multi-camera tracking. This centralized occupancy map can help inform optimal AMR routes calculated by the NVIDIA cuOpt engine for solving complex routing problems.

cuOpt, an optimization AI microservice, solves complex routing problems with multiple constraints using GPU-accelerated evolutionary algorithms.

All of this happens in real-time, while Isaac Mission Control coordinates the entire fleet using map data and route graphs from cuOpt to send and execute AMR commands.
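Conceptually, the routing step trades route length against the congestion seen in the occupancy map. The toy Python heuristic below illustrates that trade-off at a tiny scale; it is not the cuOpt API, which handles far larger, multi-constraint problems on the GPU.

    # Toy, brute-force routing for illustration only (not cuOpt).
    import itertools

    def route_cost(path, distance, congestion, w_congestion=2.0):
        """Sum edge distances plus a penalty for entering busy map cells."""
        cost = 0.0
        for a, b in zip(path, path[1:]):
            cost += distance[(a, b)] + w_congestion * congestion.get(b, 0.0)
        return cost

    def best_pickup_order(start, pickups, distance, congestion):
        """Try every visiting order for a handful of pickups (toy scale only)."""
        best_order, best_cost = None, float("inf")
        for order in itertools.permutations(pickups):
            c = route_cost((start,) + order, distance, congestion)
            if c < best_cost:
                best_order, best_cost = order, c
        return best_order, best_cost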

NVIDIA DRIVE Thor for robotaxis

The company also announced NVIDIA DRIVE Thor, which now supersedes NVIDIA DRIVE Orin as a SoC for autonomous driving applications.

Multiple autonomous vehicles are using NVIDIA architectures, including robotaxis and autonomous delivery vehicles from companies such as Nuro, XPeng, WeRide, Plus, and BYD.

The post NVIDIA announces new robotics products at GTC 2024 appeared first on The Robot Report.

RIOS Intelligent Machines raises Series B funding, starts rolling out Mission Control https://www.therobotreport.com/rios-intelligent-machines-raises-series-b-funding-starts-rolls-out-mission-control/ https://www.therobotreport.com/rios-intelligent-machines-raises-series-b-funding-starts-rolls-out-mission-control/#comments Fri, 08 Mar 2024 15:56:52 +0000 https://www.therobotreport.com/?p=578111 RIOS has gotten investment from Yamaha and others to continue developing machine vision-driven robotics for manufacturers.


RIOS works with NVIDIA Isaac Sim and serves the wood-products industry. Source: RIOS Intelligent Machines

RIOS Intelligent Machines Inc. this week announced that it has raised $13 million in Series B funding, co-led by Yamaha Motor Corp. and IAG Capital Partners. The company said it plans to use the investment to develop and offer artificial intelligence and vision-driven robotics, starting with a product for the lumber and plywood-handling sector.

Menlo Park, Calif.-based RIOS said its systems can enhance production efficiency and control. The company focuses on three industrial segments: wood products, beverage distribution, and packaged food products.

RIOS works with NVIDIA Omniverse on factory simulations. It has also launched its Mission Control Center, which uses machine vision and AI to help manufacturers improve quality and efficiency.

RIOS offers visibility to manufacturers

“Customers in manufacturing want a better way to introspect their production — ‘Why did this part of the line go down?'” said Clinton Smith, co-founder and CEO of RIOS. “But incumbent tools have not been getting glowing reviews. Our standoff vision system eliminates a lot of that because our vision and AI are more robust.”

The mission-control product started as an internal tool and is now being rolled out to select customers, Smith told The Robot Report. “We’ve observed that customers want fine-grained control of processes, but there are a lot of inefficiencies, even at larger factories in the U.S.”

Manufacturers that work with tight tolerances, such as in aerospace or electronics, already have well-defined processes, he noted. But companies with high SKU turnover volumes, such as with seasonal variations, often find it difficult to rely on a third party's AI, added Smith.

“Mission Control is a centralized platform that provides a visual way to monitor processes and to start to interact with our robotics,” he explained. “We want operators to identify what to work on and what metrics to count for throughput and ROI [return on investment], but if there's an error on the data side, it can be a pain to go back to the database.”

Smith shared the example of a bottlecap tracker. In typical machine learning, this requires a lot of data to be annotated before training models and then looking at the results.

With RIOS Mission Control, operators can monitor a process and select a counting zone. They can simply draw a box around a feature to be annotated, and the system will automatically detect and draw comparisons, he said.

“You place a system over the conveyor, pick an item, and you’re done,” said Smith. “It’s not just counting objects. For example, our wood products customers want to know where there are knots in boards to cut around. It could also be used in kitting applications.”
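As a generic illustration of the “draw a box and count what passes through it” idea, the sketch below uses plain OpenCV background subtraction on a user-drawn zone. It is not RIOS's Mission Control implementation, and the file name and zone coordinates are made up.

    # Generic zone-based counting sketch using OpenCV (not RIOS's implementation).
    import cv2

    ZONE = (200, 100, 240, 180)    # user-drawn counting zone: x, y, width, height
    MIN_AREA = 500                 # ignore blobs smaller than this many pixels

    cap = cv2.VideoCapture("conveyor.mp4")        # or a live camera index
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        x, y, w, h = ZONE
        roi = frame[y:y + h, x:x + w]             # only look inside the drawn box
        mask = subtractor.apply(roi)              # moving items vs. static conveyor
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        blobs = [c for c in contours if cv2.contourArea(c) >= MIN_AREA]
        count_in_zone = len(blobs)                # a real system would track and deduplicate
    cap.release()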

RIOS is releasing the feature in phases and is working on object manipulation. Smith said the company is also integrating the new feature with its tooling. In addition, RIOS is in discussions with customers, which can use RIOS's cameras or their own existing cameras for Mission Control.

Investors express confidence in automation approach

Yamaha has been an investor in RIOS Intelligent Machines since 2020. The vehicle maker said it has more than doubled its investment in RIOS, demonstrating its confidence in the company’s automation technologies and business strategy.

IAG Capital Partners is a private investment group in Charleston, S.C. The firm invests in early-stage companies and partners with innovators to build manufacturing companies. Dennis Sacha, partner at IAG, will be joining the RIOS board of directors.

“RIOS’s full production vision — from automation to quality assurance to process improvement to digital twinning — and deep understanding of production needs positions them well in the world of manufacturing,” said Sacha, who led jet engine and P-3 production for six years during his career in the U.S. Navy.

In addition, RIOS announced nearly full participation from its existing investors, including Series A lead investor, Main Sequence, which doubled its pro-rata investment. RIOS will be participating in MODEX, GTC, and Automate.




The post RIOS Intelligent Machines raises Series B funding, starts rolling out Mission Control appeared first on The Robot Report.
