Mike Oitzman, Author at The Robot Report
https://www.therobotreport.com/author/moitzman/
Robotics news, research and analysis

Mentee Robotics de-cloaks to launch new AI-driven humanoid robot
https://www.therobotreport.com/mentee-robotics-de-cloaks-launches-ai-driven-humanoid-robot/
Wed, 17 Apr 2024 11:00:05 +0000
Mentee Robotics has emerged on the scene with a new AI-driven humanoid robot, planned for production release in early 2025.

Mentee Robotics co-founders include Lior Wolf, CEO (left); Amnon Shashua, chairman (middle); Shai Shalev-Shwartz, chief scientist (right). | Credit: Mentee Robotics

Mentee Robotics came out of stealth today and unveiled its first bipedal humanoid robot prototype. An experienced team founded the Herzliya, Israel-based company in 2022. The team includes Prof. Amnon Shashua, the chairman of Mentee Robotics and an expert in AI, computer vision, natural language processing, and other related fields.

The company's founders also include Prof. Lior Wolf, the CEO of Mentee Robotics and formerly a research scientist and director at Facebook AI Research, and Prof. Shai Shalev-Shwartz, a computer scientist and machine learning researcher.

Prof. Shashua is also the founder and current CEO of Mobileye, a public company that is developing autonomous-driving and driver-assist technologies and harnessing advancements in computer vision, machine learning, mapping, and data analysis.

The company joins a growing list of robotics developers that have launched competing humanoids in the past year, including Figure AI, Sanctuary AI, Apptronik, Tesla, 1X, and others.

The Menteebot humanoid can take verbal instructions and then execute a mission. | Credit: Mentee Robotics

Leveraging Sim2Real training data

Mentee Robotics is developing a humanoid robot that it said will be capable of understanding natural-language commands by using artificial intelligence. The growth and evolution of large language models (LLMs) over the past year is the foundation for this capability.

The prototype of Menteebot that was unveiled today incorporates AI at every level of its operations. The motion of the robot is based on a machine-learning method called simulation to reality (Sim2Real). In this method, reinforcement learning happens on a virtual version of the robot, so the policy can consume as much simulated data as it needs to learn and then transfer to the real world with very little additional data.
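Mentee has not published its training code, but Sim2Real pipelines of this kind commonly rely on domain randomization: simulator physics parameters are resampled every episode so the learned policy tolerates the gap between simulation and hardware. A minimal, illustrative Python sketch — all parameter names, ranges, and the toy update rule are hypothetical stand-ins, not Mentee's method:

```python
import random

def randomize_physics():
    """Sample randomized simulation parameters (ranges are illustrative)."""
    return {
        "friction": random.uniform(0.5, 1.5),
        "motor_strength": random.uniform(0.8, 1.2),
        "sensor_noise_std": random.uniform(0.0, 0.05),
    }

def train_policy(episodes=1000):
    """Toy training loop: each episode runs in a freshly randomized sim."""
    policy = {"gain": 1.0}  # stand-in for a neural-network policy
    for _ in range(episodes):
        params = randomize_physics()
        # in a real pipeline, the policy would be rolled out in the simulator
        # and updated from reward; here we just nudge a scalar toward the
        # sampled motor strength to keep the sketch self-contained
        policy["gain"] += 0.001 * (params["motor_strength"] - policy["gain"])
    return policy
```

Because the policy never sees one fixed simulator, it cannot overfit to any single set of physics constants, which is what makes the later transfer to hardware data-efficient.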

NeRF-based methods, among the newest neural network-based technologies for representing 3D scenes, map the world on the fly. Semantic knowledge is stored in these cognitive maps, which the robot can query to locate objects and places.

Mentee’s robot can then figure out where it is on the 3D map and then automatically plan dynamic paths to avoid obstacles.
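The article doesn't specify Mentee's planner, but dynamic path planning over a queried map is commonly illustrated with grid search such as A*. A minimal sketch on a 2D occupancy grid — the grid abstraction and 4-connected moves are assumptions for illustration, not Menteebot's actual representation:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = obstacle), 4-connected."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible for unit moves)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]  # (f, cost, node, path)
    seen = set()
    while open_set:
        _, cost, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                heapq.heappush(
                    open_set,
                    (cost + 1 + h((r, c)), cost + 1, (r, c), path + [(r, c)]),
                )
    return None  # no path exists
```

Replanning amounts to re-running the search whenever the queried map marks a new cell as occupied, which is how obstacle avoidance stays dynamic.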


Mentee Robotics has more work to do

The prototype that was unveiled today demonstrated an end-to-end cycle of complex task completion, including navigation, locomotion, scene understanding, object detection and localization, grasping, and natural-language understanding.

However, Mentee Robotics noted that this is not the final version that is ready for deployment.

The company also told The Robot Report that it is initially targeting two primary markets with the Mentee humanoid. The first is the household market, with a domestic assistant adept at maneuvering within homes and capable of executing a range of tasks, including table setting, table cleanup, and laundry handling, as well as learning new tasks on the fly through verbal instructions and visual imitation. The second is the industrial market, starting in the warehouse, with a warehouse automation robot designed to efficiently locate, retrieve, and transport items and to handle loads weighing up to 25 kg (55 lb.).

The robot includes custom-engineered motors to deliver torque, design life, and efficiency. | Credit: Mentee Robotics

Production units to come in 2025

Mentee Robotics said it is planning to release a production-ready prototype by Q1 2025. The system uses only vision-based cameras for sensing the world around it.

In addition, the company’s engineering team developed proprietary electric motors to support the robot’s dexterity requirements.

“We are on the cusp of a convergence of computer vision, natural language understanding, strong and detailed simulators, and methodologies on and for transferring from simulation to the real world,” said Prof. Amnon Shashua, chairman of Mentee Robotics. “At Mentee Robotics, we see this convergence as the starting point for designing the future general-purpose bipedal robot that can move everywhere — as a human — with the brains to perform household tasks and learn through imitation tasks it was not previously trained for.”

Teledyne FLIR IIS announces new Bumblebee X stereo vision camera
https://www.therobotreport.com/teledyne-flir-iis-announces-new-bumblebee-x-stereo-vision-camera/
Tue, 16 Apr 2024 21:55:29 +0000
Bumblebee X is a new GigE-powered stereo imaging solution that delivers high accuracy and low latency for robotic guidance and pick-and-place applications.

Bumblebee X is a new GigE-powered stereo imaging solution that delivers high accuracy and low latency for robotic guidance and pick-and-place applications. | Credit: Teledyne FLIR

Teledyne FLIR IIS (Integrated Imaging Solutions) today announced the new Bumblebee X series, an advanced stereo-depth vision system optimized for multiple applications. The imaging device is a comprehensive industrial-grade (IP67) stereo vision solution with onboard processing for building systems for warehouse automation, robotics guidance, and logistics.

Bumblebee X 5GigE delivers on the essential need for a comprehensive, real-time stereo vision solution, said the Wilsonville, Ore.-based company. Customers can test and deploy depth-sensing systems that work at ranges of up to 20 m with the wide-baseline version.

The Teledyne FLIR Bumblebee X camera is packaged in an IP67 enclosure and is ready for industrial use cases. | Credit: Teledyne FLIR

Available in three configurations

The new camera is available in three different configurations, which are identical except for the field of view (FOV) of the camera lens. Teledyne designed the camera to operate accurately across varying distances. The low latency and GigE networking make it ideal for real-time applications such as autonomous mobile robots, automated guided vehicles, pick and place, bin picking, and palletization, the company said. 
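Range accuracy in a stereo camera follows from the standard pinhole relation Z = f·B/d: a wider baseline B and longer focal length f produce larger disparities d at a given distance, so depth stays usable at longer range. A short sketch — the focal length and disparity values below are illustrative, not Bumblebee X specifications:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo relation: depth Z = f * B / d (meters)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# With the 24 cm baseline and an assumed 1,000-pixel focal length,
# a 12-pixel disparity corresponds to a point 20 m away.
z = depth_from_disparity(focal_px=1000.0, baseline_m=0.24, disparity_px=12.0)
```

The same relation explains why narrow-baseline cameras lose precision quickly: at long range, disparity shrinks toward the subpixel noise floor.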

“We’re thrilled to announce the release of Bumblebee X, a new comprehensive solution for tackling complex depth sensing challenges with ease,” said Sadiq Panjwani, General Manager at Teledyne FLIR IIS. “Our team’s extensive stereo vision expertise and careful attention to customer insights have informed the design of the hardware, software, and processing at the core of Bumblebee X. With high accuracy across a large range of distances, this solution is perfect for factories and warehouses.”

Specifications

This table compares the specs for the three configurations of the Bumblebee X camera. See the Teledyne FLIR website for full specifications. | Credit: Teledyne FLIR

Key features include:

  • Factory-calibrated 9.4 in (24 cm) baseline stereo vision with 3 MP sensors for high accuracy and low latency real-time applications
  • IP67 industrial-rated vision system with ordering options for color or monochrome sensors, different fields of view, and 1GigE or 5GigE PoE
  • Onboard processing to output a depth map and color data for point cloud conversion and colorization
  • Ability to trigger an external pattern projector and synchronize multiple systems together for more precise 3D depth information
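The onboard depth map can be turned into a point cloud by back-projecting each pixel through the camera intrinsics. A minimal sketch — the intrinsics here are placeholders, not Bumblebee X calibration values:

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) into 3D points in the camera frame."""
    points = []
    for v, row in enumerate(depth):       # v: pixel row
        for u, z in enumerate(row):       # u: pixel column, z: depth
            if z <= 0:
                continue                  # skip invalid depth
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```

Colorization is then a matter of pairing each (u, v) pixel's RGB value with its back-projected point, which is why the camera outputs depth and color together.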

Teledyne FLIR manages a software library with articles, example code, and Windows, Linux, and Robot Operating System (ROS) support. Order requests will be accepted starting at the end of Q2 2024.

Sanctuary AI enters strategic relationship with Magna to build embodied AI robots
https://www.therobotreport.com/sanctuary-ai-enters-strategic-relationship-with-magna-to-build-embodied-ai-robots/
Fri, 12 Apr 2024 13:33:23 +0000
Magna International's relationship with Sanctuary is threefold: as an investor, a contract manufacturer, and an end user.

The Phoenix humanoid robot is being developed to enable embodied AI and support general-purpose applications. | Credit: Sanctuary AI

Humanoid robot developer Sanctuary Cognitive Systems Corp., or Sanctuary AI, is entering a new strategic partnership with automotive components supplier Magna International Inc. Through this expanded partnership, Sanctuary plans to equip Magna’s manufacturing facilities with general-purpose AI robots.

The Vancouver-based company also plans to engage Magna to manufacture the Sanctuary Phoenix robots under contract in the future. Aurora, Ontario-based Magna has been an investor in Sanctuary AI since 2021, and it acquired autonomous vehicle startup Optimus Ride in 2022.

Yesterday’s announcement with Magna follows Accenture’s recent investment in Sanctuary for an undisclosed amount.

Phoenix includes human-like design, AI

“We founded Sanctuary AI with the goal to become the first organization in the world to create human-like AI,” stated Geordie Rose, co-founder and CEO of Sanctuary AI. “World-changing goals like these require world-changing partners.”

“Magna’s position as a world leader in the use of robots today makes this partnership an essential advancement for our mission,” he added. “We’re privileged to be working with Magna, and believe they will be a key element in the successful global deployment of our machines.”

Sanctuary Phoenix includes human-like dexterous hands and arms. Since it launched the robot in May 2023, the company has invested heavily in the development of manipulation capabilities, perception features, and artificial intelligence models that control the humanoid robot.

In December 2023, Sanctuary secured patents for numerous technologies developed both internally and through strategic acquisitions. The company most recently acquired assets from Giant.AI Inc. and Tangible Research.

Sanctuary is iterating on humanoid design by perfecting hand-eye coordination and AI model training. | Credit: Sanctuary AI

Sanctuary AI builds relationship with Magna

“The intent of the relationship [with Magna International] has always been threefold,” Rose told The Robot Report. “One is that they were an investor.”

“Another would be they would participate in manufacturing the robots at some point,” he said. “And the third would be there could be a consumer of the robots as a customer. So all of those three things are obviously related to each other. All of them are good for both parties.”

“So we’ve continued to impress [Magna] with our velocity and acceleration in terms of developing the technology from something that was a twinkle in our eyes six years ago to something that can actually perform real-world work tasks,” Rose noted.

The workflow opportunities for an agile humanoid at Magna are endless, according to Rose. “The key to getting a good fit in the short term is understanding how to overlap that type of analysis with the type of capability that you can deliver,” he said. “So this is a difficult thing for companies that are early stage, including us, because of the ‘drinking your own Kool-Aid’ phenomenon.”

“A lot of companies will release a whole bunch of hype both to their customers, their investors, and internally in themselves — they start to believe that they can do things they can’t, and they make bad decisions about how they position their technology,” Rose continued. “So we have to be clear-eyed about what’s actually possible with our [robot] and then be very diligent in trying to understand the details of how the workflow actually works in practice, and then overlap the two.”

“When you do that with this type of technology, what you find is that the first use cases all fall into the following categories: There is an aspect of mobility, that’s best treated with wheels, where the robot has to move from place to place within an environment. And then there’s the aspect of manipulation,” he explained.

Magna also said its team is excited about the possibilities for intelligent mobile manipulation. It said it expects to automate various tasks and to improve the quality and efficiency of its manufacturing and logistics processes.

“Magna is excited to partner with Sanctuary AI in our shared mission to advance the future of manufacturing,” said Todd Deaville, vice president of advanced manufacturing innovation at Magna. “By integrating general-purpose AI robots into our manufacturing facilities for specific tasks, we can enhance our capabilities to deliver high-quality products to our customers.”


A key success factor for robotics startups

As Sanctuary AI begins the process of commercializing Phoenix, it plans to contract with Magna for the production of part or all of the robot going forward. Sanctuary asserted that high-volume manufacturing is best outsourced to the right partner and should be non-core for any robotics startup.

Many robotics startups often fail when they attempt to manufacture their systems in-house, observed Rose. He said he has sought to find the right production partner since the inception of the company.

Sanctuary employs embodied AI and foundation models

Embodied AI is core to the future of Sanctuary AI, which said it is spending all of its intellectual energy on engineering and training the smartest models for these robots. Rose said he is amazed at the evolution of embodied AI over the past decade.

The real race, according to Rose, is to find a way to gather the immense amount of data needed and put the robot into the necessary training situations for the AI models to learn and grow in confidence.

This is where the enhanced relationship with Magna comes in. Sanctuary's product roadmap over the next year is to deploy all of the production runs of Phoenix robots into real-world manufacturing environments at Magna facilities. In simple terms, Phoenix will learn by executing tasks every day and gathering training data.

“In the run that we’re about to begin with Magna, we’ll be able to collect data in a commercial environment of the sort that will train a production robot,” Rose said. “So the progression of this, from our perspective, is the ability to collect training data to generate autonomous behaviors. The systems that we’re building this year are going to be consumed in data collection.”

In 2025, Sanctuary said it will iterate on a version of the robot for broader use and sale. Similar to the model used at Rose’s prior company, Kindred, there will be a human in the loop to help robots resolve edge cases while minimizing any impact on day-to-day operations.

Rose summed up the current state of development: “We can go from data collection to a trained policy in less than 24 hours now, where the trained policy does as well as or better than the people who are doing the task, for simple tasks. So that is an amazing thing that I was not expecting — these new transformer-based models are spectacularly good at moving robots, way better than I thought they would be.”

“I think it’s an echo of my surprise at how well large language models can generate text; who would have thought that predicting the next token would allow you to be a coherent understander of the world?” he said. “But it seems like that’s the way they work. And in the space of moving robots, if you’ve got enough data, what can’t you do? You can just talk to the robot and say, ‘Do this thing,’ and it will just do it. It’s magical.”

Collaborative Robotics raises $100M in Series B for mysterious mobile manipulator
https://www.therobotreport.com/collaborative-robotics-raises-100m-series-b-funding/
Wed, 10 Apr 2024 13:00:52 +0000
Collaborative Robotics has raised $100M to commercialize its cobot, starting with automating warehouse operations.

Collaborative Robotics has been developing a system for trustworthy operations. Source: Adobe Stock, Photoshopped by The Robot Report

Collaborative Robotics today closed a $100 million Series B round on the road to commercializing its autonomous mobile manipulator. The Santa Clara, Calif.-based company said it is developing robots that can safely and affordably work alongside people in varied manufacturing, supply chain, and healthcare workflows. In many cases, this is the same work that humanoid robots are jockeying for.

Brad Porter, a former distinguished engineer and vice president of robotics at Amazon, founded Collaborative Robotics in 2022. The Cobot team includes robotics and artificial intelligence experts from Amazon, Apple, Meta, Google, Microsoft, NASA, Waymo, and more.

“Getting our first robots in the field earlier this year, coupled with today’s investment, are major milestones as we bring cobots with human-level capability into the industries of today,” stated Porter. “We see a virtuous cycle, where more robots in the field lead to improved AI and a more cost-effective supply chain. This funding will help us accelerate getting more robots into the real world.”

The Robot Report caught up with Porter to learn more about the company and its product since our last conversation in July 2023, when Cobot raised its $30 million Series A.

Nothing to see here

Collaborative Robotics has been secretive about the design of its robot. You won’t find any photos of the cobot on the company’s site or anywhere else on the Web yet.

However, Porter told The Robot Report that it is already in trials with several pilot customers, including a global logistics company. He described the machine as a mobile manipulator, with roughly the stature of a human. However, it’s not a humanoid, nor does it have a six-degree-of-freedom arm or a hand with fingers.

“When talking about general-purpose robots versus special-purpose robots, we know what humanoids look like, but with a new morphology, we want to protect it for a while,” he said. “We’ve been looking at humanoids for a long time, but in manufacturing, secondary material flow is designed around humans and carts. Hospitals, airports, and stadiums are usually designed around people flow. A huge amount of people is still moving boxes, totes, and carts around the world.”

The new cobot’s base is capable of omnidirectional motion with four wheels and a swerve-drive design, along with a central structure that can acquire, carry, and place totes and boxes around the warehouse. It is just under 6 ft. (2 m) tall and can carry up to 75 lb. (34 kg), said Porter.
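In a swerve-drive base, each wheel module steers and drives independently, so a module's command follows directly from the body's commanded twist (vx, vy, ω) and the module's mounting position. An illustrative sketch of that inverse kinematics for one module — this is the generic swerve relation, not Cobot's actual control code:

```python
import math

def swerve_module_command(vx, vy, omega, mx, my):
    """Wheel speed (m/s) and steering angle (rad) for a module at (mx, my).

    vx, vy: commanded body velocity; omega: commanded yaw rate (rad/s).
    Module velocity = body velocity + (omega x module position).
    """
    wx = vx - omega * my
    wy = vy + omega * mx
    speed = math.hypot(wx, wy)
    angle = math.atan2(wy, wx)
    return speed, angle
```

Because all four modules can point in any direction, the base can translate sideways or spin in place, which is what makes the platform omnidirectional.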

The robot can also engage and move existing carts with payloads weighing up to 1,500 lb. (680 kg) around the warehouse. How the robot engages carts remains part of the mystery. But by automating long-distance moves and using existing cart infrastructure, Porter said he believes that the Collaborative Robotics system is differentiated from both mobile robot platforms and humanoid competitors.

“We looked at use cases for humanoids at Amazon, but you don’t actually want the complexity of a humanoid; you want something that’s stable and could move faster than people,” Porter added. “There are orders of magnitude more mobile robots than humanoids in day-to-day use, and at $300,000 to $600,000 per robot, the capital to build the first 10 humanoids is very high. We want to get robots into the field faster.”

Collaborative Robotics has kept its actual robot out of public view. | Source: Adobe Stock image Photoshopped by The Robot Report

Robots must be trustworthy

Porter said he believes that robots need to be trustworthy, in addition to being safe. This philosophy is driving the design and user-interface decisions that the company has made so far. Users need to understand what the robot will do just by looking at it, he said, unlike some of the mobile robot designs currently on the market.

In addition to a human-centered design approach, Collaborative Robotics is using off-the-shelf parts to reduce the robot bill of materials cost and simplify the supply chain as it begins the process of commercialization. It is also taking a “building-block” approach to hardware and plans to adjust software and machine learning for navigation and learning new tasks.

“The robot we’ve designed is 70% off-the-shelf parts, and we can design around existing motors, while every humanoid company is hand-winding its own motors to find advanced actuation capabilities,” Porter noted. “We designed the system digitally, so we don’t have to hand-tweak a bunch of things. By using 3D lidar, we know the state of the art of the technology, and it’s easier to safety-qualify.”

With large language models (LLMs), Porter said he sees the day when someone in a hospital or another facility can just tell a robot to go away. “It’s about user interaction rather than just safety, which is table stakes,” he said. “We think a lot about trustworthiness.”


Collaborative Robotics preps for commercialization

General Catalyst led Collaborative Robotics’ Series B round, with participation from Bison Ventures, Lux Capital, and Industry Ventures. Existing investors Sequoia Capital, Khosla Ventures, Mayo Clinic, Neo, 1984 Ventures, MVP Ventures, and Calibrate Ventures also participated.

Since its founding in 2022, Cobot said it has raised more than $140 million. The company plans to grow its headcount from 35, adding production, sales, and support staffers.

In addition, Collaborative Robotics announced that Teresa Carlson will be joining it as an advisor on go-to-market at scale and industry transformation. She has held leadership roles at Amazon Web Services, Microsoft, Splunk, and Flexport.

“I’m super-excited to be working with Teresa,” said Porter. “We’ve kept up since Amazon, and she thinks a lot about digital transformation at a very large scale — federal government and industry. She brings a wealth of knowledge about economics that will elevate the scope of what we’re doing.”

Paul Kwan, managing director at General Catalyst, is joining Alfred Lin from Sequoia on Collaborative Robotics’ board of directors. 

“In our view, Brad and Cobot are spearheading the future of human-robot interaction,” said Kwan. “We believe the Cobot team is world-class at building the necessary hardware, software, and institutional trust to achieve their vision.”

Editor’s note: Eugene Demaitre contributed to this article.

Rainbow Robotics unveils RB-Y1 wheeled, two-armed robot
https://www.therobotreport.com/rainbow-robotics-unveils-rb-y1-wheeled-two-armed-robot/
Sun, 07 Apr 2024 12:30:07 +0000
Rainbow Robotics launched the RB-Y1, a new bimanual wheeled manipulator, as the company begins the development of AI-driven robots.

RB-Y1 mounts a humanoid-type double-arm robot on a wheeled, high-speed mobile base. | Credit: Rainbow Robotics

Rainbow Robotics announced the release of detailed specifications for the new RB-Y1 mobile robot. The company recently signed a memorandum of understanding with Schaeffler Group and the Korea Electronics Technology Institute, or KETI, to co-develop the RB-Y1 and other mobile manipulators in Korea.

What’s in a name: Wheeled humanoid? Bimanual manipulator?

The past year has seen an explosion in the growth of humanoids, where most of the robots are bipedal and walk on two legs. Likewise, there have been many recent releases of mobile manipulators, or autonomous mobile robots (AMRs) with a single-arm manipulator mounted on the vehicle.

The RB-Y1 is a form of wheeled robot base with a humanoid double-arm robot on top. Rainbow Robotics’ robot uses that base to maneuver through its environment and position the arms for manipulation tasks. The company called this configuration a “bimanual manipulator.”

To perform varied and complex tasks, each arm on the RB-Y1 has seven degrees of freedom, and the robot's single torso has six axes that can move the body. With this kinematic configuration, the body can move more than 50 cm (19.7 in.) vertically, making it possible to perform tasks at various heights.


Rainbow Robotics provides high-speed cornering

The maximum driving speed for the RB-Y1 is 2,500 mm/s (5.6 mph), and the company claims that the robot can accelerate quickly and corner at higher speeds by leaning its body into the turn. To avoid toppling while in motion, the robot dynamically changes the height of its body to keep the center of gravity safely controlled.
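The physics behind leaning into a turn is simple: for the combination of gravity and centripetal force to stay aligned with the body, the lean angle must satisfy θ = atan(a/g), where a = v²/r is the lateral acceleration. A sketch using the RB-Y1's published top speed (the turn radius is illustrative, not a Rainbow Robotics figure):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def lean_angle_deg(speed_mps, turn_radius_m):
    """Lean angle (degrees) that aligns gravity + centripetal force
    with the robot's body axis while cornering."""
    lateral_accel = speed_mps ** 2 / turn_radius_m  # a = v^2 / r
    return math.degrees(math.atan2(lateral_accel, G))

# At the 2.5 m/s top speed on an assumed 2 m radius turn,
# the required lean is roughly 18 degrees.
theta = lean_angle_deg(2.5, 2.0)
```

Lowering the body while leaning also drops the center of gravity, which widens the margin before the support polygon is violated.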

The dimensions of the robots are 600 x 690 x 1,400 mm (23.6 x 27.2 x 55.1 in.), and the unit weighs 131 kg (288.8 lb.). The manipulators can each lift 3 kg (6.61 lb.).

At press time, there were few details about the robot's ability to function autonomously using artificial intelligence, and one early video showed it working via teleoperation. The demonstrations in the video below are likely performed by remote operators.

However, Rainbow Robotics clearly has the goal of making its robot fully autonomous in the future, as more research, development, training, and simulation are completed.

“These days, when Generative AI such as ChatGPT and Figure is a hot topic in the robot industry, we have developed a bimanual mobile manipulator in line with the AI era,” stated a company spokesperson. “We hope that the platform will overcome the limitations of existing industrial robots and be used in many industrial sites.”

Tune in to Automated Warehouse webinar on stationary robots, smart controls
https://www.therobotreport.com/automated-warehouse-webinar-automation-robotics-smart-controls/
Thu, 04 Apr 2024 15:31:29 +0000
This episode explores the integration of stationary robotics and workers in warehouse operations, focusing on trends, gaps, and available offerings.

The fourth installment of the Automated Warehouse research series captures market sentiment about stationary robots. | Credit: WTWH Media

Warehouse operators are grappling with a formidable challenge in the fast-paced logistics world: a severe shortage of available labor. With the increasing demand for operational efficiency, the optimization of warehouse processes has become an imperative rather than simply an objective.

In this fourth session of our Automated Warehouse webinar series, we will explore the current state of stationary robotics, specifically examining how these systems are being integrated with human workers through smart controls. Attendees will learn valuable insights derived from recent bespoke research conducted directly with warehouse operators.

Stationary robots can be found in various workflows, performing a diverse array of tasks. To better understand what kinds of systems are being used in fulfillment operations, distribution centers, and warehouses, we asked respondents about their stationary robot setups. The responses from these participants provide a snapshot of the state of the market.

The session is targeted at robotics OEMs, systems integrators, and warehouse operators. This webinar will be the last in an initial series of research projects that started with mobile robotics and then dug into the digitization of warehouse workflows and how fixed conveyance is being used today.

The webinar is scheduled for Wednesday, April 10, at 2:00 p.m. EDT and will share approaches and examples with warehouse operators seeking to modernize and gain better control over workflows. Attendees will learn more about the following:

Insights from market research: Our experts have conducted a fresh market survey, uncovering the latest trends and developments in warehouse technology. By attending this webinar, you’ll gain exclusive access to this research, providing you with a competitive edge in the industry.

Q&A: You will have an opportunity to have your burning questions answered live.

Register now to save your spot and stay current on the market trends.




The post Tune in to Automated Warehouse webinar on stationary robots, smart controls appeared first on The Robot Report.

Zoox gets ready to launch robotaxi service in Las Vegas https://www.therobotreport.com/zoox-gets-ready-launch-robotaxi-service-las-vegas/ https://www.therobotreport.com/zoox-gets-ready-launch-robotaxi-service-las-vegas/#respond Sat, 30 Mar 2024 10:06:21 +0000 https://www.therobotreport.com/?p=578281 Zoox is expanding its area of robotaxi operations in Las Vegas as it prepares to launch a public service later this year.

The post Zoox gets ready to launch robotaxi service in Las Vegas appeared first on The Robot Report.

a zoox robo taxi turns a corner in Las Vegas.

Zoox is expanding the geofence for its operations in Las Vegas. | Credit: Zoox

Over the past year, Zoox Inc. has made significant progress on its autonomous robotaxi service roadmap.

In February, the Amazon.com subsidiary announced that it completed a key milestone: deploying its robotaxi on open public roads with passengers.

In the shadow of the disappointing news from competitor Cruise, which lost its autonomous operating permit from the California Department of Motor Vehicles (DMV), closed its San Francisco service, and laid off 900 employees, Zoox completed rigorous testing on private roads. It received approval from the California DMV to operate its robotaxi on the state’s public roads.

On the way, Zoox has invested heavily in simulation tools necessary to train the robot drivers to handle any on-road situation. Simulation is the key to safely training AI models and logging thousands of hours of drive time without endangering anyone.

“One [key to our success] is obviously through our test vehicle logged miles,” said Qi Hommes, senior director of system design and mission assurance at Zoox. “We drive our test vehicles with safety drivers quite a bit in our launch-intent areas. And anytime we encounter something unexpected, those are inputs into the development of those simulation scenarios.”

Zoox begins service for employees in California

Zoox claimed that it is the only purpose-built robotaxi permitted on California public roads that is self-certified to the Federal Motor Vehicle Safety Standards (FMVSS). The company recently deployed its employee shuttle service at its headquarters in Foster City, Calif. Zoox will offer the shuttle service to all full-time employees.

“Becoming the first company to operate a purpose-built robotaxi with passengers on open public roads in California is a significant milestone in not only Zoox’s journey, but [also] for the autonomous vehicle industry at large,” stated Aicha Evans, CEO of Zoox, after the DMV approval. “With the announcement of the maiden run of our autonomous employee shuttle, we are adding to the progress this industry has seen over the last year and bringing Zoox one step closer to a commercialized purpose-built robotaxi service for the general public.”

Unlike robotaxi competitors relying on car chassis, Zoox said it has designed its platform from the ground up for autonomous passenger movement. Every design decision was made with the goal of providing a comfortable, interactive experience.

The most obvious difference between Zoox and competitors like Waymo and Cruise is that the Zoox vehicle has no steering wheel. It has large doors on both sides of the vehicle and seats up to four passengers, with riders facing one another.

Zoox robo taxi on the street in Foster City CA.

The Zoox autonomous robo-taxi vehicle is omnidirectional and uses four-corner steering. There is no onboard safety driver. | Credit: Zoox

A rider’s view of the robotaxi

At CES 2024, I interviewed Chris Stoffle, director of industrial and creative design at Zoox, and got a tour of the vehicle on the show floor.

The first thing that I noticed was how quiet it was inside the vehicle. CES is a noisy place. The cacophony of tens of thousands of people talking can be overwhelming, and inside the Zoox vehicle, it was quiet and comfortable. 

Stoffle described the Zoox rider experience: “Right now, we’re inside a robotaxi designed from the ground up to provide the best rider experience. The outside of the vehicle has a smaller footprint than a BMW i3. But inside we have this large space where passengers can sit comfortably across from each other. There’s no bad seat in the vehicle. Each rider can see the map, and adjust the temperature for their seat. In addition, there are USB power ports, drink holders, task lights, and an emergency button to contact help immediately.”

people entering into the cabin of a robotaxi.

Inside the Zoox robotaxi, each rider has their own comfort controls while facing one another for the ride. | Credit: Zoox

Las Vegas operations expand to five-mile radius

Since its February announcement, Zoox has expanded the geofence for its Las Vegas fleet of robotaxis, widening the operational area for transporting employees as it tests the service. The new geofence is a five-mile radius around the company’s Las Vegas headquarters.

The new service area is more complex and includes three-lane roads, harder lane changes, unprotected right turns onto high-speed roadways, and double-right and left-hand turn lanes. 

The robotaxis are now handling more difficult operating situations as the engineering team validates the features and safety of the vehicle operation. In addition, the autonomous vehicles are now driving at speeds of up to 45 mph, in light rain, and at night.

Because the vehicles have no steering wheel and no onboard safety driver, the Zoox team monitors each vehicle remotely. These “human-in-the-loop” operators do not teledrive the vehicles, but they do watch each ride in real time, observing the environment and the vehicle’s intentions as it decides to turn, stop, and move through intersections.

If there is an emergency situation, the remote operator can direct the vehicle how to respond and where to go to safely resolve the situation.

“[Zoox’s vehicle] was approved to drive on public roads last year and now is fully homologated in response to emergency vehicles being able to detect them,” explained Stoffle. “Being able to interact with humans outside the vehicle in safe ways is really important. And so we’ve been able to not only update our sensor pod to improve self-driving in inclement weather with some of our sensor cleaning, but also to bring in a better microphone designed on the exterior so we can detect sirens and first-responder vehicles earlier.”

“The door interface module allows us to interact with those outside the vehicle with the human in the loop, whether it be a rider, someone from the public, or even a first responder,” he added. “We believe that being able to have that human in the loop is the right approach for those off-nominal situations that we’re going to be seeing more and more as we expand on public roads.”

To hear about the development of the perception engine and sensor stack used on the Zoox vehicle, listen to the podcast interview with two of the company’s technology leaders: RJ He and Ryan McMichael.

zoox robo taxi in traffic on las vegas road.

Zoox is incrementally increasing the parameters for operation of its fleet of robotaxis in Las Vegas. | Credit: Zoox


Accenture invests in humanoid maker Sanctuary AI https://www.therobotreport.com/accenture-invests-in-humanoid-maker-sanctuary-ai/ https://www.therobotreport.com/accenture-invests-in-humanoid-maker-sanctuary-ai/#respond Wed, 27 Mar 2024 22:08:22 +0000 https://www.therobotreport.com/?p=578285 Accenture said it sees "huge potential" for humanoids in post and parcel, manufacturing, retail, and logistics warehousing operations.

The post Accenture invests in humanoid maker Sanctuary AI appeared first on The Robot Report.

image of Phoenix humanoid robot, full body, not a render.

Sanctuary’s Phoenix humanoid is being developed to support a variety of tasks. | Credit: Sanctuary AI

In its Technology Vision 2024 report, Accenture said 95% of the executives it surveyed agreed that “making technology more human will massively expand the opportunities of every industry.” Well, today Accenture put its money where its mouth is. Accenture Ventures announced a strategic investment in Sanctuary AI, one of the companies developing humanoid robots.

The Robot Report reached out to Sanctuary to learn more about the investment but hadn’t heard back at press time. This article will be updated if more details are learned. Financial details of the investment were not disclosed.

Vancouver, Canada-based Sanctuary was founded in 2018 by Geordie Rose, Suzanne Gildert, Olivia Norton, and Ajay Agrawal. In December 2023, the company announced the acquisition of intellectual property from Giant.AI Inc. and Tangible Research to improve its touch and grasping technologies.

Humanoids are generating a lot of interest these days. Figure recently raised $675 million and is piloting its humanoid with BMW. Agility Robotics has piloted its Digit humanoid with Amazon and GXO Logistics. Apptronik recently announced a partnership with BMW. NVIDIA also announced a foundation model for humanoids, called GR00T, that is designed to bring robotics and embodied AI together.

Sanctuary focused on dexterous manipulation

While other humanoid developers have focused much of their energy on bipedal locomotion, Sanctuary has taken a different approach with its Phoenix robot. It believes object manipulation is the key to humanoid success in the market and has put most of its energy into hand-eye coordination and AI intelligence to support dexterous manipulation.

Sanctuary has published a series of videos of its robots “doing stuff” on YouTube (see video at the bottom of the story). These videos illustrate the development path of the two-armed humanoid as well as the AI behind the robots’ decision-making.

“AI-powered humanoid robots are essential to reinventing work and supporting human workers as labor shortage is becoming an issue in many countries and industries,” said Joe Lui, Accenture’s global advanced automation and robotics lead. “Sanctuary AI’s advanced AI platform trains robots to react to their environment and perform new tasks with precision in a very short time. We see huge potential for their robots in post and parcel, manufacturing, retail, and logistics warehousing operations, where they could complement and collaborate with human workers and automate tasks that traditional robotics can’t.”

Building an AI brain

Phoenix is powered by Sanctuary’s AI control system, Carbon, which attempts to mimic subsystems found in the human brain. The approach taken by the company also seeks to make the AI actions explainable, as well as editable.

“Robots with human-like intelligence will completely transform the workforce of the future,” said Rose, chief executive officer and co-founder of Sanctuary AI. “By combining Accenture’s expertise in disruptive technology with Sanctuary AI’s industry-leading robotics, we can help some of the biggest companies in the world manage this change and provide the best solutions for its clients.”

Accenture’s growing robotics portfolio

The investment in Sanctuary is the latest move by Accenture to build out a robotics strategy. In January 2024, Accenture and Mujin created a joint venture to help bring robotics to the manufacturing and logistics industries. The new company, called Accenture Alpha Automation, is owned 70% by Accenture and 30% by Mujin, and it combines Mujin’s industrial robotics expertise with Accenture’s digital engineering and manufacturing service, Industry X.

Accenture Alpha Automation is located in Japan, which is a robotics powerhouse. Japan had the fourth-highest robot density of any country in 2022, according to the International Federation of Robotics. Robot density measures the number of operational industrial robots per 10,000 employees in a country.
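The robot-density metric mentioned above is simple arithmetic, and a short sketch makes the definition concrete. Note that the figures below are invented for illustration, not IFR data:

```python
def robot_density(operational_robots: int, employees: int) -> float:
    """Robot density as tracked by the IFR: operational industrial
    robots per 10,000 employees in manufacturing."""
    if employees <= 0:
        raise ValueError("employee count must be positive")
    return operational_robots / employees * 10_000

# Hypothetical country: 300,000 operational robots, 4,000,000 workers.
print(robot_density(300_000, 4_000_000))  # -> 750.0
```

The normalization to 10,000 employees is what lets countries of very different sizes be compared on the same scale.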

Accenture has also built out robotics integration capabilities in recent years. In early 2021, Accenture acquired Pollux, a provider of industrial robotics and automation, in its first acquisition of this kind. At the time of the deal, Pollux had implemented more than 1,000 projects for manufacturing companies, primarily in Brazil, and said it had successfully deployed more than 150 collaborative robots throughout the country.

In 2022, Accenture acquired Eclipse Automation, a provider of custom automation and robotics solutions for manufacturing applications. Eclipse Automation creates automated manufacturing systems for life sciences, industrial equipment, automotive, energy and consumer goods companies.


NVIDIA announces new robotics products at GTC 2024 https://www.therobotreport.com/nvidia-announces-new-robotics-products-at-gtc-2024/ https://www.therobotreport.com/nvidia-announces-new-robotics-products-at-gtc-2024/#respond Tue, 19 Mar 2024 11:02:34 +0000 https://www.therobotreport.com/?p=578193 NVIDIA CEO Jensen Huang wowed the crowd in San Jose with the company's latest processor, AI, and simulation product announcements.

The post NVIDIA announces new robotics products at GTC 2024 appeared first on The Robot Report.

NVIDIA CEO Jensen Huang on stage with a humanoid lineup.

NVIDIA CEO Jensen Huang ended his GTC 2024 keynote backed by life-size images of the various humanoids in development that are powered by the Jetson Orin computer. | Credit: Eugene Demaitre

SAN JOSE, Calif. — The NVIDIA GTC 2024 keynote kicked off like a rock concert yesterday at the SAP Center. More than 15,000 attendees filled the arena in anticipation of CEO Jensen Huang’s annual presentation of the latest product news from NVIDIA.

To build the excitement, the waiting crowd was mesmerized by an interactive, real-time generative art display running live on the main stage screen, driven by prompts from Refik Anadol Studio.

New foundation for humanoid robotics

The big news from the robotics side of the house is that NVIDIA launched a new general-purpose foundation model for humanoid robots called Project GR00T. This new model is designed to bring robotics and embodied AI together while enabling the robots to understand natural language and emulate movements by observing human actions.

GR00T training model diagram.

Project GR00T training model. | Credit: NVIDIA

GR00T stands for “Generalist Robot 00 Technology,” and with the race for humanoid robotics heating up, this new technology is intended to help accelerate development. GR00T is a large multimodal model (LMM) providing robotics developers with a generative AI platform to begin the implementation of large language models (LLMs).

“Building foundation models for general humanoid robots is one of the most exciting problems to solve in AI today,” said Huang. “The enabling technologies are coming together for leading roboticists around the world to take giant leaps towards artificial general robotics.”

GR00T uses the new Jetson Thor

As part of its robotics announcements, NVIDIA unveiled Jetson Thor for humanoid robots, based on the NVIDIA Thor system-on-a-chip (SoC). Significant upgrades to the NVIDIA Isaac robotics platform include generative AI foundation models and tools for simulation and AI workflow infrastructure.

The Thor SoC includes a next-generation GPU based on NVIDIA Blackwell architecture with a transformer engine delivering 800 teraflops of 8-bit floating-point AI performance. With an integrated functional safety processor, a high-performance CPU cluster, and 100GB of Ethernet bandwidth, it can simplify design and integration efforts, claimed the company.

Image of a humanoid robot.

Project GR00T, a general-purpose multimodal foundation model for humanoids, enables robots to learn different skills. | Credit: NVIDIA

NVIDIA showed humanoids in development with its technologies from companies including 1X Technologies, Agility Robotics, Apptronik, Boston Dynamics, Figure AI, Fourier Intelligence, Sanctuary AI, Unitree Robotics, and XPENG Robotics.

“We are at an inflection point in history, with human-centric robots like Digit poised to change labor forever,” said Jonathan Hurst, co-founder and chief robot officer at Agility Robotics. “Modern AI will accelerate development, paving the way for robots like Digit to help people in all aspects of daily life.”

“We’re excited to partner with NVIDIA to invest in the computing, simulation tools, machine learning environments, and other necessary infrastructure to enable the dream of robots being a part of daily life,” he said.

NVIDIA updates Isaac simulation platform

The Isaac tools that GR00T uses are capable of creating new foundation models for any robot embodiment in any environment, according to NVIDIA. Among these tools are Isaac Lab for reinforcement learning, and OSMO, a compute orchestration service.

Embodied AI models require massive amounts of real and synthetic data. The new Isaac Lab is a GPU-accelerated, lightweight, performance-optimized application built on Isaac Sim for running thousands of parallel simulations for robot learning.

simulation screen shots.

NVIDIA software — Omniverse, Metropolis, Isaac, and cuOpt — combine to create an ‘AI gym’ where robots and AI agents can work out and be evaluated in complex industrial spaces. | Credit: NVIDIA

To scale robot development workloads across heterogeneous compute, OSMO coordinates the data generation, model training, and software/hardware-in-the-loop workflows across distributed environments.

NVIDIA also announced Isaac Manipulator and Isaac Perceptor — a collection of robotics-pretrained models, libraries and reference hardware.

Isaac Manipulator offers dexterity and modular AI capabilities for robotic arms, with a robust collection of foundation models and GPU-accelerated libraries. It can accelerate path planning by up to 80x, and zero-shot perception increases efficiency and throughput, enabling developers to automate a greater number of new robotic tasks, said NVIDIA.

Among early ecosystem partners are Franka Robotics, PickNik Robotics, READY Robotics, Solomon, Universal Robots, a Teradyne company, and Yaskawa.

Isaac Perceptor provides multi-camera, 3D surround-vision capabilities, which are increasingly being used in autonomous mobile robots (AMRs) adopted in manufacturing and fulfillment operations to improve efficiency and worker safety. NVIDIA listed companies such as ArcBest, BYD, and KION Group as partners.




‘Simulation first’ is the new mantra for NVIDIA

A simulation-first approach is ushering in the next phase of automation. Real-time AI is now a reality in manufacturing, factory logistics, and robotics. These environments are complex, often involving hundreds or thousands of moving parts. Until now, it was a monumental task to simulate all of these moving parts.

NVIDIA has combined software such as Omniverse, Metropolis, Isaac, and cuOpt to create an “AI gym” where robots and AI agents can work out and be evaluated in complex industrial spaces.

Huang demonstrated a digital twin of a 100,000-sq.-ft. warehouse — built using the NVIDIA Omniverse platform for developing and connecting OpenUSD applications — operating as a simulation environment for dozens of digital workers and multiple AMRs, vision AI agents, and sensors.

Each mobile robot, running the NVIDIA Isaac Perceptor multi-sensor stack, can process visual information from six sensors, all simulated in the digital twin.

robots working together in a warehouse.

An AMR and a manipulator work together to enable AI-based automation in a warehouse powered by NVIDIA Isaac. | Credit: NVIDIA

At the same time, the NVIDIA Metropolis platform for vision AI can create a single centralized map of worker activity across the entire warehouse, fusing data from 100 simulated ceiling-mounted camera streams with multi-camera tracking. This centralized occupancy map can help inform optimal AMR routes calculated by the NVIDIA cuOpt engine for solving complex routing problems.

cuOpt, an optimization AI microservice, solves complex routing problems with multiple constraints using GPU-accelerated evolutionary algorithms.
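cuOpt's own API isn't shown here, but the class of problem it accelerates can be sketched with a toy single-vehicle example. The warehouse coordinates below are invented for illustration; a greedy nearest-neighbor pass produces a feasible (not optimal) route of the kind an evolutionary solver would then refine under additional constraints:

```python
import math

def nearest_neighbor_route(depot, stops):
    """Greedy single-vehicle route: from the depot, always visit the
    closest unvisited stop next, then return to the depot. Feasible
    but generally suboptimal -- solvers like cuOpt search far beyond
    this baseline, with capacity and time-window constraints."""
    route, current = [depot], depot
    remaining = list(stops)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)  # close the loop back at the depot
    return route

# Invented pick locations on a warehouse floor grid.
depot = (0.0, 0.0)
stops = [(4.0, 3.0), (1.0, 1.0), (5.0, 0.0)]
print(nearest_neighbor_route(depot, stops))
# -> [(0.0, 0.0), (1.0, 1.0), (4.0, 3.0), (5.0, 0.0), (0.0, 0.0)]
```

In a real deployment, the inputs would come from the live occupancy map described above rather than hard-coded coordinates, and routes would be recomputed as conditions change.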

All of this happens in real time, while Isaac Mission Control coordinates the entire fleet using map data and route graphs from cuOpt to send and execute AMR commands.

NVIDIA DRIVE Thor for robotaxis

The company also announced NVIDIA DRIVE Thor, which now supersedes NVIDIA DRIVE Orin as a SoC for autonomous driving applications.

Multiple autonomous vehicles are using NVIDIA architectures, including robotaxis and autonomous delivery vehicles from companies such as Nuro, XPeng, WeRide, Plus, and BYD.


Mercedes-Benz testing Apollo humanoid https://www.therobotreport.com/mercedes-benz-testing-apollo-humanoid/ https://www.therobotreport.com/mercedes-benz-testing-apollo-humanoid/#respond Fri, 15 Mar 2024 09:29:14 +0000 https://www.therobotreport.com/?p=578172 Mercedes is exploring how to use Apptronik's humanoid for automating some low-skill, physically challenging tasks.

The post Mercedes-Benz testing Apollo humanoid appeared first on The Robot Report.

Apptronik Apollo moves a tote.

Apollo moving a tote at the Mercedes factory. | Credit: Apptronik

Apptronik today announced that leading automotive brand Mercedes-Benz is testing its Apollo humanoid robot. As part of the agreement, Apptronik and Mercedes-Benz will collaborate on identifying applications for Apollo in automotive settings.

Mercedes-Benz is exploring how well Apollo can bring parts to the production line for workers to assemble, while simultaneously inspecting the components. Apollo will also be tested at delivering totes of kitted parts later in the manufacturing process.

“When we set out to build Apollo, an agreement like the one we’re announcing today with Mercedes-Benz was a dream scenario,” Jeff Cardenas, co-founder & CEO of Apptronik, said in a press statement. “Mercedes plans to use robotics and Apollo for automating some low-skill, physically challenging, manual labor – a model use case which we’ll see other organizations replicate in the months and years to come.”

The Robot Report reached out to both Apptronik and Mercedes to learn more, but hadn’t heard back at press time. At the moment, it is unclear how many Apollos are being tested, if multiple automotive plants are involved, and what the extent of this partnership is. The photos Apptronik shared show Apollo in a plant in Hungary.

Humanoid race heats up

Apptronik unveiled Apollo in August 2023 and is one of the early innovators in the humanoid race, joining the likes of Agility Robotics’ Digit and Tesla Optimus. Since August, additional manufacturers have thrown their collective hats in the humanoid ring, including 1X, Figure AI, Sanctuary AI, Unitree, LimX and Fourier Intelligence.

The robotics market is at a unique point in time where all of the enabling technologies that make a humanoid viable are coming together. This includes the huge leaps in AI maturity and model training over the last two years, the power of edge compute, battery capacity, and the maturation of legged motion algorithms.

But there are at least two big looming questions: “What can humanoids reliably do?” and “Does it make sense for robots to be on legs versus wheels?” A key milestone for these humanoid manufacturers is to pass these early tests and secure reference customers that validate the robot’s functionality and help guide the product roadmap.

Automotive manufacturing is the leading adopter of robotics worldwide. Automotive manufacturers are under pressure to improve quality and reduce costs, all while struggling with labor issues. It makes perfect sense that the automotive market would explore how humanoids could help. Optimus will be tested at Tesla’s own manufacturing plants, Figure has an agreement with BMW, and Agility Robotics has landed pilots with Amazon and GXO Logistics.

Why humanoids? Why now?

Back to the question of “Why humanoids?” Apptronik said the addition of humanoids to factories and plants would allow organizations like Mercedes-Benz to deploy robotics that are optimized to perform in spaces that are designed for humans, thus avoiding full-scale facility redesigns that are built around robots rather than people. In short, this approach centers on automating some physically demanding, repetitive, and dull tasks for which it is increasingly hard to find reliable workers.

Apollo, which has a form factor that roughly matches the size of a human worker (5 feet 8 inches tall and 160 lb with the ability to lift 55 lb), is built to operate in industrial spaces. Combined with a unique force control architecture that maintains safe operation around people (similar to a collaborative robot versus a traditional industrial robot), Apollo’s design allows it to work alongside people while simultaneously taking on physically demanding tasks, Apptronik said.

woman and apollo humanoid.

Apollo can safely work side by side and collaboratively with humans on the production line. | Credit: Apptronik

“To build the most desirable cars, we continually evolve the future of automotive production. Advancements in robotics and AI open up new opportunities for us. We are exploring new possibilities with the use of robotics to support our skilled workforce in manufacturing,” said Jörg Burzer, member of the board of management of Mercedes-Benz Group AG, production, quality & supply chain management. “This is a new frontier, and we want to understand the potential both for robotics and automotive manufacturing to fill labor gaps in areas such as low skill, repetitive and physically demanding work and to free up our highly skilled team members on the line to build the world’s most desirable cars.”

Jonathan Hurst, co-founder and chief robot officer of Agility Robotics, will keynote the Robotics Summit & Expo, which runs May 1-2 in Boston. The event expects more than 5,000 attendees and is designed to help robotics engineers overcome the technical challenges of building commercial robots. The Robotics Summit & Expo is produced by The Robot Report and parent company WTWH Media.

Hurst’s keynote on May 1 from 9:00 to 9:45 a.m. ET is titled “Humanoid Robots Get to Work.” It will explore the technological breakthroughs propelling humanoids like Digit into real-world use cases. Attendees will learn about the ongoing challenges and opportunities and will go inside Digit’s first pilots.


Intuitive secures FDA clearance for da Vinci 5 surgical robot https://www.therobotreport.com/intuitive-secures-fda-clearance-for-da-vinci-5-surgical-robot/ https://www.therobotreport.com/intuitive-secures-fda-clearance-for-da-vinci-5-surgical-robot/#respond Thu, 14 Mar 2024 23:34:56 +0000 https://www.therobotreport.com/?p=578168 Intuitive Surgical has received FDA 510(k) clearance for its next-generation da Vinci 5 surgical robotics system.

The post Intuitive secures FDA clearance for da Vinci 5 surgical robot appeared first on The Robot Report.

Intuitive surgical Da Vinci surgical system.

The complete Intuitive Surgical da Vinci 5 surgical robot system. | Credit: Intuitive Surgical

Intuitive Surgical (NASDAQ: ISRG) announced today that it secured FDA 510(k) clearance for its next-generation da Vinci 5 multiport surgical robotics system. 

The news comes less than two months after the dominant surgical robotics developer disclosed that it had submitted for the much-anticipated clearance, revealing the name of the new system in the process. (Here is our roundup of top surgical robotics companies.)

Da Vinci joins a deep portfolio of solutions from Intuitive

The da Vinci 5 joins Intuitive’s existing da Vinci robotic surgical system portfolio alongside the multiport X and Xi systems and the single-port SP. There is also Ion, Intuitive’s robotic-assisted platform for minimally invasive biopsy in the lung.

“We are pleased to receive FDA clearance for our fifth-generation robotic system, da Vinci 5,” CEO Gary Guthart said in a news release after the market closed.

“Intuitive is committed to meaningful improvements in surgery that enable better patient outcomes, enhance the patient and care team experiences, and ultimately lower the total cost of care,” he continued. “After more than a decade of careful research, design, development, and testing, we believe da Vinci 5 will deliver on these goals and help drive the future of robotic-assisted surgery.”

Da Vinci 5 features:

Intuitive shared more details about how the da Vinci 5 improves on previous da Vinci robots.

Intuitive Surgical da Vinci 5 tower.

The Intuitive Surgical da Vinci 5 tower [Image courtesy of Intuitive Surgical]

  • The company says new surgeon controllers and powerful vibration and tremor controls make the da Vinci 5 the smoothest and most precise system it has developed to date.
  • Intuitive says the da Vinci 5 has a next-generation 3D display and image processing, providing a high-quality and natural imaging experience. The goal is to enable surgeons to see more today and to support future generations of surgical endoscopes and vision software.
  • Da Vinci 5 introduces Force Feedback technology and optional instruments that enable the system to measure, and surgeons to feel, subtle forces exerted on tissue during surgery. Intuitive says this is something no other surgical technology in any modality presently offers. In preclinical trials, surgeries with Force Feedback demonstrated up to 43% less force exerted on tissue, which could result in less tissue trauma.
  • In addition to potentially less tissue trauma, Force Feedback will add an important new data stream to surgical data science, according to Intuitive. (Note: The Force Feedback instruments are optional for use with da Vinci 5; they are cleared for many of the same procedures as da Vinci Xi.)
Intuitive Surgical da Vinci 5 insufflator.

The Intuitive Surgical da Vinci 5 insufflator [Image courtesy of Intuitive Surgical]

  • Da Vinci 5 also has features to help increase surgeon autonomy and streamline surgeon and care team workflow, boosting healthcare efficiency. For example, da Vinci 5 integrates key OR technologies, including insufflation and an electrosurgical unit. There is also an optimized user interface, with settings accessible by the broader surgical team and by the surgeon directly from the head-in menu. Surgeons can also access other key settings while head-in to help them stay focused on the surgical field.

  • Da Vinci 5 has more than 10,000 times the computing power of da Vinci Xi. Intuitive says the greatly boosted computing power enables innovative new system capabilities and advanced digital experiences — now and in the future. There is integration with Intuitive’s My Intuitive app, SimNow (virtual reality simulator), Case Insights (computational observer), and Intuitive Hub (edge computing system).
  • Features to increase surgeon comfort include a redesigned console capable of customizable positioning, allowing surgeons to find their best fit for surgical viewing and comfort. Surgeons even have the ability to sit completely upright. The surgeon can make any necessary adjustments while their head is in the console. In addition, there are options designed to fit different body types, including surgeons who are pregnant.

The Intuitive Surgical da Vinci 5’s surgeon console [Image courtesy of Intuitive Surgical]

Limited availability for initial release

Da Vinci 5 will initially be available to a small number of customers in the U.S. who collaborated with Intuitive during the development period and those with mature robotic surgery programs. The company’s goal is to work with surgeons at these initial sites to generate additional data on the system’s use before a wider commercial introduction.

“We strive to provide customers with technology that meets their needs and solves important problems,” said Intuitive Chief Medical Officer Dr. Myriam Curet. “We intend to launch da Vinci 5 more broadly in the U.S. and globally after we learn from and work with an initial smaller number of customers directly.”

Curet discussed the secret behind Intuitive’s surgical robotics success last year with our sibling site Medical Design & Outsourcing.

For an inside look at what the da Vinci 5 is capable of, register now to attend DeviceTalks Boston. Intuitive EVP and Chief Digital Officer Brian Miller will review some of the futuristic functionality of the new system in a closing keynote, “Intuitive: The Future is Now with da Vinci 5.”




The post Intuitive secures FDA clearance for da Vinci 5 surgical robot appeared first on The Robot Report.

LG Electronics invests $60M in service robot maker Bear Robotics https://www.therobotreport.com/lg-makes-strategic-investment-in-bear-robotics/ https://www.therobotreport.com/lg-makes-strategic-investment-in-bear-robotics/#respond Wed, 13 Mar 2024 23:39:07 +0000 https://www.therobotreport.com/?p=578144 LG said its investment into Bear Robotics follows its roadmap for standardized service platforms and software-defined robotics.

The post LG Electronics invests $60M in service robot maker Bear Robotics appeared first on The Robot Report.


LG Electronics is making a strategic investment to expedite the advancement of Bear Robotics’ capabilities in service robotics, a key new business area for the company. | Credit: Bear Robotics

LG Electronics Inc. is strategically investing in the development of service robots. Bear Robotics Inc. this week said that it has received $60 million in Series C funding led by LG.

LG said the investment is intended to build its portfolio for sustained growth rather than immediate gains. After finalizing the stock purchase, LG will be the largest single shareholder of Bear Robotics.

“In the service robotics market, we’re focusing primarily on areas such as delivery and logistics,” stated William Cho, CEO of LG. “However, we are carefully considering future directions, keeping open the possibility of equity investments or mergers and acquisitions.”

Bear Robotics focuses on restaurants

Founded in 2017, Bear Robotics produces AI-powered indoor delivery robots that serve the U.S., South Korean, and Japanese markets. CEO John Ha is a former Google technical lead and senior software engineer.

Bear Robotics’ co-founder and chief technology officer also has engineering experience at prominent software companies. The company’s products include fleet management software, cloud-based control systems, and evolving platforms for service robots.

The Redwood City, Calif.-based company introduced a larger autonomous mobile robot (AMR) with Servi Plus in March 2023.




LG leads the shift to software-defined robotics

LG said it is embracing a transition to software-defined robotics (SDR), which emphasizes software more than hardware, similar to what has been seen in the mobility business. To prepare for future growth, the company said it is dedicated to creating scalable service robots on an open-design software platform that can work in a range of settings.

Seoul, South Korea-based LG said that it understands the importance of standardizing AI-based AMRs and that its investment in Bear Robotics is an opportunity to grow its robot business. The company has already installed service robots in airports, hotels, restaurants, hospitals, retail outlets, museums, smart warehouses, and golf courses.

At LG Future Park in Gumi, South Korea, LG said it produces service robots with “world-class” quality, supply chain, and customer-service capabilities.

By combining Bear Robotics’ research and development and software expertise with its own strengths, LG said it will lead efforts to standardize robot platforms to significantly reduce market-entry costs. This will enhance operational efficiency and foster synergies, it asserted.

“Just as Android revolutionized the smartphone era, standardized open platforms are essential for the activation of the robot market,” remarked Ha.

Looking for growth in the service robotics market

LG said that market trends have led it to reallocate resources to high-growth industries in recent years. It claimed that its investment in Bear Robotics shows its commitment to advancing the service robot industry.

Since deploying guide robots at Incheon International Airport in 2017, LG has introduced systems tailored to diverse commercial applications such as delivery and disinfection. The company noted that it has actively pursued expansion into international markets including the U.S., Japan and Southeast Asia.

LG announced its Future Vision 2030 last year to become a “smart life solution company” that connects and expands customer experiences across residential, commercial, transportation, and virtual worlds. LG’s “Triple Seven” target is an average growth rate and operating profit margin of 7% or higher, together with an enterprise value-to-EBITDA ratio of 7.

“As the service robot market enters a period of growth, this equity investment will significantly contribute to securing a ‘winning competitive edge’ for the company,” said Lee Sam-soo, chief strategy officer at LG Electronics. “From a mid- to long-term perspective, we will seek to develop our robot business into a new growth engine, exploring various opportunities through the integration of cutting-edge technologies such as embodied AI and robotic manipulation.”


Schneider Electric unveils new Lexium cobots at MODEX 2024 https://www.therobotreport.com/schneider-electric-unveils-new-lexium-cobots-at-modex-2024/ https://www.therobotreport.com/schneider-electric-unveils-new-lexium-cobots-at-modex-2024/#respond Mon, 11 Mar 2024 19:36:53 +0000 https://www.therobotreport.com/?p=578122 Schneider Electric unveils Lexium cobots at MODEX 2024, offering high-speed motion, 130-axis control, and cost-effective pricing, utilizing EcoStruxure architecture for collaborative data flow.

The post Schneider Electric unveils new Lexium cobots at MODEX 2024 appeared first on The Robot Report.


The Lexium cobot product line, from left to right: RL3, RL5, RL7, and RL12 (not to scale). | Credit: Schneider Electric

Schneider Electric today at MODEX announced the release of two new collaborative robots, the Lexium RL 3 and RL 12, with the Lexium RL 18 model coming later this year. From single-axis machines to high-performance, multi-axis cobots, the Lexium line enables high-speed motion and control of up to 130 axes from one processor, allowing precise positioning to help manufacturers solve production, flexibility, and sustainability challenges, the company said.

“As U.S. manufacturing increases, the demand for smart machines is growing, and customers are requiring robots with digital twin capabilities that validate machine performance to help them quickly increase production consistently, efficiently, and sustainably,” stated Christine Bush, leader of the Robotics Center of Excellence at Schneider Electric.

“We are partnering with our customers to understand their challenges and pain points, then responding with complete, customized automation solutions – from power products and HMIs [human-machine interfaces] to PLCs [programmable logic controllers] and robotics – to simplify the process and meet their needs,” she added.

In addition to robots, Schneider Electric said it offers digitally engineered automation from concept to operation and maintenance. The company‘s EcoStruxure architecture connects smart devices, controls, software, and services to enable collaborative data flow from shop-floor to top-floor machine control.

Schneider Electric also said it provides robots using Modicon motion controllers, which combine PLC, motion, and robotics control on a single hardware platform with the EcoStruxure Machine Expert software.


Lexium digital twins offer layout, programming

The Lexium cobots feature a positioning accuracy of +/- 0.02 mm (+/- 0.00079 in.) and payloads ranging from 3 to 18 kg (6.6 to 39.6 lb.), with the largest model coming soon. The cobots range in price from $27,368 to $41,170.

The Lexium Cobot product line is compatible with EcoStruxure Machine Expert Twin, a software suite that creates digital models of real machines. It allows for virtual test strategies and commissioning, as well as shortened factory acceptance testing (FAT).

Digitizing these processes can reduce time-to-market by up to 50% and commissioning time by up to 60%, according to Schneider Electric. A 20% to 40% savings in investment costs can also be realized due to faster time to market, the company claimed.

Robot assembly, installation, and maintenance are faster with increased computational power, open software, and networking, it said. Software and automation work together using a centralized architecture and open-standard programming platform.

In addition to fast deployment, collaborative robots allow for more ergonomic work, easier integration with existing equipment and processes, and consistent output for higher product quality, said Schneider Electric. It also cited the benefit of reduced workplace strain and injuries.

For U.S. manufacturers to stay globally competitive, they must modernize processes by embracing the digitization of Industry 4.0, which includes advances in AI, machine learning, the Internet of Things (IoT), and digital twins, said Schneider Electric. The latest software can help makers develop new ideas quickly, cut down on working time, and meet shifting customer demand, it noted.

As the business world moves toward Industry 5.0, Schneider Electric said its Lexium motion and robot offerings could change how manufacturing is done and make personalized, automated production possible.





Afara launches autonomous picker to clean up after cotton harvest https://www.therobotreport.com/afara-launches-autonomous-picker-to-clean-up-after-cotton-harvest/ https://www.therobotreport.com/afara-launches-autonomous-picker-to-clean-up-after-cotton-harvest/#respond Sat, 09 Mar 2024 14:05:38 +0000 https://www.therobotreport.com/?p=578115 AFARA-COTTON uses a variety of sensors to autonomously detect and pick up cotton dropped during mechanical harvest.

The post Afara launches autonomous picker to clean up after cotton harvest appeared first on The Robot Report.


Afara Agricultural Technologies Inc. has developed AFARA-COTTON, an autonomous mobile robot designed to collect cotton spilled on the ground after the mechanical harvesting process. The Turkish company has also developed automation for seeding, irrigation, disinfestation, and weeding.

According to Afara, 5% to 20% of annual cotton yields are unpicked by mechanical harvesters or are dropped to the ground during harvest. This valuable resource is currently either wasted or must be gathered by hand, it said.

The company said AFARA-COTTON is an all-electric, self-driving platform to address this waste. It is currently selling its systems only in Turkey and select European countries.




AFARA-COTTON cleans two rows at once

Computer engineer Ömer Muratlı, the son of a farmer, invented the Afara Agricultural Robot in 2019. His family, which harvested cotton, had asked him to build a robot to collect remaining cotton. He patented the agricultural platform and brought it to the working prototype stage.

Afara Agricultural Technologies was established in 2023 with investment from the crowdfunding platform Fonlabüyüsün. The company is continuing work on a mass-production model.

In a video animation, Afara shows that the robot has four cameras, two lidar sensors, and ultrasonic sensors. The cameras scan the ground looking for stray seed cotton.

An AI-based perception model identifies the seed cotton and deploys a suction cup to vacuum it up and collect it in a central baling area in the middle of the machine. The startup said it is targeting a 90% efficiency rate.

AFARA-COTTON autonomously traverses the field, avoiding obstacles as it cleans up two rows at a time. Currently, as the robot completes a row, the operator needs to manually realign it onto the next row and reinitiate the collection process.

The robot can accumulate up to 200 kg (440 lb.) before it needs to be emptied. While collecting the cotton, the picker drives up to 3.2 kph (2 mph), and it can operate for up to six hours on a single charge. The current two-row model will be available for €120,000 to €130,000 ($131,275 to $142,231 U.S.).


AFARA-COTTON is designed to clean up cotton wastage from the field after the harvest. | Credit: AFARA


Figure AI raises whopping $675M to commercialize humanoids https://www.therobotreport.com/figure-ai-raises-675m-to-commercialize-humanoids/ https://www.therobotreport.com/figure-ai-raises-675m-to-commercialize-humanoids/#respond Thu, 29 Feb 2024 13:00:54 +0000 https://www.therobotreport.com/?p=578023 Figure AI has raised funding from Amazon, Microsoft, NVIDIA, Open AI, and others to accelerate humanoid development, AI training, manufacturing, and hiring.

The post Figure AI raises whopping $675M to commercialize humanoids appeared first on The Robot Report.


We declared 2023 the year of the humanoid, but 2024 has already said, “Hold my beer.” You likely read leaked reports about this last week, but today Figure AI Inc. confirmed that it has raised $675 million to develop humanoids.

With the Series B funding, the Sunnyvale, Calif.-based company is now valued at $2.6 billion. Microsoft, OpenAI Startup Fund, NVIDIA, Amazon Industrial Innovation Fund, Jeff Bezos (through Bezos Expeditions), Parkway Venture Capital, Intel Capital, Align Ventures, and ARK Invest were among the investors. Qatalyst Partners provided strategic and financial advice to Figure.

Figure has been on a blistering path to market since it exited stealth mode in January 2023. The company was co-founded in 2022 by CEO Brett Adcock, a startup veteran with two successful exits under his belt. Adcock asserted that time to market and hiring the right people are two keys to the success of any startup.

Figure said it plans to use the funding to grow its AI training, manufacture more robots, and hire the engineers necessary to get production units to market in the 2024-2025 timeframe.

“Our vision at Figure is to bring humanoid robots into commercial operations as soon as possible,” Adcock said in a release. “This investment, combined with our partnership with OpenAI and Microsoft, ensures that we are well-prepared to bring embodied AI into the world to make a transformative impact on humanity. AI and robotics are the future, and I am grateful to have the support of investors and partners who believe in being at the forefront.”

Adcock builds on experience, teamwork

Adcock previously founded Vettery and Archer Aviation. Vettery is an online talent marketplace that Adecco Group acquired in 2018 for $100 million. And Archer Aviation is a publicly traded company that is developing electric vertical takeoff and landing (eVTOL) aircraft.

Before Adcock started Figure AI, he visited the AMBER bipedal robotics lab of Aaron Ames at Caltech. Ames is one of the early researchers in bipedal walking mechanics and was a student of Marc Raibert, who founded Boston Dynamics in 1992 and is largely responsible for the development of the ATLAS humanoid. Adcock wanted Ames’ advice on how difficult it would be to commercialize a humanoid robot.

Figure has hired industry veterans such as Jerry Pratt from the Institute for Human and Machine Cognition (IHMC). Pratt brings more than 20 years of experience in humanoid development to Figure as chief technology officer. He was also associated with Raibert’s MIT Leg Lab.

Figure said the knowledge and experience of its team of about 80, which also includes veterans from Boston Dynamics, Google DeepMind, and Tesla, have helped accelerate its bipedal walking development. Its stated goal is autonomous, general-purpose humanoid robots to address labor shortages, unsafe or undesirable jobs, and global supply chain needs.




A startup mature beyond its years

The Robot Report visited Figure AI in October 2023 to meet Adcock and see its humanoid robot for the first time. I was impressed by the organization of the company‘s product prototyping process. What I witnessed was a methodical approach to iterative development that included an in-house CNC metalworking shop and a 3D-printing farm. 

To help accelerate software development and debug all of the electrical control systems, the lab included a completely deconstructed robot. The robot was dissected with all of its electrical systems arranged across a table. For any engineer who has tried to troubleshoot a complex electromechanical system, this is a well-known best practice.


Figure AI timeline


Last year, nearly a dozen companies worldwide emerged to pursue the humanoid robot dream. In about 21 months, Figure has been able to go from nothing to a walking humanoid prototype. 

In January 2024, BMW began testing a Figure robot at its automotive factory in Spartanburg, S.C. This milestone made Figure the second company to land a humanoid pilot with a high-profile client. Agility Robotics announced in late 2023 pilots with both Amazon and GXO Logistics.

Note that Figure is also the second humanoid developer to get funding from the Amazon Industrial Innovation Fund. The Amazon investment arm was part of Agility’s Series B in April 2022.

Figure and OpenAI to collaborate on AI

Figure AI said it will work with OpenAI on the next generation of AI models for humanoids. This will be done by combining OpenAI’s language research with Figure’s robotics hardware and software expertise. 

“We’ve always planned to come back to robotics, and we see a path with Figure to explore what humanoid robots can achieve when powered by highly capable multimodal models,” stated Peter Welinder, vice president of product and partnerships at OpenAI. “We’re blown away by Figure’s progress to date, and we look forward to working together to open up new possibilities for how robots can help in everyday life.”

Back in 2019, OpenAI demonstrated impressive in-hand manipulation skills with the ability to solve a Rubik’s cube using neural networks. In mid-2021, the company abandoned research into robotics hardware in favor of AI research. In addition, OpenAI was among the funders of humanoid developer 1X Technologies.

Figure said it will use Microsoft Azure for AI training, storage, and servers.

“We are excited to work with Figure to speed up research into AI breakthroughs,” said Jon Tinter, corporate vice president of business development at Microsoft. “Through our work together, Figure will have access to Microsoft‘s AI infrastructure and services to support the deployment of humanoid robots to assist people with real-world applications.”

