Universal Robots and Scale AI launch imitation learning system for training industrial robots

Universal Robots (UR) has unveiled the UR AI Trainer at GTC 2026 in Silicon Valley. Developed in collaboration with Scale AI, the AI Trainer marks a tectonic shift as robots move from pre-programmed applications to fully AI-driven tasks. These systems are powered by robust data generated in AI training cells where “robots imitate humans”. “Our customers, ranging […] https://lnkd.in/eCUZW3nq
Universal Robots Launches AI Trainer with Scale AI
More Relevant Posts
“Our customers, ranging from large enterprises to AI research labs, are no longer just asking for AI features,” said Anders Billesø Beck, VP of AI Robotics Products at Universal Robots. “They need a way to collect high-fidelity, synchronized robot and vision data to train AI models on the same robots they intend to deploy. Our AI Trainer is the industry's first direct lab-to-factory solution for AI model training.” #PhysicalAI #UniversalRobots #ScaleAI #AIRobotics #LabToFactory #ImitationLearning #SmartManufacturing #NVIDIAGTC #Industry40 #Automation #FutureOfWork https://lnkd.in/dsXnY4r5
Someone just raised $450 million to teach robots by making them watch YouTube. And investors valued the whole thing at $1.7 BILLION. The company is called Rhoda AI. They just emerged from 18 months of stealth, and the idea behind it is genuinely clever.

Teaching a robot to navigate the real world through traditional data collection is painfully slow and expensive. The standard method is teleoperation: humans remotely controlling robots over and over to generate training data. It works. But it doesn't scale. And it completely falls apart the moment something unexpected happens.

So Rhoda asked a different question. The internet already has hundreds of millions of hours of humans doing EXACTLY the things robots need to learn. Opening doors. Stacking shelves. Carrying boxes. Navigating crowds. Sorting objects. Their CEO Jagdeep Singh put it perfectly: "If the phone orientation changes, that might be enough to cause a teleoperation-trained model to fail. Whereas in our case, the model has seen so many examples of objects at different orientations it's able to generalize." Why build the training data from scratch when billions of hours of it already exist?

The system they built is called Direct Video Action. It pre-trains on hundreds of millions of public internet videos to build a foundational understanding of physics, motion and real-world interaction, then combines that with a smaller amount of actual robot data to fine-tune execution. The system continuously observes its surroundings, predicts how the environment may change, and translates those predictions into robotic actions every few hundred milliseconds in a closed feedback loop.

Isn’t that insane? I mean it’s scary but it’s also cool lol. And it's already been tested in a real automotive manufacturing facility.
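The closed observe-predict-act loop described above can be sketched in a few lines. Rhoda has published no code or API, so everything below, the 1-D "environment", the proportional controller, and the function names, is invented purely to illustrate the control-loop structure, not their actual system.

```python
import random

# Hypothetical stand-ins for Rhoda's pipeline: observe() plays the role of
# the cameras, predict_next() the learned video model, act() the policy.

def observe(env_state):
    """Return a noisy observation of the environment (stub for vision)."""
    return env_state + random.gauss(0.0, 0.05)

def predict_next(observation, velocity):
    """Predict where the tracked object will be one control tick from now."""
    return observation + velocity

def act(prediction, target):
    """Return a corrective action that nudges the prediction toward the target."""
    return 0.5 * (target - prediction)

def control_loop(env_state, target, ticks=50):
    """Closed predict-then-act loop, one iteration per control tick
    (the post says the real system runs every few hundred milliseconds)."""
    velocity = 0.0
    for _ in range(ticks):
        obs = observe(env_state)          # perceive
        pred = predict_next(obs, velocity)  # predict how the scene evolves
        velocity = act(pred, target)      # translate prediction into action
        env_state += velocity             # environment responds
    return env_state

random.seed(0)
final = control_loop(env_state=0.0, target=1.0)
```

The key property the quote emphasizes is that prediction and action are interleaved continuously, so the loop keeps correcting itself even when individual observations are noisy.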
Robots learning skills through intention-based control suggests a shift from scripted actions to adaptive behavior. At Robotic Crew, we follow advances that bring autonomy closer to real decision-making. How important is intention for next-gen robotics? https://lnkd.in/dKJN9z_w
Check out the latest episode of The Industry Forward podcast, where Dale Tutt and Rahul Garg talk about the growing momentum behind humanoid robots and ask: why now? What are humanoid robots, and what technologies are coming together to make them more than just science fiction? https://lnkd.in/gRZte7zG
Industry Strategy Leader @ Siemens, Aerospace Executive, Engineering and Program Leadership | Driving Growth with Digital Solutions
One of my favorite books growing up was "I, Robot" by Isaac Asimov. If you're not familiar with it, it was a collection of nine short stories that imagined the development of positronic robots that had humanoid form and relied on artificial intelligence to learn. It also dealt with the moral implications of the technology, famously introducing the Three Laws of Robotics that governed the development, and actions, of positronic robots.

I've been thinking often about those stories now that we have seen an increase in the number of companies working on humanoid robots, especially those that were shown at the Consumer Electronics Show in Jan 2026. Recently, I have been sitting down with a number of experts on humanoid robots for an upcoming series on The Industry Forward podcast and have had a number of fun and informative discussions. What are some of my takeaways so far?

Why are manufacturing companies interested in humanoid robots? As they seek to add automation, humanoid robots can provide a unique solution, especially in existing factories and production lines. In an environment where the products and production processes were designed for assembly by humans, humanoid robots can be deployed quickly without redesigning the product or process for automation. That's a game changer for companies.

The concept of humanoid robots has been around for decades. We all recall watching Rosie on "The Jetsons". But the technology has finally come of age to support their development: miniaturized electronics, more powerful semiconductors, software-defined automation and AI, plus new battery technology that gives these robots greater mobility and endurance.

As companies seek to implement this new automation, they need to understand that this is not just a technology problem. It's about people, technology, processes, safety and change management. Companies need to take a holistic approach to managing how humans and humanoids can coexist on the factory floor, and how the digital twin and simulations can support the new technology and systems.

Humanoids aren’t a silver bullet, but they’re clearly moving from demos to real production use. And the factories that start thinking about that future now and building their digital foundation will be the ones best positioned to take advantage of it.

Watch for more posts in the coming weeks announcing the release of various podcast episodes. Or better yet, follow "The Industry Forward" podcast on your favorite podcast channel! Here's a link to our podcast page: https://lnkd.in/gCNcFRMy This will be a fun discussion - I look forward to hearing your thoughts in the comments!
Physical AI Hardware Robots

Since late last year, I have been learning about and heavily invested in robots. It's the new shift, but most people don't see it yet; only builders (and maybe their close circles), not even investors, can properly explain how physical AI will change how the world is designed. If robots are fully deployed, I'd say we are truly entering a new age, yet what we still mostly see are well-tailored, edited videos of robotic actions.

🔺 So what is the challenge?
The problem isn't hardware, and it isn't the printer. It's data. Software AI is different: its data is available and structured, quality data can be gathered from almost anywhere, and the model learns and improves from it. The internet is its training ground. Physical AI is a very different case. To get a robot up and running, you have to get your hands dirty, more like trial and error in the real world.

🔺 Is there available data for robots?
Yes, there are tailored datasets for robots, and a lot of startups are beginning to gather them, but autonomy still seems far off; the gap is still wide. To the early contributors to robotics: we know it may take 5, 10 or 20 years to get physical AI to where software AI is today. This won't happen overnight. Real life doesn't scale easily. Software learns from the internet; robots learn from real life.

In my next post, I'll talk about what my humble builder friend Roberto De la Cruz González is building with Nextis.
Will embodied AI create robotic coworkers?
June 30, 2025 | Article

A pragmatic look at what general-purpose robots can—and can’t yet—do in the workplace.

From C-3PO’s polished diplomacy to R2-D2’s battlefield heroics, robots have long captured our imagination. Today, what was once confined to science fiction is inching toward industrial reality. General-purpose robots, powered by increasingly capable embodied AI, are being tested in warehouses, factories, hospitals, and fields. And unlike previous generations of robots, they’re not just performing a single preprogrammed task but adapting to dynamic environments, learning new motions, and even following verbal commands.

Much of the current buzz centers on humanoids—robots that resemble people—whose recent exploits include running marathons and performing backflips. General-purpose robots also come in many other forms, however, including those that rely on four legs or wheels for movement (Exhibit 1). But as executives weigh automation road maps and workforce evolution, their focus should not be on whether their robots look human but on whether these robots can flex across tasks in environments designed for humans. This issue is both urgent and intriguing because general-purpose robots, including those in the multipurpose subcategory, may become part of the workplace team: trained to pack, pick, lift, inspect, move, and collaborate with people in real time.

Surge in investment and innovation

The sector has seen an explosion in activity. General-purpose robotics funding grew fivefold from 2022 to 2024, surpassing $1 billion in annual investment, with leading start-ups such as Figure AI, Skild AI, and Agility Robotics raising hundreds of millions of dollars. Patent filings have also surged, with a 40 percent CAGR in volume since 2022. Governments are taking notice, too. China has designated embodied AI a national priority, anchoring a $138 billion innovation fund. McKinsey Global Institute’s recent research report, The next big arenas of competition, identifies embodied AI and robotics as one of five emerging frontiers that are shaping future global productivity and digital infrastructure.

AI foundation models as robotics brainpower

Just as large language models unlocked natural conversation for chatbots, vision-language-action (VLA) foundation models enable robots to interpret visual cues, follow spoken instructions, and execute complex sequences. These foundation models support key robotic functions, including perception, reasoning, and decision-making. When paired with multimodal sensors—those that can ingest and act on multiple inputs, such as touch and force—they create systems that can learn by observing humans, without being manually programmed step by step.
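The perception, reasoning, and action stages the article attributes to VLA models can be illustrated with a deliberately tiny toy. Real VLA models are large neural networks; the class, method names, and keyword-matching logic below are invented here purely to show the shape of the pipeline, a command plus an image grounding to a target, then expanding into a motion sequence.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    image_labels: list   # stand-in for what a vision encoder detected
    instruction: str     # the spoken or typed command

class ToyVLAPolicy:
    """Toy (vision, language) -> action-sequence mapping; illustrative only."""
    SKILLS = {
        "pick": ["approach", "grasp", "lift"],
        "place": ["move", "lower", "release"],
    }

    def perceive(self, obs):
        # Perception: ground the instruction against what is visible.
        words = obs.instruction.lower().split()
        verb = next((w for w in words if w in self.SKILLS), None)
        target = next((w for w in words if w in obs.image_labels), None)
        return verb, target

    def decide(self, verb, target):
        # Reasoning/decision-making: expand a verb into a motion sequence.
        if verb is None or target is None:
            return []   # instruction could not be grounded in the scene
        return [f"{step}({target})" for step in self.SKILLS[verb]]

policy = ToyVLAPolicy()
obs = Observation(image_labels=["bolt", "panel"], instruction="pick the bolt")
verb, target = policy.perceive(obs)
plan = policy.decide(verb, target)
# plan == ["approach(bolt)", "grasp(bolt)", "lift(bolt)"]
```

The point of the structure, not the keyword matching, is that one model maps multimodal input straight to an executable sequence instead of requiring a hand-programmed routine per task.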
Modern AI and robotics can combine: The KAIST Humanoid v0.7

The Korea Advanced Institute of Science and Technology (KAIST) developed the KAIST Humanoid v0.7 as an advanced research robot that demonstrates significant progress in both humanoid robotics and artificial intelligence. The robot is a locomotion research platform built to develop human-like, dynamic movement; it is not designed to execute complicated object-handling tasks.

The v0.7's most striking characteristic is its combination of exceptional agility and balance. It can walk and run at speeds up to 12 km/h, execute dance moves like the moonwalk, and play soccer by recognizing and kicking a ball. These abilities show human-like coordination and stability, which become especially evident during difficult tasks and motion over uneven surfaces.

The primary control system combines Physical AI with deep reinforcement learning. Physical AI lets the robot process real-world data and move through the physical environment while using sensors to track its balance, position, and terrain, so it can immediately adapt to external disturbances and maintain balance during advanced movements.

The development of the v0.7 also uses a simulation-to-reality (Sim-to-Real) training approach: the robot first masters movements in a virtual setup and later applies the acquired skills in the real world, drastically reducing the time needed for training and enhancing performance. Moreover, KAIST engineers have developed many of the robot's hardware components, such as actuators and control systems, in-house, which facilitates close integration of software and mechanics.

Most critically, the v0.7 isn't a commercial product but a research platform intended to push forward robotic mobility. It draws from KAIST's extensive experience with humanoid robots, including the famous HUBO series. The knowledge gained from this robot will likely lead to new robots for fields like disaster response, industrial automation, and human-robot collaboration.

In short, the KAIST Humanoid v0.7 is an example of how contemporary AI and robotics combined can create machines that move almost as humans do. Its accomplishments point to the growing capability of humanoid robots to work successfully in real-world environments and carry out complex physical tasks.
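The Sim-to-Real idea mentioned above is often paired with domain randomization: train against many randomized versions of the simulated physics so the learned policy transfers to the real, unseen physics. This is a generic technique, not KAIST's published training code, and the 1-D "robot" below (a single gain learned against randomized friction) is a made-up minimal example of that pattern.

```python
import random

def simulate(gain, friction):
    """Toy rollout: squared error between commanded and achieved motion."""
    achieved = gain * (1.0 - friction)
    return (1.0 - achieved) ** 2

def train_in_sim(episodes=2000, lr=0.05):
    """Learn a control gain against randomized simulated friction."""
    gain = 1.0
    for _ in range(episodes):
        friction = random.uniform(0.1, 0.4)   # domain randomization
        # Finite-difference gradient step on the rollout error.
        eps = 1e-3
        grad = (simulate(gain + eps, friction)
                - simulate(gain - eps, friction)) / (2 * eps)
        gain -= lr * grad
    return gain

random.seed(1)
gain = train_in_sim()
# "Deploy" on a real-world friction value never seen during training.
real_error = simulate(gain, friction=0.25)
```

Because training saw frictions across a whole range rather than one fixed value, the learned gain lands near the best compromise and performs well on the held-out "real" friction, which is the intuition behind training in simulation first.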