Robotics Advancements at Tsinghua University


Summary

Robotics advancements at Tsinghua University highlight innovative developments that make robots smarter, faster, and more adaptable, such as combining vision and touch for better learning and building humanoid robots that can move with agility and recover from falls. These breakthroughs bring us closer to robots that can handle complex, real-world tasks just like humans.

  • Explore new capabilities: Pay attention to how robots are now using touch sensors and vision together, allowing them to perform tasks that require delicate handling and quick learning.
  • Follow market trends: Track the progress of humanoid robots that are achieving record speeds and tackling challenging environments, as these achievements signal rapid growth in the robotics industry.
  • Consider practical impacts: Think about how robots that can stand up from any position or adapt to rough terrain may soon be used in everyday settings, from healthcare to disaster recovery.
Summarized by AI based on LinkedIn member posts
  • View profile for Arhaan Aggarwal

    Sextuple Major @ UC Berkeley '26 || ZFellow || Serial Entrepreneur || Researcher

    11,411 followers

    Just read one of the more exciting papers I've seen in robotics and AI: ViTaMIn, a Visuo-Tactile Manipulation Interface from Tsinghua and UC Berkeley (with Prof. Pieter Abbeel on the team)! Instead of relying on expensive, rigid, hard-to-scale robot teleoperation setups, ViTaMIn introduces a portable, robot-free gripper that captures both vision and touch data, making it drastically easier (and cheaper!) to teach robots contact-rich tasks.

    Why this matters:
    - Tactile feedback is essential for tasks like inserting a sponge into a cup or adjusting a test tube mid-air.
    - Most data collection systems use vision-only input and struggle with real-world contact scenarios.
    - ViTaMIn not only collects high-resolution tactile data but also pre-trains a multimodal representation (vision + touch) to improve data efficiency and generalization.

    Result? Robots trained with ViTaMIn outperformed vision-only baselines by up to 100% on some tasks and learned faster with less data. From orange placement to scissor hanging, sponge insertion, and even dual-arm knife pulling, ViTaMIn showed robust, scalable performance across 7 diverse manipulation tasks.

    Massive respect to the authors for pushing the boundaries of scalable and intuitive robot learning. This could pave the way for a new generation of touch-aware robotic systems that learn like humans: through both eyes and hands.

    Full paper: https://lnkd.in/gHjuz9TV
    Project page: https://lnkd.in/g4pmSuF5

    #robotics #AI #manipulation #tactileAI #deepRL #berkeleyAI #ViTaMIn #PieterAbbeel #UCberkeley #Tsinghua
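The core idea in the post, pre-training a shared representation over both vision and touch, can be sketched in a few lines. This is a toy NumPy illustration, not the paper's architecture: the encoders, dimensions, and fusion-by-concatenation scheme here are all assumptions standing in for the real learned networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, not taken from the ViTaMIn paper.
VISION_DIM, TACTILE_DIM, EMBED_DIM = 64, 16, 32
W_v = rng.standard_normal((EMBED_DIM, VISION_DIM)) * 0.1   # toy vision encoder
W_t = rng.standard_normal((EMBED_DIM, TACTILE_DIM)) * 0.1  # toy tactile encoder
W_fuse = rng.standard_normal((EMBED_DIM, 2 * EMBED_DIM)) * 0.1

def encode(x, W):
    """Stand-in for a learned encoder: linear projection + ReLU."""
    return np.maximum(W @ x, 0.0)

def multimodal_embedding(vision, tactile):
    """Fuse the two modalities by embedding each one, concatenating,
    and projecting back to a single shared representation."""
    z_v = encode(vision, W_v)
    z_t = encode(tactile, W_t)
    return W_fuse @ np.concatenate([z_v, z_t])

z = multimodal_embedding(rng.standard_normal(VISION_DIM),
                         rng.standard_normal(TACTILE_DIM))
print(z.shape)  # (32,)
```

In a real system the encoders would be trained (e.g. with a reconstruction or contrastive objective) so that the fused vector is useful for downstream policy learning; the point of the sketch is only that one vector carries both eyes-and-hands signals.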

  • View profile for Mike Kalil

    10M+ Annual Reach | Covering the Rise of the Machines Without an Agenda | mikekalil.com

    4,553 followers

    In Beijing, the successor to the world's fastest humanoid robot just emerged. Robot Era, spun out of Tsinghua University in 2023, is pushing its suite of AI-powered humanoids to market with significant financial backing. The startup recently secured around 500 million yuan, or nearly $70 million, in a Series A round led by CDH Investments and Haier Capital. Its total publicized funding for 2024 and 2025 exceeds 900 million yuan, or $125 million.

    Robot Era gained significant attention in fall 2024 when its Star 1 biped achieved a top speed of nearly 13 km/h (8 mph) during a 34-minute run across the Gobi Desert. The former fastest bipedal humanoid, the H1 by Hangzhou-based Unitree Robotics, set the previous record earlier in 2024, topping out at just under 12 km/h (7.5 mph). Robot Era's experiment also showed that robotic humanoids run faster when wearing sneakers than barefoot.

    Another of its two-legged humanoids, the X-Bot L, became the first humanoid robot to successfully walk the Great Wall of China. The way the robot walked over weathered stone, climbed uneven stairs, and performed tai chi moves would have been seen as shocking just a year ago. But today the demonstration would barely be noteworthy, showing how fast things are progressing amid the international race toward fake humans. The firm's latest humanoid, the L7, builds on the Star 1's bipedal agility, adding advanced upper-limb dexterity.

    #humanoidrobots #china #robotics #innovation #ai

  • View profile for Akshet Patel 🤖

    Robotics Engineer | Creator

    53,265 followers

    What if a humanoid robot could stand up from any posture (lying on a sofa, seated at a table, or even prone outdoors) with smooth, reliable control? [⚡Join 2500+ Robotics enthusiasts - https://lnkd.in/dYxB9iCh]

    A team from Shanghai AI Laboratory, The Chinese University of Hong Kong, The University of Hong Kong, and Tsinghua University (Tao Huang, Junli Ren, Huayi Wang, Zirui Wang, Qingwei Ben, Muning Wen, Xiao Chen, Jianan Li, and Jiangmiao Pang) presents HoST, a reinforcement-learning framework that enables posture-adaptive standing-up across diverse environments.

    They train HoST entirely from scratch in simulation, using a multi-critic architecture, curriculum-based terrain exposure, and motion-smoothing constraints to prevent jerky or violent movements. The learned policy transfers directly to a Unitree G1 robot, allowing it to stand up smoothly and robustly from various starting positions, even outdoors on slopes, while carrying payloads, or after external pushes.

    This work tackles a fundamental capability, fall recovery and posture adaptation, moving beyond scripted trajectories to real-world-ready skills. If robots can now reliably get themselves upright from anywhere, what next-level capabilities should humanoids master?

    Paper: https://lnkd.in/esWDCXCb
    Project Page: https://lnkd.in/esfZy-Wm

    #HumanoidRobotics #ReinforcementLearning #Sim2Real #LeggedControl #RSS2025
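The "multi-critic" idea mentioned in the post can be illustrated with a minimal sketch: instead of one value estimate over a single summed reward, each reward group (e.g. task progress vs. motion smoothness) gets its own normalized signal before the groups are combined. The reward names, weights, and normalization below are assumptions for illustration, not HoST's actual reward design.

```python
import numpy as np

def grouped_advantage(rewards_by_group, weights):
    """Multi-critic flavor: normalize each reward group separately
    (as if each had its own critic), then combine with fixed weights.
    This keeps a large penalty term from drowning out a small task term."""
    total = np.zeros(len(next(iter(rewards_by_group.values()))))
    for name, r in rewards_by_group.items():
        r = np.asarray(r, dtype=float)
        adv = (r - r.mean()) / (r.std() + 1e-8)  # per-group normalization
        total += weights[name] * adv
    return total

# Hypothetical reward groups for a standing-up task (illustrative values).
batch = {
    "task":       [0.2, 0.8, 0.5, 0.9],      # standing-up progress
    "smoothness": [-0.1, -0.4, -0.2, -0.3],  # penalize jerky motion
}
adv = grouped_advantage(batch, {"task": 1.0, "smoothness": 0.5})
print(adv.round(2))
```

In practice the per-group signals would come from separately trained value networks over simulated rollouts, and the curriculum would gradually harden the terrain; the sketch only shows why separating the groups makes their scales comparable.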
