#Robot-assisted #arthroscopic #surgery offers new possibilities for precision and control. Our latest publication introduces a two-arm robotic system designed to enhance these procedures. One robot assists the surgeon by holding the arthroscope (the camera) steadily with the help of an impedance controller and a gravity iterative learning (Git) scheme, ensuring stable visual feedback and allowing easy adjustments. The other robot, equipped with virtual fixtures and haptic feedback, provides the surgeon with intuitive guidance and precise control over the surgical tool. This improves surgeon dexterity and reduces cognitive load, enhancing the overall safety and effectiveness of arthroscopic surgeries. Discover more about our work in "Robotic Assistance and Haptic Feedback in Arthroscopic Procedures: Design and Preliminary Evaluation of a Two-Arm System," published in the Journal of Medical Robotics Research (#JMRR), 2024, by Teng Li, Armin Badre, and Mahdi Tavakoli. https://lnkd.in/gUuM7tR6 https://lnkd.in/gkBdMGfY #SurgicalRobotics #Arthroscopy #Haptics #VirtualFixtures #TBSgroupUofA
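The camera-holding arm described above pairs an impedance controller with an iteratively learned gravity term. As a rough one-axis illustration of that general idea (not the paper's actual controller — the gains, names, and update rule here are all hypothetical simplifications): the impedance law commands a force from pose and velocity error, and an iterative update absorbs the steady residual force into the gravity estimate so the arm holds the arthroscope without drifting.

```python
import numpy as np

def impedance_with_gravity_learning(x, xd, v, vd, g_hat, f_meas,
                                    K=200.0, D=20.0, alpha=0.1):
    """One control step: Cartesian impedance law plus an iteratively
    learned gravity-compensation term (hypothetical simplification).

    x, v   : current position and velocity
    xd, vd : desired position and velocity
    g_hat  : current gravity-compensation estimate
    f_meas : measured interaction/load force at the tool
    """
    # Impedance law: spring-damper tracking plus feedforward gravity term.
    f_cmd = K * (xd - x) + D * (vd - v) + g_hat
    # Iterative learning update: fold the steady residual between what we
    # measure and what we command into the gravity estimate.
    g_hat_next = g_hat + alpha * (f_meas - f_cmd)
    return f_cmd, g_hat_next
```

With the arm holding a fixed pose, the estimate converges geometrically toward the constant load, after which the surgeon feels a nearly weightless scope — the intuition behind combining impedance control with gravity learning.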
Haptic Feedback in Robotics
Explore top LinkedIn content from expert professionals.
Summary
Haptic feedback in robotics lets robots simulate the sense of touch, allowing users to feel pressure, vibration, or texture in real time as they interact with robotic systems. This technology is making surgical training, virtual reality, and collaborative work more immersive and intuitive by bridging the gap between physical sensation and digital environments.
- Integrate tactile solutions: Look for haptic devices that can add touch sensations to training or remote operations, giving users a more natural and responsive experience.
- Refine skill acquisition: Use haptic-enabled robotics to speed up learning and improve precision, especially in surgical practice or complex manual tasks.
- Expand collaborative uses: Consider robot-mediated haptic feedback for tasks where coordinated movement or teamwork is critical, such as music, rehabilitation, or joint workspace activities.
HAPTICS TUESDAY 🖐️🦷 A New Way of Training Surgeons in Advanced Procedures

Surgical training is undergoing a quiet transformation. Not in the operating room, but in virtual environments enhanced with touch, feedback, and immersion. Extended Reality (XR), including VR, AR, and MR, combined with haptic feedback is redefining how we train the next generation of surgeons.

🎯 Why This Matters
Traditional surgical training depends on time, case availability, and patient exposure. XR + haptics changes that. It allows surgeons to:
✔ Practice complex procedures repeatedly
✔ Receive real-time, objective feedback
✔ Train in a zero-risk environment

📊 What the Evidence Shows
Research across multiple specialties demonstrates:
• 18–43% faster procedural times after VR training (Mao 2021)
• Significant improvements in accuracy, efficiency, and error reduction (Portelli 2020)
• Transfer effectiveness of 7–42% from simulation to real OR performance (Gallagher 2013)
• Up to 42% improvement in procedural accuracy and a 45% reduction in errors with AI-driven VR platforms (Kumar 2025)

🖐️ The Role of Haptics
Haptic feedback is the missing piece. It allows trainees to feel tissue resistance, force, and instrument interaction, bringing realism to simulation. Evidence shows:
• Faster learning curves for complex procedures
• Improved precision and force control
• Strongest impact during early skill acquisition

🧠 Beyond VR: The XR Ecosystem
Each technology plays a different role:
🕶 VR – immersive training environments
👁 AR – overlays anatomy during real procedures
🔬 MR – blends physical and digital worlds
Together, they create a continuous learning loop from simulation → guidance → clinical execution.

🏥 Real-World Impact
XR training is already showing:
✔ Improved OR performance
✔ Reduced operative time
✔ Better surgical fluency
✔ Enhanced confidence in trainees
From orthopedics and neurosurgery to laparoscopy, the results are consistent.

⚖️ The Reality Check
Despite the promise, challenges remain:
• High cost and infrastructure demands
• Limited validation in some specialties
• Need for curriculum integration
• Time constraints for trainees
XR is not replacing traditional training, but augmenting it.

🔮 The Bigger Question
If surgeons can now practice endlessly, fail safely, and improve objectively… are we ready to redesign how we define competency in surgical education?

📚 References: Co 2023; Toni 2024; Mao 2021; Gallagher 2013; Palter 2014; Portelli 2020; Kovoor 2021; Zhang 2023; Rangarajan 2020; Kumar 2025

Dr. Sompop Bencharit is the President of the Haptics & Artificial Intelligence in Dental Education Network (HAIDENer). He is also a KOL for FifthIngenium and UNIDRAW. #virtualreality #surgery #education #ar #vr #xr #haptics #simulation
-
Feel the Invisible… Most haptic gear is still stuck in the buzz-and-vibrate phase. Big gloves. Full suits. Nothing that really feels right. Turns out, a team at the University of Illinois found a better way with a fingertip patch. It weighs just 0.3 grams. Inside are nine micro actuators that press and stretch like tiny muscles to simulate pressure, vibration, and texture. Each one’s built from a soft elastomer and a spring. When it expands, it pushes back. That’s how you get real feedback without adding bulk. The patch also senses touch. It can send and receive signals in real time. All from something the size of your fingertip. Still a lab project, but it makes sense. It’s light, flexible, and built to sit directly on the skin. If it scales, it could mean sensation in prosthetics, feedback in surgery, or touch in small robots. A small patch with a lot of reach. Where else could this kind of tech make a difference? Daily #electronics insights from Asia—follow me, Keesjan, and never miss a post by ringing my 🔔. #technology #innovation
-
Introducing InflatableBots, shape-changing inflatable #robots for large-scale encountered-type haptics in #VR. Unlike traditional inflatable shape displays, which are immobile and limited in interaction area, this approach combines mobile robots with fan-based inflatable structures. This enables safe, scalable, and deployable haptic interactions on a large scale. The authors developed three coordinated inflatable mobile robots, each consisting of an omni-directional mobile base and a reel-based inflatable structure. Each robot can rapidly change its height and position simultaneously (horizontal: 58.5 cm/sec; vertical: 10.4 cm/sec, from 40 cm to 200 cm), allowing quick and dynamic haptic rendering of multiple touch points to simulate various body-scale objects and surfaces in real time across large spaces (3.5 m x 2.5 m). A user study (N = 12) confirmed the system's unique advantages in safety, deployability, and large-scale interactability, which significantly improve realism in VR experiences. #research #paper: https://lnkd.in/dKxW23tY #authors: Ryota Gomi, Ryo Suzuki, Kazuki Takashima, Kazuyuki Fujita, Yoshifumi Kitamura University of Calgary #robotics #innovation #technology #future
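The quoted speeds make it easy to estimate how long one robot needs to reposition for the next touch point. Assuming horizontal and vertical motion run concurrently (my assumption, not stated in the post), the slower axis dominates:

```python
def time_to_reach(dx_cm, dz_cm, v_h_cm_s=58.5, v_v_cm_s=10.4):
    """Worst-case time for one robot to move dx_cm horizontally and
    dz_cm vertically, assuming both axes move at once (an assumption).
    Speeds default to the values reported in the post."""
    return max(dx_cm / v_h_cm_s, dz_cm / v_v_cm_s)

# Full 3.5 m horizontal traverse plus the full 40 cm -> 200 cm height
# change: the vertical axis dominates at roughly 15.4 s.
t = time_to_reach(350, 160)
```

This back-of-envelope number suggests why multiple coordinated robots are used: while one robot renders the current touch point, the others can pre-position for the next.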
-
We are used to thinking that coordination depends on sight and sound. This Science Robotics paper, “Robot-mediated haptic feedback outperforms vision in violin duo coordination”, shows that touch can sometimes do more. The authors tested 20 violin duos under four sensory conditions: audio only, audio plus vision, audio plus haptics, and the full multisensory combination. The key result is very clear: haptic feedback improved spatiotemporal coordination and musical alignment more than the standard audio-visual condition, and the combination of audio, vision, and haptics produced the strongest overall performance. What makes this work valuable is the way haptics enters coordination. Vision is explicit. Haptics is embedded in movement itself. It shapes timing, alignment, and mutual adaptation from inside the sensorimotor loop. That changes the perspective on wearable robots. A wearable robot is not only a rehabilitation tool, an assistive interface, or a teleoperation device. It can also become a channel for human-human coordination. Music is just the demonstration here. The implications reach training, collaborative tasks, and any context where two people need to move together with precision. 👇 Link in the first comment. Where do you see robot-mediated haptics making the biggest impact first: rehabilitation, skill training, or collaborative work? #sciencerobotics #haptics #wearablerobotics #exoskeletons #motorcontrol #sensorimotor #musicperformance #robotics #rehabilitationengineering #neuroengineering #humanmachineinterface #motorlearning #biomedicalengineering #humanhumaninteraction
-
The Third Eye: How Sanctuary AI Demonstrates the Value of Force Sensing in Robotics. Robots rely on vision to understand their environment, but what if that's not enough? Humans don't just see; we feel. Touch gives us crucial information that vision alone can't provide, yet many robots are still blind to the power of force sensing. Sanctuary AI is proving how vital tactile and force feedback are to robotic intelligence. By integrating advanced touch sensors into its Phoenix humanoid, the company enables robots to interact with objects more effectively, even when vision is obstructed. This unlocks capabilities like blind picking, slip detection, and controlled force application, making general-purpose robots more useful in real-world applications. Force sensing doesn't just improve manipulation; it enhances learning. Sanctuary's robots are already benefiting from richer behavioral data, allowing for faster adaptation to tasks. Their hydraulically actuated hands with fine-resolution tactile sensors are an example of how high-precision force feedback is advancing robotics. At ATI Industrial Automation, we support force sensing for robotics, including humanoids, to help robots learn, adapt, and deliver better outcomes in automation. By combining vision with force sensing, robots can achieve human-like dexterity, making them more reliable, efficient, and capable across industries. #robotics #humanoids
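Slip detection, mentioned above, is often explained as watching for fast changes in tangential (shear) force at the fingertip. As a minimal sketch of that common heuristic — not Sanctuary's actual algorithm, and with a made-up threshold — a detector can flag any sample where the shear-force rate of change exceeds a limit:

```python
import numpy as np

def detect_slip(shear_force, dt=0.001, rate_threshold=50.0):
    """Crude slip indicator: return True if the tangential (shear)
    force sampled every dt seconds changes faster than rate_threshold
    N/s at any point. Real tactile slip detectors are more elaborate."""
    rates = np.abs(np.diff(shear_force)) / dt  # N/s between samples
    return bool(np.any(rates > rate_threshold))
```

A steady grip produces near-zero rates and no alarm; an object starting to slide causes a sudden shear-force drop that trips the threshold, letting the controller tighten the grasp before vision would ever notice.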
-
I am pleased to highlight the recent achievements of Dr SHI Ge, who completed his PhD at UCL Mechanical Engineering and is now a researcher at the Commonwealth Scientific and Industrial Research Organisation (CSIRO's Data61). SHI Ge's latest publication in the IEEE Transactions on Haptics (#ToH) presents a novel multi-cavity haptic feedback system. This system utilises a purely hydraulic approach that detects physical touch and delivers directional feedback through a fingertip sensor, paving the way for enhanced tactile interaction capabilities. Read the full article here: https://lnkd.in/enVR2CG5. This research builds upon his prior work on fluidic haptic interfaces for mechano-tactile feedback, previously published in the IEEE Transactions on Haptics (https://lnkd.in/edMDUntD), and modelled using finite deformation theory, which was featured in the #SoftMatter journal by The Royal Society of Chemistry. Read the full article here: https://lnkd.in/eNN87kfz In collaboration with Dr Jialei Shi, who graduated from UCL Mechanical Engineering earlier in the year and is now with the Hamlyn Centre for Robotic Surgery at Imperial College London, they developed a flexible, soft robotic handheld laparoscopic device driven by this innovative multi-cavity touch interface. This work has been published in IEEE Transactions on Medical Robotics and Bionics (#TMRB): https://lnkd.in/eFVHB6Cd. This impactful research has been supported by UCL Grand Challenges, the EPSRC (grant number: EP/V01062X/1), and UCL-Indian Institute of Technology, Delhi Seed Funding 2020-21. #Haptics #Robotics #Research #Innovation #UCL #MedicalRobotics #IEEE
-
Here is an example showing how our soft 3D-printed haptic feedback device can be used to accurately direct a user based on force feedback. In this example, a user (i.e., the receiver) can accurately distinguish different load conditions from another user (i.e., the transmitter) based on the force feedback from the haptic device. Note that the two users cannot see each other. This device can be integrated into rehabilitation robots and robotic prostheses, as well as augmented, virtual, and mixed reality (AR/VR/MR) systems, to induce various types of bio-mimicked feedback. Full Paper | Open-Access | MDPI Biomimetics | A 3D-Printed Soft Haptic Device with Built-in Force Sensing Delivering Bio-Mimicked Feedback https://lnkd.in/ePJAdaMd University of Wollongong, University of Wollongong in Dubai, Lebanese American University Dilpreet S., Mert Aydin, Rahim Mutlu, Emre Sariyildiz #Haptics #Robotics #3DPrinting #Research
-
ASCE i3CE 2025 Updates 3 and 4

Yuming Zhang shared two impactful studies on immersive VR-based training systems for worker-robotics interaction in construction, supported by NSF Awards #2402008 and #2410255.

Investigating the Role of Haptic Feedback in Enhancing Immersive Exoskeleton Training in Construction
A compelling presentation was given by Yuming, introducing a VR-based training platform that incorporates haptic feedback to address the limitations of conventional visual and auditory guidance in worker-exoskeleton interaction. A simulated construction environment was developed to evaluate how tactile cues affect task performance. Results indicate that haptic feedback improves posture accuracy, reduces task duration, and enhances perceived usability, highlighting the role of multisensory interaction in promoting safer and more intuitive exoskeleton use.

Investigating the Impact of Virtual Reality Training on Cognitive Load in Worker–Unmanned Ground Vehicle Interactions
Yuming also shared insightful findings from a study on the impact of VR training on cognitive load during UGV operations. To address the cognitive demands of operating UGVs under dynamic construction conditions, a VR training platform was developed for interface-based UGV control. The study employed both subjective and physiological measures to assess cognitive load before and after training. Findings demonstrate that immersive VR training improves operational accuracy and reduces cognitive effort, supporting safer and more efficient human–robot collaboration.

Congratulations to Yuming for driving forward innovation in immersive training and construction automation. Further details from these studies will be published in the ASCE i3CE 2025 Proceedings.
-
1 of 2 startup highlights from Canadian Space Day in SF I had to share. Guillaume Falardeau presented how they're solving a real problem: how do you give someone refined, precise control of a robot in hypersensitive, high-stakes environments? Surgery. Manufacturing. Situations where millimeters matter and you can't afford a miss. Already in production, shipping hundreds of units with real use cases. They also announced the closing that day of another $11M+ funding round. Haply Robotics' Inverse3 haptic controller lets you feel what the robot feels. The level of sensitivity you get while controlling something remotely is incredible. This isn't a spec-sheet flex; it's built around what the operator actually needs to accomplish projects that would otherwise halt robots in production in challenging situations. #PhysicalAI #Robotics #Haptics #HaplyRobotics #canadianspaceday #SF