Subsea Robotics Development


Summary

Subsea robotics development involves designing and improving underwater robots, like uncrewed underwater vehicles (UUVs), to perform tasks such as exploration, navigation, and data collection in challenging marine environments. These robots rely on advanced AI, machine learning, and sensor technologies to operate where traditional methods, like GPS and cameras, are unreliable due to water conditions and low visibility.

  • Embrace AI solutions: Using machine learning and AI-based navigation models can help underwater robots adapt quickly and maintain accurate control even in rough or unpredictable ocean conditions.
  • Innovate sensor systems: Exploring new sensor technologies, including inertial measurement and velocity prediction, allows robots to navigate for long periods without visual data or expensive equipment.
  • Leverage simulation tools: High-performance simulators like OceanSim make it possible to test and refine underwater robot designs virtually, speeding up development and improving reliability before real-world deployment.
Summarized by AI based on LinkedIn member posts
  • Charles Durant

    Director Field Intelligence Element, National Security Sciences Directorate, Oak Ridge National Laboratory

    13,903 followers

    'Uncrewed underwater vehicles (UUVs) are underwater robots that operate without humans inside. Early use cases for the vehicles have included jobs like deep-sea exploration and the disabling of underwater mines. However, UUVs suffer from poor communication and navigation control because of water’s distorting effect. So researchers have begun to develop machine learning techniques that can help UUVs navigate better autonomously. Perhaps the biggest challenge the researchers are grappling with is the absence of GPS signals, which can’t penetrate beneath the water’s surface. Other types of navigational techniques that rely on cameras are also ineffective, because underwater cameras suffer from low visibility. ... In the study, which was published last month in the journal IEEE Access, researchers from Australia and France used a type of machine learning called deep reinforcement learning to teach UUVs to navigate more accurately under difficult conditions. In reinforcement learning, UUV models start by performing random actions, then observe the results of those actions and compare them to the goal—in this case, navigating as closely as possible to the target destination. Actions that lead to positive results are reinforced, while actions that lead to poor results are avoided.' https://lnkd.in/e8rEHH7U
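
The reinforcement-learning loop described in the post can be sketched with a toy tabular Q-learner: an agent on a one-dimensional line is rewarded for moving toward a target, so actions that reduce the distance are reinforced and the rest are avoided. This is an illustrative sketch only, not the deep RL controller from the IEEE Access study; the grid size, rewards, and hyperparameters here are all made up.

```python
import random

def train_navigator(size=10, target=9, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a 1-D line: actions are move left (-1) or right (+1)."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(size) for a in (-1, 1)}
    for _ in range(episodes):
        s = 0
        for _ in range(3 * size):
            # Explore randomly with probability eps, otherwise act greedily
            a = rng.choice((-1, 1)) if rng.random() < eps else max((-1, 1), key=lambda x: q[(s, x)])
            s2 = min(max(s + a, 0), size - 1)
            # Reward: closer to the target is better; reaching it is best
            r = 1.0 if s2 == target else -abs(target - s2) / size
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, -1)], q[(s2, 1)]) - q[(s, a)])
            s = s2
            if s == target:
                break
    return q

def greedy_path(q, start=0, target=9, size=10, max_steps=20):
    """Follow the learned policy greedily from start until the target (or step cap)."""
    s, path = start, [start]
    for _ in range(max_steps):
        a = max((-1, 1), key=lambda x: q[(s, x)])
        s = min(max(s + a, 0), size - 1)
        path.append(s)
        if s == target:
            break
    return path
```

After training, the greedy policy walks straight toward the target; the same reinforce-what-works principle is what the deep variant applies to continuous UUV control.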

  • Kostas Alexis

    Professor at Norwegian University of Science and Technology (NTNU)

    11,548 followers

    Underwater robots will either have to continue relying on expensive, typically bulky, and often power-hungry sensors such as optical gyroscopes or 3D sonars, or they will exploit novel approaches that, at the extreme, allow long-term navigation even in the absence of perceptual data and, more broadly, facilitate extremely low-light vision-driven odometry. To that end, we present "DeepVL," a novel method on Dynamics- and Inertial Measurements-based Deep Velocity Learning for Underwater Odometry. The work - led by Mohit Singh - presents a learned model to predict the robot-centric velocity of an underwater robot through dynamics-aware proprioception.

    The method exploits a recurrent neural network that takes inertial cues, motor commands, and battery voltage readings, alongside the hidden state of the previous time step, as inputs and outputs robust velocity estimates with their associated uncertainty. An ensemble of networks is utilized to enhance the velocity and uncertainty predictions. Fusing the network's outputs into an Extended Kalman Filter, alongside inertial predictions and barometer updates, the method enables long-term underwater odometry without further exteroception. Furthermore, when integrated into visual-inertial odometry, the method enhances estimation resilience when dealing with an order of magnitude fewer total features tracked (as few as 1) compared to conventional visual-inertial systems.

    Tested onboard an underwater robot deployed both in a laboratory pool and the Trondheim Fjord, the method takes less than 5 ms for inference on either the CPU or the GPU of an NVIDIA Orin AGX, and demonstrates less than 4% relative position error in novel trajectories during complete visual blackout, and approximately 2% relative error when a maximum of 2 visual features from a monocular camera are available. https://lnkd.in/dJfREQwP #robotics #autonomy #underwater #vision #odometry #slam #navigation #ntnu #maritime #rcn
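
The fusion structure described above, a learned velocity estimate and its predicted uncertainty folded into a Kalman filter alongside inertial predictions, can be illustrated with a 1-D toy. This is not DeepVL's actual EKF: `kf_step`, the state layout, and all noise parameters are invented here, and the "network output" is just a pair of numbers (`v_meas`, `v_var`) standing in for the ensemble's velocity and uncertainty prediction.

```python
import numpy as np

def kf_step(x, P, accel, dt, v_meas, v_var, q=1e-3):
    """One predict/update cycle of a 1-D Kalman filter.

    x = [position, velocity]; v_meas and v_var play the role of the
    learned velocity estimate and its predicted uncertainty."""
    # Inertial prediction: integrate acceleration into velocity and position
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt ** 2, dt])
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)
    # Velocity update, weighted by the estimate's reported variance:
    # a confident network output (small v_var) pulls the state harder
    H = np.array([[0.0, 1.0]])
    S = (H @ P @ H.T)[0, 0] + v_var
    K = (P @ H.T / S).ravel()  # Kalman gain, shape (2,)
    x = x + K * (v_meas - x[1])
    P = (np.eye(2) - np.outer(K, H.ravel())) @ P
    return x, P
```

Because the update observes only velocity, position is obtained purely by dead-reckoning the fused velocity, which is why bounding the velocity error (here, via the learned uncertainty) is what keeps long-term odometry drift low.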

  • Pedro Guillen

    COO @ Centrepolis Accelerator | Helping HardTech & Dual-Use Startups Commercialize Faster | Innovation Ecosystem Builder

    4,367 followers

    🤖 AI Navigation for Underwater Robots! 🚤 Researchers developed a biologically-inspired AI system to improve control of unmanned underwater vehicles (UUVs) in rough seas. It uses two memory buffers - one focusing on recent experiences and one on positive rewards. This lets UUVs quickly adapt to varying conditions. In simulations, the new approach stabilized UUV maneuvering twice as fast as standard methods! AI continues to push autonomy further, enabling robots to tackle unpredictable real-world environments! 🌊 Stay tuned! More ocean exploration innovations ahead! #AI #robotics #oceans

    Authors and affiliations:
    Thomas Chaffre, PhD (Flinders University, Adelaide, SA, Australia; ARC Training Centre for Biofilm Research and Innovation)
    Paulo E. Santos (College of Science and Engineering, Flinders University; CNRS International, NTU, Thales Research Alliance)
    Gilles Le Chenadec (ENSTA Bretagne; Researcher at the Naval Group Research Center, Ollioules, France)
    Karl Sammut (College of Science and Engineering, Flinders University; Crossing Centre for Defence Engineering Research and Training; Theme Leader for the Maritime Autonomy Group)
    Clément Benoit (College of Science and Engineering, Flinders University; ENSTA Bretagne)

    Published in IEEE Access, DOI: 10.1109/ACCESS.2023.3329136 https://lnkd.in/g3_fjVE2 https://lnkd.in/gr662MdX
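
The dual-buffer idea the post describes can be sketched as follows: one buffer keeps the most recent transitions, a second retains only transitions whose reward exceeded a threshold, and training minibatches draw from both. The class name, buffer sizes, threshold, and half-and-half sampling split are assumptions for illustration, not the authors' exact design.

```python
import random
from collections import deque

class DualReplayBuffer:
    """Two experience buffers: recent transitions, and positive-reward transitions."""

    def __init__(self, recent_size=1000, positive_size=1000, reward_threshold=0.0):
        self.recent = deque(maxlen=recent_size)      # sliding window of latest experience
        self.positive = deque(maxlen=positive_size)  # long-lived store of rewarding experience
        self.threshold = reward_threshold

    def add(self, transition, reward):
        self.recent.append(transition)
        if reward > self.threshold:
            self.positive.append(transition)

    def sample(self, batch_size, rng=random):
        # Draw roughly half the batch from each buffer when both have data,
        # so learning tracks current conditions without forgetting what worked
        half = batch_size // 2
        batch = rng.sample(list(self.recent), min(half, len(self.recent)))
        remainder = batch_size - len(batch)
        pool = self.positive if self.positive else self.recent
        batch += rng.sample(list(pool), min(remainder, len(pool)))
        return batch
```

Mixing a fast-turnover buffer with a reward-filtered one is one plausible way to get the "adapt quickly without discarding good behavior" effect claimed in the post.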

  • Justin Nerdrum

    B2G Growth Strategist | Daily Awards & Strategy | USMC Veteran

    19,978 followers

    Anduril delivers Ghost Shark a year early. A$1.7B for robot submarines. Traditional defense contractors just got disrupted underwater.

    September 2025. Australia buys dozens of AI-driven autonomous subs. From concept to production in under 4 years. The Collins-class replacement timeline was 20+ years.

    The specs:
    • 6,000m depth (deeper than any crewed sub)
    • 10-day autonomous endurance
    • Modular payload bays for ISR/strike/EW
    • Lattice AI for swarm operations
    • No pressure hull = cheaper, faster, stealthier

    China's building 6 subs annually. We build one every 3 years. Ghost Shark changes the math: build 200 robot subs for the price of one Virginia-class. 40+ Australian firms in the supply chain. Kraken Robotics provides sonar and batteries. A local robotics factory scales to dozens annually. Sovereign capability without decade-long shipyard bottlenecks.

    Three shifts emerge:
    1. Software-defined beats steel hulls
    2. Modular design enables rapid iteration
    3. AI autonomy solves crew shortages

    The U.S. Navy ordered one prototype. AUKUS Pillar II integration is underway. When one ally fields swarms, others follow.

    Your undersea systems ready for autonomous wolfpacks? Supply chain mapped for robotic production? Traditional shipyards preparing for obsolescence?

    Speed kills. Above and below the waterline.

  • Asif Razzaq

    Founder @ Marktechpost (AI Dev News Platform) | 1 Million+ Monthly Readers

    35,056 followers

    University of Michigan Researchers Introduce OceanSim: A High-Performance GPU-Accelerated Underwater Simulator for Advanced Marine Robotics

    Researchers from the University of Michigan have proposed OceanSim, a high-performance underwater simulator accelerated by NVIDIA parallel computing technology. Built upon NVIDIA Isaac Sim, OceanSim leverages high-fidelity, physics-based rendering and GPU-accelerated real-time ray tracing to create realistic underwater environments. It bridges underwater simulation with the rapidly expanding NVIDIA Omniverse ecosystem, enabling the application of existing sim-ready assets and robot-learning approaches within underwater robotics research. Moreover, OceanSim allows the user to operate the robot, visualize sensor data, and record data simultaneously during GPU-accelerated simulated data generation, while letting users customize underwater environments and robotic sensor configurations.

    OceanSim implements specialized underwater sensor models to complement Isaac Sim's built-in capabilities. These include an image formation model capturing water-column effects across various water types, a GPU-based sonar model with realistic noise simulation for faster rendering, and a Doppler Velocity Log (DVL) model that simulates range-dependent adaptive frequency and dropout behaviors. For imaging sonar, OceanSim utilizes Omniverse Replicator for rapid synthetic data generation, establishing a virtual rendering viewport that retrieves scene geometry information through GPU-accelerated ray tracing.

    Read full article: https://lnkd.in/gjTAkB2b Paper: https://lnkd.in/gEhq-SNQ
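
To make the "range-dependent dropout" idea concrete, here is a rough sketch of what such a DVL model might look like: beyond a maximum bottom-lock range the sensor returns nothing, and within range the dropout probability grows with altitude. This is not OceanSim's actual API; `dvl_measurement`, `max_range`, and the quadratic dropout curve are invented for illustration.

```python
import random

def dvl_measurement(true_velocity, altitude, max_range=50.0, noise_std=0.02,
                    rng=random):
    """Return a noisy per-axis velocity reading, or None on dropout.

    true_velocity: iterable of axis velocities (m/s)
    altitude: height above the seabed (m)"""
    if altitude > max_range:
        return None  # no bottom lock beyond the sensor's range
    # Dropout probability rises with altitude (illustrative quadratic model)
    p_drop = 0.5 * (altitude / max_range) ** 2
    if rng.random() < p_drop:
        return None
    # Otherwise return the true velocity corrupted by Gaussian noise
    return [v + rng.gauss(0.0, noise_std) for v in true_velocity]
```

A simulator sensor model like this lets navigation stacks be stress-tested against exactly the failure mode (intermittent velocity loss) they will meet at sea.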

  • Nicholas Nouri

    Founder | Author

    132,612 followers

    Eelume is not just a robot; it's a highly adaptable autonomous underwater vehicle equipped with robotic arms. Designed to navigate the challenging and confined spaces of subsea structures, it offers a new level of efficiency and safety in underwater operations.

    ✨ 𝐊𝐞𝐲 𝐅𝐞𝐚𝐭𝐮𝐫𝐞𝐬 𝐚𝐧𝐝 𝐈𝐧𝐧𝐨𝐯𝐚𝐭𝐢𝐨𝐧𝐬:
    Autonomous Functionality: Eelume vehicles can travel autonomously over long distances underwater, making them well suited to extensive inspection tasks without human intervention.
    Advanced Maneuverability: With its snake-like design, Eelume can access tight spots that are typically challenging or risky for human divers or bulkier machines.
    Equipped with High-Tech Sensors: These robots carry sensors capable of performing detailed environmental surveys, detecting anomalies like gas leaks or oil spills, and monitoring temperature changes.
    Eco-Friendly Operations: By eliminating the need for surface vessels in many scenarios, Eelume not only reduces operational costs but also significantly cuts the carbon footprint of marine operations.

    🚀 𝐓𝐡𝐞 𝐃𝐞𝐯𝐞𝐥𝐨𝐩𝐦𝐞𝐧𝐭 𝐚𝐧𝐝 𝐈𝐦𝐩𝐚𝐜𝐭: Originating from the Norwegian University of Science and Technology (NTNU) in 2015, Eelume was developed to tackle specific challenges in the underwater industry. In collaboration with giants like Kongsberg Maritime and Equinor, its technology has evolved to meet the rigorous demands of underwater inspection and maintenance, particularly for offshore wind farms and oil & gas facilities.

    📈 𝐁𝐞𝐧𝐞𝐟𝐢𝐭𝐬 𝐭𝐨 𝐈𝐧𝐝𝐮𝐬𝐭𝐫𝐲:
    Safety: Reduces the need for human divers in potentially hazardous underwater environments.
    Cost Reduction: Minimizes the need for expensive surface support vessels and reduces operational costs.
    Environmental Monitoring: Enhances the ability to monitor and respond to environmental issues quickly and effectively.
    Operational Efficiency: Increases the frequency and quality of inspections and maintenance, which can extend the lifespan of critical infrastructure.
🤔 𝐂𝐨𝐧𝐬𝐢𝐝𝐞𝐫𝐚𝐭𝐢𝐨𝐧𝐬: While the technology promises a transformation in subsea operations, its implementation requires upfront investment in robotics and training. Moreover, integration with existing systems must be managed carefully to maximize the benefits. How do you see autonomous technologies like Eelume affecting the future of marine research and industry operations? Could this be the key to safer, more sustainable underwater work? #robotics #technology #sustainability #innovation #ai
