Data Center Cooling Solutions

Explore top LinkedIn content from expert professionals.

  • View profile for Andy Jassy
    Andy Jassy is an Influencer
    1,036,488 followers

    Every cloud provider faces the same AI infrastructure challenge: chips need to be positioned close together to exchange data quickly, but they generate intense heat, creating unprecedented cooling demands. We needed a strategic solution that allowed us to use our existing air-cooled data centers for liquid cooling without waiting for new construction. And it needed to be rapidly deployed so we could bring customers these powerful AI capabilities while we transition towards facility-level liquid cooling.

    Think of a home where only one sunny room needs AC while the rest stays naturally cool – that’s what we wanted to achieve, allowing us to efficiently land both liquid- and air-cooled racks in the same facilities with complete flexibility.

    The available options weren't great. We could either wait to build specialized liquid-cooled facilities or adopt off-the-shelf solutions that didn't scale or meet our unique needs. Neither worked for our customers, so we did what we often do at Amazon… we invented our own solution. Our teams designed and delivered our In-Row Heat Exchanger (IRHX), which uses a direct-to-chip approach with a "cold plate" on the chips. The liquid runs through this sealed plate in a closed loop, continuously removing heat without increasing water use. This enables us to support traditional workloads and demanding AI applications in the same facilities. By 2026, our liquid-cooled capacity will grow to over 20% of our ML capacity, which is at multi-gigawatt scale today.

    While liquid cooling technology itself isn't unique, our approach was. Creating something this effective that could be deployed across our 120 Availability Zones in 38 Regions was significant. Because this solution didn't exist in the market, we developed a system that enables greater liquid cooling capacity with a smaller physical footprint, while maintaining flexibility and efficiency.

    Our IRHX can support a wide range of racks requiring liquid cooling, uses 9% less water than fully air-cooled sites, and offers a 20% improvement in power efficiency compared to off-the-shelf solutions. And because we invented it in-house, we can deploy it within months in any of our data centers, creating a flexible foundation to serve our customers for decades to come. Reimagining and innovating at scale is something Amazon has done for a long time, and one of the reasons we've been the leader in technology infrastructure and data center invention, sustainability, and resilience. We're not done… there's still so much more to invent for customers.
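    A closed-loop, direct-to-chip system like the one described carries heat away in the coolant itself, so the required flow rate follows directly from the heat load and the allowed temperature rise. AWS has not published IRHX flow figures; the sketch below is a generic sizing calculation with illustrative numbers only, assuming a water-like coolant.

```python
# Back-of-the-envelope sizing for a direct-to-chip liquid cooling loop.
# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)
# All numbers are illustrative, not vendor specifications.

def coolant_flow_lpm(heat_kw: float, delta_t_c: float,
                     cp_j_per_kg_k: float = 4186.0,   # water-like coolant
                     density_kg_per_l: float = 1.0) -> float:
    """Volume flow (L/min) needed to carry `heat_kw` of heat with a
    coolant temperature rise of `delta_t_c` degrees Celsius."""
    mdot_kg_s = (heat_kw * 1000.0) / (cp_j_per_kg_k * delta_t_c)
    return mdot_kg_s / density_kg_per_l * 60.0

# A hypothetical 100 kW AI rack with a 10 °C coolant temperature rise:
print(round(coolant_flow_lpm(100.0, 10.0), 1))  # ~143.3 L/min
```

    The calculation also shows why warmer ("hot tub" temperature) loops help: a larger allowed temperature rise means proportionally less coolant has to be pumped.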

  • View profile for Rich Miller

    Authority on Data Centers, AI and Cloud

    48,437 followers

    AWS Builds Custom Liquid Cooling System for Data Centers

    Amazon Web Services (AWS) is sharing details of a new liquid cooling system to support high-density AI infrastructure in its data centers, including custom designs for a coolant distribution unit and an engineered fluid. “We've crossed a threshold where it becomes more economical to use liquid cooling to extract the heat,” said Dave Klusas, AWS’s senior manager of data center cooling systems, in a blog post.

    The AWS team considered multiple vendor liquid cooling solutions but found none met its needs, so it began designing a completely custom system, which was delivered in 11 months, the company said. The direct-to-chip solution uses a cold plate placed directly on top of the chip. The coolant, a fluid specifically engineered by AWS, runs in tubes through the sealed cold plate, absorbing the heat and carrying it out of the server rack to a heat rejection system, and then back to the cold plates. It’s a closed-loop system, meaning the liquid continuously recirculates without increasing the data center’s water consumption.

    AWS also developed a custom coolant distribution unit, which it said is more powerful and more efficient than its off-the-shelf competitors. “We invented that specifically for our needs,” Klusas says. “By focusing specifically on our problem, we were able to optimize for lower cost, greater efficiency, and higher capacity.” Klusas said the liquid typically runs at “hot tub” temperatures for improved efficiency.

    AWS has shared details of its process, including photos: https://lnkd.in/e-D4HvcK

  • View profile for AZIZ RAHMAN

    Strategic Mechanical Engineering Consultant | 32 Years in Heavy Manufacturing, Plant Engineering & QA/QC | Former SUPARCO Leader | Helping Manufacturers Optimize Operations & Scalability | Open for strategic consultancy.

    37,616 followers

    THE TECHNOLOGY BEHIND FLUORINATED INSULATION LIQUIDS AND IMMERSION COOLING

    1. Fluorinated insulation liquids are engineered fluids that do not conduct electricity, making them ideal for cooling electronics directly.
    2. These liquids are chemically inert, meaning they don’t corrode or react with components, ensuring long-term reliability.
    3. They have high dielectric strength, allowing safe immersion of high-voltage devices like servers, transformers, and supercomputers.
    4. Used in immersion cooling, hardware is fully or partially submerged in the liquid to efficiently dissipate heat.
    5. These liquids typically include perfluorocarbons (PFCs) or fluoroketones, which are stable and thermally efficient.
    6. Immersion cooling eliminates the need for traditional fans or air conditioning, drastically reducing energy consumption.
    7. The liquids have low viscosity, allowing better flow and even heat distribution around all hardware surfaces.
    8. Fluorinated liquids are non-flammable and thermally stable up to high temperatures, making them safe in demanding environments.
    9. In data centers, immersion cooling using these fluids allows for higher server density, saving space and infrastructure costs.
    10. These liquids are reusable and recyclable, lowering long-term operating and environmental costs.
    11. They support quiet operations since there are no moving fan parts or airflow systems involved.
    12. Fluorinated liquids can also have low global warming potential when designed with modern eco-safe formulations.
    13. They are used in modular data centers, edge computing stations, and blockchain mining farms for heat control.
    14. The technology supports zero water usage, unlike traditional cooling towers that consume large volumes.
    15. These liquids allow precise thermal control, even in overclocked or mission-critical systems.
    16. They're ideal for cooling GPU-intensive tasks like AI processing, VR simulations, and scientific computing.
    17. In telecom and defense, immersion cooling using fluorinated liquids offers high system reliability in harsh environments.
    18. The liquids are easy to monitor and maintain with sensors that track clarity, temperature, and level.
    19. With no air required, there’s no dust buildup, keeping systems cleaner and reducing maintenance cycles.
    20. Fluorinated insulation liquids are pushing the future of sustainable high-performance computing, where silence meets power.
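    The thermal advantage in points 4 and 6 comes down to volumetric heat capacity: how much heat a given volume of coolant carries per degree of temperature rise. The comparison below uses typical literature property values (not any vendor's datasheet) for air, water, and an FC-72-class perfluorocarbon.

```python
# Why immersion beats air: volumetric heat capacity (rho * c_p) determines
# how much heat a unit volume of coolant carries per kelvin of temperature
# rise. Property values are typical literature figures, not vendor specs.

fluids = {
    # name: (density kg/m^3, specific heat J/(kg*K))
    "air":               (1.2,    1005.0),
    "water":             (998.0,  4186.0),
    "fluorinated fluid": (1680.0, 1100.0),  # FC-72-class perfluorocarbon
}

for name, (rho, cp) in fluids.items():
    vol_cap_kj = rho * cp / 1000.0  # kJ per m^3 per K
    print(f"{name:18s} {vol_cap_kj:10.1f} kJ/(m^3*K)")
```

    Water carries the most heat per volume, but it conducts electricity; the fluorinated fluid carries roughly three orders of magnitude more heat per volume than air while remaining dielectric, which is the combination immersion cooling exploits.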

  • View profile for Lubomila J.
    Lubomila J. is an Influencer

    Group CEO Diginex │ Plan A │ Greentech Alliance │ MIT Under 35 Innovator │ Capital 40 under 40 │ BMW Responsible Leader │ LinkedIn Top Voice

    168,235 followers

    The Water Footprint of AI: Why We Need to Pay Attention to Its Environmental Cost

    As artificial intelligence continues to advance, its environmental impact, particularly concerning water consumption in data centres, warrants attention.

    Understanding AI's Water Usage

    AI models, especially large language models, require substantial computational resources. This computing power, concentrated in data centres, generates significant heat, necessitating extensive cooling, often through water-based systems.
    - Per-Query Water Usage: Each interaction with AI models like ChatGPT consumes water. For instance, a 20-50 question session can use approximately 500 millilitres of water, primarily for cooling purposes.
    - Industry Impact: Data centres globally consumed over 660 billion litres of water in 2022 to cool servers running various services, including AI workloads.

    Key Areas of Concern

    1. Water Scarcity: Many data centres are located in regions with limited water resources. In areas like California, where numerous tech companies operate, water-intensive cooling for AI adds strain to local supplies.
    2. Seasonal Impact: During summer, data centres often double their water usage to maintain optimal temperatures. With climate change leading to more frequent heatwaves, this demand could increase, exacerbating the impact.
    3. Comparative Impact: Training large AI models can consume up to five times more water than traditional data centre operations, highlighting the need for efficient resource management.

    Steps Toward Sustainability

    To foster a more sustainable AI ecosystem, the tech industry can consider the following measures:
    1. Adopt Alternative Cooling Solutions: Implementing methods like liquid immersion cooling, direct air cooling, and recycled water systems can reduce water demands by up to 90% in certain environments.
    2. Enhance Transparency and Accountability: Publicly reporting water usage and environmental impact data fosters accountability and enables informed consumer choices. Currently, only a few tech giants release detailed sustainability reports on water use.
    3. Optimise Model Efficiency: Redesigning models to perform with lower computational intensity can significantly reduce both water and energy requirements. Model efficiency improvements, even of 10-15%, can save millions of litres of water annually.

    While AI offers transformative benefits across various sectors, it's crucial to balance its growth with responsible resource use. Focusing on sustainable AI practices is essential not only for environmental preservation but also for the technology's long-term viability. By embracing these strategies, we can ensure AI's advancement doesn't come at the expense of our planet's resources.

    Visual: The Times

    #ai #waterconsumption #sustainability #datacenters #environmentalimpact #greenai
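    The per-query figure above implies a simple range calculation. The sketch below just works through that arithmetic, scaling to a purely hypothetical one billion queries per day to show why small per-query amounts matter in aggregate.

```python
# Rough per-query water estimate from the figures cited in the post:
# ~500 mL of water for a 20-50 question session. Illustrative arithmetic
# only; the 1-billion-queries/day scale is a hypothetical, not a statistic.

session_ml = 500.0
for questions in (20, 50):
    per_query_ml = session_ml / questions
    daily_litres = per_query_ml / 1000.0 * 1e9  # hypothetical 1B queries/day
    print(f"{questions} questions/session: {per_query_ml:.0f} mL/query "
          f"-> {daily_litres / 1e6:.0f} million L/day at 1B queries")
```

    At the cited figures, a single query costs 10-25 mL, roughly a few sips of water, yet the same figure scaled to billions of daily queries reaches tens of millions of litres.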

  • View profile for Pooja Jain

    Open to collaboration | Storyteller | Lead Data Engineer@Wavicle| Linkedin Top Voice 2025,2024 | Linkedin Learning Instructor | 2xGCP & AWS Certified | LICAP’2022

    194,445 followers

    Curious to know how data engineers impact your daily life? The invisible architects behind every digital experience you love.

    🎯 𝗪𝗵𝗮𝘁 𝗗𝗮𝘁𝗮 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝘀 𝗗𝗼
    • Build data pipelines that process millions of transactions per second
    • Design scalable systems that handle massive amounts of real-time information
    • Create recommendation engines that power personalized experiences
    • Implement security systems that protect your personal data
    • Optimize performance to ensure apps load instantly

    📱 𝗥𝗲𝗮𝗹-𝗪𝗼𝗿𝗹𝗱 𝗜𝗺𝗽𝗮𝗰𝘁: Your Daily Digital Journey

    𝗠𝗼𝗿𝗻𝗶𝗻𝗴: 𝗡𝗲𝘁𝗳𝗹𝗶𝘅 𝗥𝗲𝗰𝗼𝗺𝗺𝗲𝗻𝗱𝗮𝘁𝗶𝗼𝗻𝘀 🎬
    • The Experience: “Recommended for You” appears with perfect suggestions
    • Behind the Scenes: data engineers process viewing history from 230M+ subscribers; real-time analysis of watch patterns, ratings, and genre preferences; machine learning models updated continuously with fresh data

    𝗖𝗼𝗺𝗺𝘂𝘁𝗲: 𝗚𝗼𝗼𝗴𝗹𝗲 𝗠𝗮𝗽𝘀 𝗧𝗿𝗮𝗳𝗳𝗶𝗰 🗺️
    • The Experience: Real-time traffic updates and optimal route suggestions
    • Behind the Scenes: GPS data from millions of phones processed every second; traffic sensor data integrated from thousands of sources; historical pattern analysis predicting congestion before it happens

    𝗦𝗵𝗼𝗽𝗽𝗶𝗻𝗴: 𝗣𝗲𝗿𝘀𝗼𝗻𝗮𝗹𝗶𝘇𝗲𝗱 𝗘-𝗰𝗼𝗺𝗺𝗲𝗿𝗰𝗲 🛒
    • The Experience: “Customers also bought” and dynamic pricing
    • Behind the Scenes: purchase history analysis across millions of transactions; inventory management systems updating stock levels instantly; price optimization algorithms running 24/7

    𝗕𝗮𝗻𝗸𝗶𝗻𝗴: 𝗙𝗿𝗮𝘂𝗱 𝗗𝗲𝘁𝗲𝗰𝘁𝗶𝗼𝗻 💰
    • The Experience: Instant transaction approvals with security alerts for suspicious activity
    • Behind the Scenes: millisecond analysis of spending patterns; machine learning models detecting anomalies in real time; risk scoring systems protecting billions in transactions daily

    💡 Are you ready to explore Data Engineering?

    For Aspiring Engineers:
    • Start with SQL and Python fundamentals
    • Practice with cloud platforms (AWS/GCP offer free tiers)
    • Build portfolio projects using real datasets
    • Join data engineering communities and bootcamps

    For Businesses:
    • Assess your current data infrastructure needs
    • Consider partnering with experienced data engineering teams
    • Invest in scalable cloud architectures from day one

    What’s your real-time experience that’s making an impact as a data engineer?

    ▶️ Stay tuned with me (Pooja) for more on Data Engineering.
    ♻️ Reshare if this resonates with you!

    #data #engineering
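    The fraud-detection pattern mentioned above, scoring each transaction against an account's spending history, can be illustrated with a toy z-score check. Production systems run this in streaming frameworks over far richer features; this self-contained sketch only shows the core idea, and all names and numbers are made up.

```python
# Toy version of real-time fraud scoring: compare a new transaction amount
# against the account's prior spending distribution. Higher score = more
# suspicious. Real systems use many more features than the raw amount.
from statistics import mean, stdev

def anomaly_score(history: list[float], amount: float) -> float:
    """Z-score of `amount` versus prior spending; 0.0 if history is too short."""
    if len(history) < 2:
        return 0.0
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else abs(amount - mu) / sigma

spending = [42.0, 15.5, 60.0, 38.0, 22.0]  # hypothetical purchase history
print(anomaly_score(spending, 35.0))   # routine purchase: low score
print(anomaly_score(spending, 950.0))  # outlier: high score, likely flagged
```

    A production pipeline would keep these statistics as rolling aggregates updated per event, which is exactly the kind of stateful stream processing data engineers build.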

  • View profile for Alyson Freeman, Ph.D.

    AI | Data Center | Energy | Board Member | Science Communicator | Keynote Speaker | Women in Technology

    12,600 followers

    The theme for Earth Day this year is "Our Power, Our Planet." I know it's meant to be about our individual power to drive change, but it immediately made me think about one of the biggest discussions about power in the news these days: new data centers.

    People are concerned that data centers will strain local power grids and water supplies, create persistent noise pollution from cooling systems, and receive significant tax breaks without providing substantial long-term employment or protecting the community's aesthetic character. This development is happening so quickly, people rightfully want to understand the impacts. They want to know, five years down the road, "how is it going to impact my life, my family and my neighbors?" And I want to help them.

    In my opinion, instead of cities saying "no data centers," they should be saying "no data centers unless..." they:
    - Implement closed-loop or "waterless" cooling architectures that achieve a Water Usage Effectiveness (WUE) near zero, eliminating the evaporation of millions of gallons of local fresh water.
    - Integrate waste heat recovery systems to provide "free" thermal energy to local district heating networks, industrial processes, or greenhouses, turning a byproduct into a community asset.
    - Commit to 24/7 carbon-free energy (CFE) procurement, matching their hourly demand with local clean generation rather than relying on yearly offsets that can strain the local grid during peak hours.
    - Prioritize liquid-to-chip or immersion cooling for AI-heavy workloads, which significantly reduces the energy overhead required for thermal management compared to traditional air-cooled systems.
    - Utilize reclaimed or "grey" water for any supplemental cooling needs, ensuring that data center operations do not compete with residents for the municipal potable water supply.

    There are nuances here: because cooling systems that save electricity often consume significant amounts of water (evaporative cooling), and systems that save water often require more electricity (dry cooling), a "one size fits all" policy doesn't work. The right way to manage a data center depends on where it is. In places with very little water, saving that water is the top priority, even if the center ends up using more electricity.

    Instead of banning these projects entirely, cities should use their control over the power grid to demand better environmental deals. Since we depend on this technology every day, the goal should be to find ways to build data centers that do not hurt the local community. Our power, our planet.

    #OurPowerOurPlanet #EarthDay2026 #DataCenters
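    The water-versus-electricity trade-off described above is usually quantified with two metrics from The Green Grid: PUE (total facility energy over IT energy, ideal 1.0) and WUE (litres of site water per kWh of IT energy, ideal 0). The numbers below are invented examples for two hypothetical sites, not measurements of any real facility.

```python
# PUE and WUE, the standard metrics behind the post's conditions.
#   PUE = total facility energy / IT equipment energy   (ideal -> 1.0)
#   WUE = site water used (L)  / IT equipment energy (kWh)  (ideal -> 0)
# All figures below are hypothetical illustrations.

def pue(total_kwh: float, it_kwh: float) -> float:
    return total_kwh / it_kwh

def wue(water_litres: float, it_kwh: float) -> float:
    return water_litres / it_kwh

it_kwh = 100_000_000  # hypothetical annual IT load, kWh

# Evaporative-cooled site: low PUE, but heavy water use.
print(pue(115_000_000, it_kwh), wue(180_000_000, it_kwh))  # 1.15 1.8

# Dry / closed-loop site: near-zero water, but more fan and chiller energy.
print(pue(130_000_000, it_kwh), wue(2_000_000, it_kwh))    # 1.3 0.02
```

    Put side by side, the two hypothetical sites make the policy point concrete: neither dominates on both metrics, so which one a city should demand depends on local water scarcity and grid capacity.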

  • View profile for John W Mitchell

    Electronics Industry Champion | Standards | Workforce Advocate | Speaker | Author | CEO

    14,836 followers

    We talk a lot about AI models, but not nearly enough about the hardware that makes them possible. This story deserves a closer look. Microsoft, working with Swiss startup Corintis, just developed a bio-inspired microfluidic cooling system that channels liquid directly across a chip’s surface, removing heat up to three times more efficiently than traditional cold plates. The payoff? Cooler chips, denser servers, and datacenters that use less energy to do the same work. It’s a perfect example of systems thinking: silicon, coolant, server, and data center all working together as one ecosystem. When we start thinking that way, we don’t just improve performance, we open the door to entirely new architectures, new efficiencies, and new kinds of innovation. http://bit.ly/42uhue3

  • View profile for Erick Hadi

    KOL in Data Center & Digital Infrastructure || Founder at Nusantara Academy || Producer at Podcast Nusantara || CEO at Electronic Science Indonesia // DCP® // DCS® // CSAA®

    25,465 followers

    During the making of this CNA feature, I was asked what kind of #incentives are needed to attract more data center operators to Indonesia. My answer: 𝗜𝗻𝗰𝗲𝗻𝘁𝗶𝘃𝗲𝘀 𝗺𝘂𝘀𝘁 𝗿𝗲𝗱𝘂𝗰𝗲 𝘂𝗻𝗰𝗲𝗿𝘁𝗮𝗶𝗻𝘁𝘆, 𝗻𝗼𝘁 𝗷𝘂𝘀𝘁 𝗿𝗲𝗱𝘂𝗰𝗲 𝗰𝗼𝘀𝘁.

    On the fiscal side: competitive #tax incentives, accelerated depreciation for #green infrastructure, and clear #customs policies for critical equipment matter. But equally important are non-fiscal incentives: #policy consistency, faster #permitting, transparent access to power and land, and most importantly… 𝗮 𝗿𝗲𝗹𝗶𝗮𝗯𝗹𝗲 𝗹𝗼𝗰𝗮𝗹 𝘁𝗮𝗹𝗲𝗻𝘁 𝗽𝗶𝗽𝗲𝗹𝗶𝗻𝗲. Investors come not only because Indonesia is cost-effective, but because it is predictable, sustainable, and capable of operating data centers at global #standards. Human capital is the strongest incentive of all.

    Indonesia has a young and capable #workforce, yet the education system has not moved at the same speed as the data center industry. Most graduates are strong in theory, but data centers demand hands-on operational readiness: power systems, cooling, uptime discipline, safety culture, and incident response. These are not skills learned from textbooks alone.

    Modern data centers are no longer just IT facilities. They are energy-intensive infrastructure. Operators now need professionals who understand not only how servers run, but how energy flows: from efficiency and renewable integration to #carbon accountability and long-term sustainability planning. For us, the future data center engineer is a hybrid professional: technically excellent, energy-aware, and environmentally responsible.

    At Nusantara Data Center Academy, we focus on closing this gap by bridging education with real operational standards, so Indonesian talent is not just employable, but globally competitive. As shared in the report:

    "We don't just teach theory. We provide real experience. Besides (becoming) trainers, we're also industry practitioners." — Nawi Jaya
    "Students learn how to operate and manage data centers. Those talents can be absorbed into the market." — Stephanus Oscar
    "We are aiming across the board in Indonesia. This is a program for the nation." — Dharma Simorangkir

    Appreciation to Microsoft for continuously supporting the workforce development program. Special mention to Arina Dafir and Dania Rari Pratiwi.

    OJT Partners: AREA 31 BDx Data Centers Bitera DC DCI Indonesia Digital Edge DC Indodata K2 STRATEGIC NeutraDC PT CBN Nusantara ( NEX Datacenter ) Princeton Digital Group Pure Data Centres Group

    Certification Partners: iTEP International DCD Academy

    Data Center Facilitators: Azbil Southeast Asia & India Centiel Collega Inti Pratama, PT Digital Edge DC Haskoning ISS Indonesia Microsoft Sunway Digital Indonesia STULZ Socomec Group Sumitomo Mitsui Construction Co. Ltd Turner & Townsend Yondr Group

    Watch the full CNA feature: https://lnkd.in/gEpDcKzs

  • View profile for Sandeep Y.

    Bridging Tech and Business | Transforming Ideas into Multi-Million Dollar IT Programs | PgMP, PMP, RMP, ACP | Agile Expert in Physical infra, Network, Cloud, Cybersecurity to Digital Transformation

    6,876 followers

    The data centre isn't overheating. It's boiling like a tandoor.

    Why? Because air cooling taps out at 30–40 kW/rack. And your AI racks? They're pushing 80–100 kW, as if they've something to prove. PAC units? Bro, they're industrial-sized jugaad (improvised fixes) with a maintenance contract at this point.

    Enter: Direct-to-Chip Cooling
    → Liquid pumped straight to the CPU
    → No hot aisle drama
    → Heat pulled directly off the silicon: no airflow guesswork, no aisle math

    Why it matters:
    → Drops PUE to <1.1 (yes, that low)
    → Handles racks punching 100 kW+
    → No more "bhaiya, chill water flow badha do" ("brother, turn up the chilled water") every summer

    It's not just a science fair demo anymore. Meta, Microsoft, and government labs are already deploying it. Because guess what? AI workloads don't care about your airflow diagrams; they care about thermals.

    Before you go full coolant-core:
    – Identify thermal thugs (dense racks, rogue GPUs)
    – Check if racks can handle plumbing (leaks ≠ features)
    – Rethink PDU layout (liquid + electricity = nope)
    – Budget for literal plumbing (pipes ≠ low-code)
    – Train ops in leak control, not just log control
    – Expect supply chain delays (valves don't autoscale)
    – And yes, keep drip trays. For hardware and emotions.

    Would you run coolant through a $250K server? Too late. The CIO already said, "Let's innovate," while Procurement screamed, "PO raised."

    Liquid cooling: because airflow is cute until GPUs start boiling chai.

    P.S. You used to fear downtime. Now, you fear thermal maps.
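    The triage step in the post ("identify thermal thugs") reduces to a density check against what air can handle. The sketch below uses the post's own rough figure (air tops out around 30–40 kW/rack) as a threshold; the rack names and power draws are hypothetical.

```python
# Flag racks whose power density exceeds what air cooling can realistically
# handle. The 30-40 kW/rack range is the post's rough figure, not a standard;
# the midpoint is used here as a simple illustrative threshold.

AIR_COOLING_LIMIT_KW = 35.0

def cooling_strategy(rack_kw: float) -> str:
    if rack_kw <= AIR_COOLING_LIMIT_KW:
        return "air"
    # Above the air limit, direct-to-chip liquid is the usual next step;
    # full immersion is an option at the extreme end.
    return "direct-to-chip liquid"

racks = {"web tier": 12.0, "storage": 25.0, "AI training": 95.0}  # hypothetical
for name, kw in racks.items():
    print(f"{name}: {kw} kW -> {cooling_strategy(kw)}")
```

    In practice the decision also weighs facility plumbing, PDU layout, and leak-response readiness, exactly the checklist items the post lists.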

  • View profile for Sainath H.

    145,000+ Followers I Sr. Application Engineer at Grindwell Norton (Saint-Gobain) I IoT, Data, Analytics & AI Solutions I Manufacturing Digitalization Strategist I Industrial Innovation Updates

    145,787 followers

    The idea of submerging computer servers in a liquid coolant to cut data center energy consumption by 70% is a breakthrough in sustainable tech innovation. Traditional cooling systems consume significant energy, but with non-conductive liquid coolants, it's possible to safely dissipate heat while keeping electrical circuits dry and operational. This method optimizes thermal management, capturing all the generated heat and drastically reducing the need for conventional fans and chillers. Sandia National Laboratories' approach could set a new standard for energy efficiency in data centers, making them greener and more cost-effective. Florian Palatini ++
