Computational Materials Science

Explore top LinkedIn content from expert professionals.

Summary

Computational materials science uses powerful computer models and artificial intelligence to predict, design, and understand the properties of materials at the atomic level. This field is transforming how new materials are discovered, making the process faster and more precise by combining physics-based simulations with data-driven approaches.

  • Embrace AI-driven discovery: Use machine learning models and robust materials databases to screen thousands of candidates quickly and pinpoint promising options for batteries, alloys, and advanced materials.
  • Integrate physics and data: Combine traditional simulations with innovative neural networks that embed physical laws, leading to more reliable predictions even with limited experimental data.
  • Leverage real-time feedback: Implement autonomous laboratories and digital twins to automate experimentation and guide material design, speeding up innovation cycles and improving data quality.
Summarized by AI based on LinkedIn member posts
  • View profile for Alexey Navolokin

    FOLLOW ME for breaking tech news & content • helping usher in tech 2.0 • at AMD for a reason w/ purpose • LinkedIn persona •

    778,902 followers

    AI isn’t just writing code anymore. It’s inventing matter. Material science used to be painfully slow — 10–20 years from discovery to deployment. What do you think about this animation? AI flipped that timeline.

    Today:
    • ML models screen millions of material candidates in days, not decades
    • Databases like the Materials Project now contain 150,000+ computed materials ready for AI-driven discovery
    • AI-accelerated simulations run 100–1,000× faster than traditional quantum methods
    • In batteries alone, AI has helped identify materials that cut discovery cycles by ~70%
    • Autonomous labs can test hundreds of formulations per week, learning in real time

    This is how we get:
    + higher-density, longer-life batteries
    + aerospace alloys that are lighter and stronger
    + chips with better thermal performance at smaller nodes
    + low-carbon cement, recyclable plastics, and rare-element replacements

    The next breakthroughs in AI, energy, climate tech, and hardware won’t come from software alone. They’ll come from materials designed by AI. We’re no longer just training models. We’re training the building blocks of reality.

    #AI #MaterialScience #DeepTech #AdvancedManufacturing #Semiconductors
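
    As an illustration of the screening step described above, here is a minimal sketch of ranking candidate materials with a trained surrogate model. The file names, descriptor columns, and target property are hypothetical placeholders, and scikit-learn's random forest stands in for whatever model a real pipeline would use.

```python
# Minimal sketch of ML-based candidate screening (illustrative only).
# File names, descriptor columns, and the target property are hypothetical
# placeholders; a random forest stands in for whatever surrogate a real
# pipeline would use.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

train = pd.read_csv("labeled_materials.csv")         # materials with known property values
candidates = pd.read_csv("candidate_materials.csv")  # unexplored candidates to rank

features = ["mean_atomic_mass", "density", "band_gap_est"]
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(train[features], train["ionic_conductivity"])

# Rank the unexplored candidates by predicted property and keep the top 100.
candidates["predicted"] = model.predict(candidates[features])
shortlist = candidates.nlargest(100, "predicted")
print(shortlist[["formula", "predicted"]].head())
```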

  • View profile for Jorge Bravo Abad

    AI/ML for Science & DeepTech | Prof. of Physics at UAM | Author of “IA y Física” & “Ciencia 5.0”

    29,006 followers

    A foundation model for atomistic materials chemistry

    Atomistic simulations are the “engines” behind much of modern materials science. They tell us how atoms move, how bonds break, how defects migrate. But there’s a catch: accurate ab initio methods like density functional theory (DFT) are extremely expensive, while traditional force fields are fast but miss important quantum-mechanical effects. Most machine-learning force fields sit somewhere in between—but usually have to be rebuilt from scratch for each material or chemistry.

    Ilyes Batatia and coauthors present MACE-MP-0, a foundation model for atomistic materials chemistry. Instead of training a separate model for every system, they train one general model on a huge dataset of DFT trajectories from the Materials Project, then reuse it across chemistry. Under the hood, MACE-MP-0 is an equivariant graph neural network that predicts energies and forces, effectively replacing repeated DFT calls with a learned surrogate.

    What’s remarkable is how far this single model goes out of the box. It runs stable molecular dynamics for solids, liquids, gases, interfaces, catalysts, metal–organic frameworks—even a small protein—often reproducing DFT trends without any task-specific retraining. And when full DFT-level accuracy is needed, fine-tuning on just a small number of new configurations is usually enough to closely match reference energies and forces, at a fraction of the cost of building a new potential.

    Instead of treating every new material as a fresh modeling project, approaches like MACE-MP-0 turn DFT-quality simulation into a reusable backbone: pretrain once on large ab initio datasets, then adapt quickly to new chemistries and conditions. For atomistic modeling, that’s a step change in how we explore materials space—both for expert groups and for teams just starting to adopt ML force fields.

    Paper: https://lnkd.in/dTBWiQHv

    #AI #MachineLearning #MaterialsScience #ComputationalChemistry #DFT #MolecularDynamics #MLForceFields #AtomisticModeling #FoundationModels #GraphNeuralNetworks #AIforScience #MaterialsDiscovery #Catalysis #BatteryResearch #ResearchInnovation
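
    For readers who want to try a pretrained force field of this kind, the sketch below runs short molecular dynamics with a MACE foundation model through ASE. It assumes the mace-torch package and its mace_mp() ASE-calculator helper; treat the exact names and arguments as indicative and check the MACE documentation before relying on them.

```python
# Sketch: short molecular dynamics with a pretrained MACE foundation model via ASE.
# Assumes the mace-torch package and its mace_mp() ASE-calculator helper;
# verify names/arguments against the MACE documentation before use.
from ase import units
from ase.build import bulk
from ase.md.langevin import Langevin
from mace.calculators import mace_mp

atoms = bulk("Cu", "fcc", a=3.6).repeat((3, 3, 3))   # small copper supercell
atoms.calc = mace_mp(model="medium")                  # pretrained foundation model

dyn = Langevin(atoms, timestep=1.0 * units.fs,
               temperature_K=300, friction=0.01)
dyn.run(1000)  # ~1 ps of MD with no system-specific training
```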

  • View profile for Pradyumna Gupta

    Building Infinita Lab - Uber of Materials Testing | Driving the Future of Semiconductors, EV, and Aerospace with R&D Excellence | Collaborated in Gorilla Glass's Invention | Material Scientist

    20,789 followers

    AI-Ready materials databases are quietly rewriting how discovery happens.

    The next leap in materials science isn’t coming from a new lab technique or a faster simulator. It’s coming from how materials data is being rebuilt for AI. Databases like the Materials Project are no longer designed for human lookup. Under pressure from DOE materials programs, NVIDIA’s AI-for-science stack, and Google’s ML research teams, they’re being reshaped into AI-native knowledge layers.

    This matters because discovery has changed:
    → Labs can now pre-screen tens of thousands of candidates computationally before touching synthesis.
    → Models predict ion conductivity, phase stability, and defect tolerance by learning from structured, physics-aware data, not raw numbers.
    → Experimental metadata is becoming as valuable as the measurements themselves.

    The real shift is this: AI is starting to understand material behavior, not just fit trends, because the data finally encodes constraints, uncertainty, and failure modes. In batteries, high-entropy alloys, and polymer electrolytes, computation and experiment are merging into a single feedback loop. By 2026, materials databases will not merely support discovery. They will be the discovery pipeline.

    The contrarian truth senior leaders should internalize: the bottleneck in materials innovation was never samples. It was data quality, and that wall is finally breaking. This isn’t a tooling upgrade. It’s an infrastructure shift that will decide who leads the next decade of materials breakthroughs.

    #MaterialScience #AIReadyMaterials #MaterialInnovation
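
    As a concrete example of using such a database programmatically, the sketch below queries the Materials Project for stable Li–O-containing compounds with a sizeable band gap as a toy electrolyte screen. It assumes the mp-api client; the endpoint and field names follow its documented summary search but should be verified against the current API, and the API key is a placeholder.

```python
# Sketch: programmatic screening against the Materials Project via mp-api.
# Field names follow the documented summary endpoint; verify against the
# current API. The API key is a placeholder.
from mp_api.client import MPRester

with MPRester("YOUR_API_KEY") as mpr:
    docs = mpr.materials.summary.search(
        elements=["Li", "O"],            # chemistry filter
        band_gap=(1.0, 10.0),            # wide-gap candidates only
        is_stable=True,                  # on or near the convex hull
        fields=["material_id", "formula_pretty", "band_gap", "energy_above_hull"],
    )

for doc in docs[:10]:
    print(doc.material_id, doc.formula_pretty, doc.band_gap)
```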

  • View profile for Jaafar El-Awady

    Professor at Johns Hopkins University

    4,645 followers

    One of the most widely used principles for predicting polycrystalline metal strength is the Hall-Petch relationship, which correlates average grain size to flow stress or yield strength through an inverse square root law. However, despite being proposed over 70 years ago, its applicability has been debated due to inconsistent experimental and simulation datasets. For example, the database for the most widely studied metal spans around 200 points, with reported power law exponents varying from -0.1 to -1 between different studies. Moreover, as an averaging equation, Hall-Petch does not capture the variability at a given grain size, which can span orders of magnitude.

    To overcome such data scarcity barriers, we developed a computational approach combining physics-based modeling and machine learning. We created a large dataset of over 1 million datapoints mapping nickel's flow stress to microstructural features such as grain sizes, orientations, dislocation densities, sample dimensions, number of grains through the thickness, and strain levels. Gaussian mixture models and neural networks were then trained on this data to enable probabilistic strength predictions as a function of the microstructural features.

    Analysis of this extensive dataset shows excellent agreement with the limited literature data. More importantly, it provides definitive confirmation that the Hall-Petch formulation with an inverse square root law statistically correlates grain size effects beyond a critical length scale. Rigorous feature analysis also identified grain size distribution, not just average size, as critical in governing strength and its variation. Thus, unlike Hall-Petch, the trained machine learning framework captures variability, enabling precise strength statistics. Such computational frameworks can massively accelerate the design of structural materials targeting desired strength characteristics. This methodology will be leveraged in upcoming studies on more advanced systems, demonstrating the power of fusing modeling and data science for materials discovery. Stay tuned!

    #HallPetch #strength #metals #machinelearning #predictions #physicsbased #plasticity

    This work was made possible by Dr. Yejun Gu and in collaboration with Dr. Christopher Stiles. Check it out and let us know if you have any questions or comments.
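
    To make the debated exponent concrete, here is a small sketch that fits a generalized Hall-Petch law, sigma_y = sigma_0 + k * d**n, to grain-size/strength data and reports the recovered exponent (the classical relation corresponds to n = -1/2). The data file and column layout are hypothetical placeholders.

```python
# Sketch: fit a generalized Hall-Petch law, sigma_y = sigma_0 + k * d**n,
# and recover the exponent n. The data file and columns are hypothetical
# placeholders (grain size d in micrometres, yield strength in MPa).
import numpy as np
from scipy.optimize import curve_fit

d, sigma_y = np.loadtxt("grain_size_strength.csv", delimiter=",",
                        skiprows=1, unpack=True)

def hall_petch(d, sigma_0, k, n):
    return sigma_0 + k * d**n

popt, _ = curve_fit(hall_petch, d, sigma_y, p0=[50.0, 200.0, -0.5])
sigma_0, k, n = popt
print(f"sigma_0 = {sigma_0:.1f} MPa, k = {k:.1f}, exponent n = {n:.2f}")
# The classical Hall-Petch relation corresponds to n = -1/2; literature fits
# reportedly range from about -0.1 to -1, as noted in the post.
```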

  • View profile for Sergei Kalinin

    Weston Fulton chair professor, University of Tennessee, Knoxville

    24,861 followers

    ✨🤖⚛️ The Magic of Machine Learning Meets Physics in Materials Science

    One of the most fascinating parts of the course I teach now at the Department of Materials Science and Engineering - University of Tennessee, Knoxville is the most magical area of machine learning, where physics-based models and data-driven methods come together. In Module 4 of the course, I give a basic intro to Physics-Informed Neural Networks (PINNs), Neural ODEs, and Physics-Informed DeepONets (PI-DeepONets). These methods embed the fundamental laws of physics directly into neural networks, making them more data-efficient, generalizable, and interpretable than purely black-box ML 🔍 But what are they, and how do they compare?

    ✅ PINNs: These use neural networks to solve PDEs by enforcing physics constraints in the loss function. Ideal for materials modeling, inverse problems, and parameter discovery, PINNs allow us to extract physical insights from sparse or noisy experimental data.
    ✅ Neural ODEs: Instead of discrete models, Neural ODEs treat dynamical systems as continuous functions, solving them via differentiable solvers. They are powerful for tracking material transformations, reaction kinetics, and predicting time-dependent material behaviors.
    ✅ PI-DeepONets: Unlike PINNs, which solve a single equation, PI-DeepONets learn solution operators, meaning they generalize across families of PDEs. This makes them incredibly useful for rapid simulations, surrogate modeling, and creating digital twins of experimental instruments.

    📊 Where do these methods shine in materials science? When the model is well defined and material properties are known, traditional simulators are often best. But for scientific discovery, automation, and real-time decision-making, ML-based approaches are the future. 🚀 Some standout applications:
    🔬 Materials Discovery: Enabling active learning workflows where experiments guide simulations, and simulations guide experiments—leading to faster discovery of new materials.
    🧪 Digital Twins of Instruments: Creating real-time, data-driven models of experimental tools like electron microscopes or combinatorial synthesis platforms, allowing rapid in silico experimentation.
    ⚛️ Multi-Scale Modeling: Connecting atomic, mesoscale, and continuum models by learning operator mappings, making it possible to bridge different length scales efficiently.
    🔄 Self-Optimizing Experiments: Combining physics-driven ML with reinforcement learning to create autonomous labs that explore material spaces more efficiently than human-driven experiments.

    I am particularly thankful to Shuai Guo, whose set of tutorials makes these methods simple and easy to use. These tools are rapidly becoming practical, and it’s exciting to see how they will redefine materials research and instrument automation in the coming years. If you're working at this intersection—or are curious about it—let’s connect! 💡🚀 https://lnkd.in/eA2vE-Ba
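
    As a minimal illustration of the PINN idea mentioned in the post, the sketch below trains a small PyTorch network to satisfy a simple ODE, du/dx + u = 0 with u(0) = 1, purely by penalizing the physics residual; no labeled solution data is used. It is a toy example, not a materials-scale solver.

```python
# Toy PINN sketch: train u(x) to satisfy du/dx + u = 0 with u(0) = 1 by
# penalizing the physics residual. No labeled solution data is used.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                    nn.Linear(32, 32), nn.Tanh(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 2.0, 100).reshape(-1, 1).requires_grad_(True)
x0 = torch.zeros(1, 1)  # collocation point for the initial condition

for step in range(5000):
    u = net(x)
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u),
                                create_graph=True)[0]
    loss_pde = ((du_dx + u) ** 2).mean()        # ODE residual
    loss_ic = ((net(x0) - 1.0) ** 2).mean()     # enforce u(0) = 1
    loss = loss_pde + loss_ic
    opt.zero_grad()
    loss.backward()
    opt.step()

# net(x) now approximates exp(-x) over [0, 2].
```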

  • View profile for Keith King

    Former White House Lead Communications Engineer, U.S. Dept of State, and Joint Chiefs of Staff in the Pentagon. Veteran U.S. Navy, Top Secret/SCI Security Clearance. Over 16,000+ direct connections & 44,000+ followers.

    43,846 followers

    Headline: AI Is Creating the Next Great Materials—Not Finding Them

    Introduction: A revolution is underway in materials science. Instead of searching the natural world for new substances, scientists are computing them. Artificial intelligence is transforming how materials are conceived, tested, and brought to life—accelerating discoveries once thought to take decades into just weeks.

    From Prediction to Production:
    Massive Discovery Leap: Google DeepMind’s GNoME AI predicted 2.2 million new crystal structures, including 380,000 stable compounds, among them thousands of potential graphene-like materials and battery electrolytes.
    Autonomous Labs in Action: In one demonstration, a self-driving lab synthesized 41 new compounds in 17 days, all first identified by AI—condensing years of trial-and-error into days.
    Shift in Workflow: Traditional R&D relied on luck and labor. Now, algorithms preselect the most promising candidates, while human engineers validate, refine, and scale what the computer imagines.

    Tools Powering the New Alchemy:
    Bayesian Optimization: AI actively decides which experiment to run next, dramatically cutting time and cost. One alloy study found breakthroughs in 36 trials instead of 800,000.
    Graph Neural Networks (GNNs): These AIs map atoms and bonds as networks, predicting material properties with striking accuracy. DeepMind’s models used GNNs to identify hundreds of lab-verified crystals.
    Generative Models: IBM and others now deploy foundation AIs trained on billions of molecular structures, capable of inventing entirely new materials for semiconductors, batteries, and composites.

    In Practice:
    Citrine Informatics helped aerospace engineers create AL 7A77, the first 3D-printable aluminum alloy certified for flight.
    Argonne National Lab’s “Polybot” autonomously produced defect-free conductive polymers using AI-guided experimentation—achieving world-class results.
    Berkeley Lab and DeepMind have already synthesized hundreds of GNoME’s AI-predicted materials, confirming theory with reality.

    The Next Frontier: While AI accelerates discovery, scaling up from micrograms to manufacturable tons remains the bottleneck. The future lies in AI-designed materials “born ready” for industrial production, integrating manufacturability, sustainability, and performance into their digital blueprints.

    Why It Matters: The fusion of AI, robotics, and materials science marks a turning point in human innovation. We are compressing millennia of discovery into months, opening the door to breakthroughs in energy storage, electronics, aerospace, and quantum computing. The next wonder material won’t be mined—it will be computed, tested by robots, and engineered for the world.

    I share daily insights with 29,000+ followers and 10,000+ professional contacts across defense, tech, and policy. If this topic resonates, I invite you to connect and continue the conversation.

    Keith King
    https://lnkd.in/gHPvUttw
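
    The Bayesian-optimization loop mentioned above can be illustrated in a few lines: a Gaussian-process surrogate is refit after each "experiment" and an expected-improvement acquisition picks the next trial. In the sketch below, run_experiment() is a hypothetical stand-in for a real measurement or simulation over a single composition variable.

```python
# Sketch of a Bayesian-optimization experiment-selection loop: a Gaussian
# process surrogate is refit after each trial and expected improvement picks
# the next one. run_experiment() is a hypothetical stand-in for a real
# measurement or simulation over one composition variable in [0, 1].
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_experiment(x):
    return -(x - 0.3) ** 2 + 0.05 * np.sin(20 * x)  # toy objective

X = np.array([[0.1], [0.9]])                        # two seed experiments
y = np.array([run_experiment(x[0]) for x in X])
grid = np.linspace(0.0, 1.0, 200).reshape(-1, 1)

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    mu, std = gp.predict(grid, return_std=True)
    improvement = mu - y.max()
    z = improvement / (std + 1e-9)
    ei = improvement * norm.cdf(z) + std * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, run_experiment(x_next[0]))

print("best trial:", X[np.argmax(y)][0], "objective:", y.max())
```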

  • View profile for Markus J. Buehler

    McAfee Professor of Engineering at MIT; Co-Founder & CTO at Unreasonable Labs; AI-Driven Scientific Discovery

    30,101 followers

    How do materials fail, and how can we design stronger, tougher, and more resilient ones?

    Published in #PNAS, our physics-aware AI model integrates advanced reasoning, rational thinking, and strategic planning capabilities with the ability to write and execute code, perform atomistic simulations to solicit new physics data from “first principles”, and conduct visual analysis of graphed results and molecular mechanisms. By employing a multiagent strategy, these capabilities are combined into an intelligent system designed to solve complex scientific analysis and design tasks, as applied here to alloy design and discovery.

    This is significant because our model overcomes the limitations of traditional data-driven approaches by integrating diverse AI capabilities—reasoning, simulations, and multimodal analysis—into a collaborative system, enabling autonomous, adaptive, and efficient solutions to complex, multiobjective materials design problems that were previously slow, expert-dependent, and domain-specific. Wonderful work by my postdoc Alireza Ghafarollahi!

    Background: The design of new alloys is a multiscale problem that requires a holistic approach involving retrieving relevant knowledge, applying advanced computational methods, conducting experimental validations, and analyzing the results, a process that is typically slow and reserved for human experts. Machine learning can help accelerate this process, for instance through the use of deep surrogate models that connect structural and chemical features to material properties, or vice versa. However, existing data-driven models often target specific material objectives, offering limited flexibility to integrate out-of-domain knowledge, and cannot adapt to new, unforeseen challenges.

    Our model overcomes these limitations by leveraging the distinct capabilities of multiple AI agents that collaborate autonomously within a dynamic environment to solve complex materials design tasks. The proposed physics-aware generative AI platform, AtomAgents, synergizes the intelligence of LLMs and the dynamic collaboration among AI agents with expertise in various domains, incl. knowledge retrieval, multimodal data integration, physics-based simulations, and comprehensive results analysis across modalities. The concerted effort of the multiagent system allows for addressing complex materials design problems, as demonstrated by examples that include autonomously designing metallic alloys with enhanced properties compared to their pure counterparts. We demonstrate accurate prediction of key characteristics across alloys and highlight the crucial role of solid solution alloying in steering the development of alloys.

    Paper: https://lnkd.in/enusweMf
    Code: https://lnkd.in/eWv2eKwS

    MIT Schwarzman College of Computing MIT Civil and Environmental Engineering MIT Department of Mechanical Engineering (MechE) MIT Industrial Liaison Program MIT School of Engineering

  • View profile for Yan Barros

    Building Physics AI Infrastructure for Engineering & Digital Twins | Advisor in Clinical AI & Lunar Systems | Creator of PINNeAPPle | Founder @ ChordIQ

    8,558 followers

    🌟 Starting the Week with an Inspiring Paper!

    Today, let's dive into an intriguing research paper: "Enhanced Physics-Informed Neural Networks for Hyperelasticity". This paper introduces an innovative approach to solving the challenging partial differential equations (PDEs) governing the mechanical behavior of hyperelastic materials. Kudos to the brilliant authors—Diab W. Abueidda, Seid Koric, Erman Guleryuz, and Nahil A. Sobh—for this impactful work!

    🔍 Overview
    Physics-informed neural networks (PINNs) have been making waves for their ability to solve PDEs without extensive labeled datasets. However, traditional PINNs often face challenges in accuracy, especially when dealing with complex material behaviors like hyperelasticity. This paper addresses these issues, pushing the boundaries of PINN performance.

    🚀 Key Contributions
    1. Integration of Multiple Loss Terms: The model incorporates a loss function with multiple components, including total potential energy and strong-form residuals of the governing equations, capturing complex input-output relationships more effectively.
    2. Dynamic Weighting Scheme: Using a coefficient of variation (CoV) weighting scheme, the model dynamically adjusts the weights of loss terms, ensuring balanced and effective learning across all aspects.
    3. No Data Generation Required: Unlike many data-driven models, this framework eliminates the need for data generation, making it efficient and accessible for real-world applications.
    4. Improved High-Gradient Performance: The enhanced framework shines in high-gradient regions, crucial for accurately modeling materials under stress.
    5. Advanced Techniques: Techniques like Gaussian Fourier feature mapping and curriculum learning further improve the neural network’s ability to learn and generalize complex functions.

    🔧 Applications
    The insights from this paper have far-reaching implications, particularly in:
    Material Science: Modeling and designing hyperelastic materials.
    Engineering: Accurately predicting material behavior under various loading conditions.
    Computational Mechanics: Combining machine learning with physics for efficient simulations.

    This research is a remarkable step in integrating machine learning with physics-based modeling, paving the way for more precise and efficient solutions in engineering and material sciences. Brilliant work! This inspires us to continue exploring the synergy between physics and machine learning.

    📄 Read the paper here: https://lnkd.in/df-sNukV
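
    To make the dynamic weighting idea (contribution 2) concrete, here is a simplified sketch of coefficient-of-variation loss weighting for a multi-term objective in PyTorch. It follows the spirit of the CoV scheme the post describes rather than the authors' exact formulation; the window size and normalization are illustrative choices.

```python
# Simplified sketch of coefficient-of-variation (CoV) loss weighting for a
# multi-term objective in PyTorch. Illustrative only; not the authors' exact
# scheme. Window size and normalization are arbitrary choices.
import torch

class CoVWeighter:
    """Weight each loss term by the coefficient of variation (std / mean)
    of its recent history, then normalize the weights to sum to one."""

    def __init__(self, n_terms, window=50):
        self.history = [[] for _ in range(n_terms)]
        self.window = window

    def __call__(self, losses):
        weights = []
        for hist, loss in zip(self.history, losses):
            hist.append(loss.item())
            if len(hist) > self.window:
                hist.pop(0)
            mean = sum(hist) / len(hist)
            std = (sum((v - mean) ** 2 for v in hist) / len(hist)) ** 0.5
            weights.append(std / (mean + 1e-12) + 1e-12)
        total = sum(weights)
        return sum((w / total) * l for w, l in zip(weights, losses))

# Usage inside a training loop (energy_loss and residual_loss are tensors):
#   weighter = CoVWeighter(n_terms=2)
#   loss = weighter([energy_loss, residual_loss])
#   loss.backward()
```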

  • View profile for Gabe Gomes

    Research Scientist, X, The Moonshot Factory (fka Google X) | AI for Autonomous Science | Assistant Professor (on leave), Carnegie Mellon University

    4,892 followers

    I’m thrilled to share our latest publication in Nature Machine Intelligence: “Advancing molecular machine learning representations with stereoelectronics-infused molecular graphs” (link to paper in the comments).

    Led by Ph.D. student Daniil Boiko, our work introduces stereoelectronics-infused molecular graphs (SIMGs), a novel molecular representation that explicitly incorporates stereoelectronic effects: stabilizing electronic interactions maximized by specific geometric arrangements through favorable orbital overlap.

    Traditional molecular representations (e.g., molecular graphs, fingerprints, SMILES strings) often overlook critical quantum-chemical details. SIMGs explicitly address this limitation by embedding orbital interactions, significantly enhancing molecular property predictions. For example, using SIMGs improved the prediction of HOMO-LUMO gaps substantially compared to traditional methods. Models trained on small molecules can accurately predict orbital interactions in much larger systems like proteins, achieving orders of magnitude speed improvement over traditional DFT+NBO calculations.

    Recognizing that directly computing these orbital interactions is computationally intensive, we developed SIMG*, a machine-learned approximation enabling rapid predictions. This methodology enables stereoelectronically enhanced analysis of macromolecular systems where traditional quantum-chemical calculations are computationally prohibitive, facilitating systematic investigation of the stereoelectronic interactions governing protein stability and reactivity.

    To facilitate broader access, we’ve launched an interactive web application where researchers can easily explore stereoelectronic information in their molecules: https://simg.cheme.cmu.edu.

    This work exemplifies our group’s mission to revolutionize chemical discovery by integrating quantum chemistry, machine learning, and automation. At the Gomes group, we’re committed to developing intelligent systems that transform how we design molecules, materials, and reactions: from foundational representations like SIMGs to autonomous agents capable of planning and executing experiments. Our goal is to accelerate innovation across domains, from (bio-, organo-)catalysis to materials science.

    Great work by my trainees Daniil Boiko and Thiago Reschützegger, along with our collaborators + great friends, Benjamin Sanchez-Lengeling (University of Toronto, Google DeepMind) and co-corresponding author Samuel Blau (Lawrence Berkeley National Laboratory).

    #MachineLearning #QuantumChemistry #MolecularModeling #StereoelectronicEffects
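
    For context on what a plain molecular graph contains before any stereoelectronic enrichment, the sketch below builds a basic atom/bond graph with RDKit. SIMGs add orbital-interaction nodes and features on top of a representation like this; the snippet shows only the baseline construction, not the SIMG pipeline.

```python
# Sketch: a plain molecular graph with RDKit (node = atom, edge = bond).
# SIMGs enrich such graphs with orbital/stereoelectronic features; this is
# only the baseline construction, shown for context.
from rdkit import Chem

mol = Chem.MolFromSmiles("CC(=O)OC1=CC=CC=C1C(=O)O")  # aspirin

nodes = [(a.GetIdx(), a.GetSymbol(), a.GetTotalNumHs(), a.GetIsAromatic())
         for a in mol.GetAtoms()]
edges = [(b.GetBeginAtomIdx(), b.GetEndAtomIdx(), str(b.GetBondType()))
         for b in mol.GetBonds()]

print(f"{len(nodes)} atoms, {len(edges)} bonds")
print(nodes[:3])
print(edges[:3])
```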
