🚀 Open-Sourcing a New Paradigm in Computational Chemistry: The Kyjovský Topological Engine

For decades, material design has relied on solving complex Schrödinger equations, probabilistic fermion clouds, and orbital hybridizations. This works, but the computational cost (FLOPs) of simulating complex molecules is massive. What if we could bypass these heavy probability integrals entirely using pure wave geometry?

I am thrilled to announce that the Python Minimum Viable Product (MVP) of the Kyjovský Topological Model is now live and open-source on GitHub! 💻

Instead of treating the vacuum as "empty space" and electrons as orbiting point particles, this engine models a fully structured 55-point Higgs lattice (Mackay icosahedron) and treats molecular stability strictly as a topological problem. Here is how the engine recalculates physics:

⚛️ Nucleon Volume: one nucleon (proton or neutron) occupies exactly 3.25 vacuum clusters.
🌊 The Wave Electron: the electron is not a physical particle; it is a standing resonance wave generated by the outward-pointing vectors (gluons) of protons.
🔗 Phase Alignment: chemical bonds are computed via simple scalar phase alignment using a "Cluster Index" (K_cc), rather than complex electron repulsion.

Whether it is mathematically proving the directional sp^3 valence nodes of carbon as an overcrowded spatial layer, or predicting the absolute resonance of gallium nitride (GaN) in milliseconds, this heuristic algorithm reduces chemistry to elegant integer mathematics.

If you are a computational chemist, a materials science engineer, or an AI developer looking for exponentially faster algorithmic heuristics for physical simulations, I invite you to explore the code. Clone the repo, run the simulations, and let's rethink physics from the ground up:

🔗 https://lnkd.in/dUxdKtjY

Author: Peter Kyjovsky
ORCID iD: 0009-0008-3806-1964

#ComputationalChemistry #MaterialsScience #QuantumPhysics #Python #OpenSource #Simulation #DeepTech #Innovation #Physics
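A scalar "phase alignment" rule of the kind described above could look something like the following minimal Python sketch. Everything here is my own illustration: the K_CC values, the function names, and the scoring rule are hypothetical stand-ins, not the repo's actual API.

```python
# Hypothetical sketch of "phase alignment" bonding: each element gets a
# scalar Cluster Index K_cc, and a candidate bond is scored by how
# closely two indices align. Values and rule below are illustrative only.

# Made-up Cluster Index table, NOT taken from the repository.
K_CC = {"H": 1.0, "C": 4.0, "N": 5.0, "Ga": 3.0}

def phase_alignment(a: str, b: str) -> float:
    """Return an alignment score in [0, 1]; 1.0 means perfect phase match."""
    ka, kb = K_CC[a], K_CC[b]
    # Score decays with the relative mismatch of the two indices.
    return 1.0 / (1.0 + abs(ka - kb) / max(ka, kb))

def bonds(pairs):
    """Score a list of element pairs and sort by alignment, best first."""
    return sorted(((a, b, phase_alignment(a, b)) for a, b in pairs),
                  key=lambda t: -t[2])

if __name__ == "__main__":
    for a, b, s in bonds([("C", "C"), ("C", "N"), ("H", "C")]):
        print(f"{a}-{b}: {s:.3f}")
```

The point of the sketch is only the cost profile: scoring a bond is a dictionary lookup and a few arithmetic operations, with no integrals anywhere.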
Peter Kyjovsky’s Post
More Relevant Posts
-
Reproducibility is becoming a central challenge in computational materials science, especially as workflows grow more complex, data-driven, and distributed across HPC infrastructures.

Glad to share this new collective work from the DIAMOND digital platform of the PEPR DIADEM: "Reproducible Container Solutions for Codes and Workflows in Materials Science"
https://lnkd.in/eDhfZ_9M

This paper presents robust (and elegant) solutions combining GNU Guix and Apptainer to build fully declarative, reproducible software environments. Beyond the technical advance, we're glad to have been able to integrate complete workflows, from ab initio simulations to machine learning and large-scale data analysis, in a consistent and portable framework. 🚀 ✨ 💪 ⚡

Kudos to all the contributors involved in this collective effort, an excellent example of what coordinated, community-driven initiatives can achieve.

#Reproducibility #HPC #MaterialsScience #MachineLearning #OpenScience #DIAMOND #DIADEM

Dylan Bissuel, Léo Orveillon, Benjamin Arrondeau, João Paulo Almeida de Mendonça, Jonathan DAUBIN, Irina Piazza, PhD, Martin Uhrin, Etienne Polack, Akshay Krishna Ammothum Kandy, David Martin-Calle, Jonathan C., Aadhityan A, Lorenzo Paulatto, Pierre-Antoine Bouttier, Marie-Ingrid Richard, Thierry Deutsch, David Rodney, A. Marco Saitta, Noel Jakse
-
What if the "fifth dimension" is not space at all… but computation?

I've been working on a research idea that reframes a long-standing concept in physics in a completely different way. Instead of treating higher dimensions as abstract geometric spaces, I model the fifth dimension as a computational space of all possible futures.

In this framework:
- Every event is not a single outcome but a branching structure of probabilities
- Reality becomes a graph of diverging timelines

Using a graph-based probabilistic simulation, I explore how systems evolve when every decision point generates multiple possible states, effectively modeling "alternative realities" as computational structures.

This creates an interesting bridge between:
- Physics (higher-dimensional theories)
- Probability theory (stochastic systems)
- Computer science (graph-based modeling)

The surprising part: even simple stochastic systems begin to show complex divergence patterns when viewed through this lens.

📄 Full research paper: https://lnkd.in/eAXuX_yD

I'd genuinely love feedback from anyone working in physics, CS, or mathematical modeling, especially if you see flaws, extensions, or alternative interpretations.
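The branching structure described above can be sketched in a few lines of Python. This is a generic illustration of a graph of diverging futures, assuming a made-up transition rule (each state spawns two perturbed children); it is not the paper's actual model.

```python
import random
from collections import defaultdict

# Minimal sketch: each state branches into several successor states, so
# a run of the system is a path through a growing graph of "alternative
# futures". The transition rule here is invented for illustration.

def branch(state: int, rng: random.Random, n_children: int = 2):
    """Generate successor states; each child perturbs the parent by ±1."""
    return [state + rng.choice([-1, 1]) for _ in range(n_children)]

def unfold(depth: int, seed: int = 0):
    """Breadth-first unfolding of all futures from state 0.

    Returns {level: list of states}; duplicates are kept so that a
    state's multiplicity approximates its probability weight.
    """
    rng = random.Random(seed)
    levels = defaultdict(list, {0: [0]})
    for d in range(depth):
        for s in levels[d]:
            levels[d + 1].extend(branch(s, rng))
    return levels

if __name__ == "__main__":
    futures = unfold(depth=4)
    print({d: len(states) for d, states in futures.items()})
```

Even this toy rule shows the combinatorial point: the number of histories doubles per level, so any analysis has to work with distributions over states rather than individual trajectories.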
-
More than a decade ago, I started asking a question: can universes with different physical parameters sustain stable structures? What began as independent research in mathematical cosmology eventually needed proper tooling: reproducible parameter explorations, immutable state, traceable provenance. So I built it. Dup Shimati is an open-source computational framework for researchers exploring compactification topologies and coupling constant parameter spaces. Rust core engine, Go service layer, Kotlin client tooling. Append-only encrypted state so collaborating researchers can share and extend results with full provenance. The repo is public now. If you work in theoretical physics, mathematical cosmology, or adjacent fields: this is for you. https://lnkd.in/eyuU9CDj
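The append-only, provenance-preserving state the post mentions can be illustrated with a small hash-chained log. This sketch is my own, in Python rather than the project's Rust/Go/Kotlin stack, and it omits encryption; it only shows the append-only idea, not Dup Shimati's actual format.

```python
import hashlib
import json

# Sketch of an append-only log with provenance: each record hashes its
# predecessor, so shared results can be extended but not silently
# rewritten. Illustrative only; not the project's real data model.

class ProvenanceLog:
    def __init__(self):
        self.records = []

    def append(self, payload: dict) -> str:
        prev = self.records[-1]["hash"] if self.records else "genesis"
        body = json.dumps({"prev": prev, "payload": payload}, sort_keys=True)
        digest = hashlib.sha256(body.encode()).hexdigest()
        self.records.append({"prev": prev, "payload": payload, "hash": digest})
        return digest

    def verify(self) -> bool:
        """Recompute every hash; any tampering breaks the chain."""
        prev = "genesis"
        for rec in self.records:
            body = json.dumps({"prev": prev, "payload": rec["payload"]},
                              sort_keys=True)
            if rec["prev"] != prev or \
               hashlib.sha256(body.encode()).hexdigest() != rec["hash"]:
                return False
            prev = rec["hash"]
        return True

if __name__ == "__main__":
    log = ProvenanceLog()
    log.append({"params": {"coupling": 0.1}, "stable": True})
    log.append({"params": {"coupling": 0.2}, "stable": False})
    print("chain valid:", log.verify())
```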
-
[PDF] Transactions on Computational Systems Biology XII: Special Issue on Modeling Methodologies
Rainer Breitling, Robin A. Donaldson (auth.), Corrado Priami, Rainer Breitling, David Gilbert, Monika Heiner, Adelinde M. Uhrmacher (eds.)
https://lnkd.in/eFgfFrFB

The LNCS journal Transactions on Computational Systems Biology is devoted to inter- and multidisciplinary research in the fields of computer science and the life sciences, and supports a paradigmatic shift in techniques from computer and information science to cope with the new challenges arising from a systems-oriented view of biological phenomena. This special issue focuses on modeling methodologies. It starts with a position paper by the guest editors, entitled "Biomodel Engineering - from Structure to Behavior", followed by technical contributions covering a broad range of modeling methodologies. Two papers focus on new modeling languages; these are followed by a case study demonstrating the value of the qualitative network approach. With the remaining three contributions, the special issue leaves qualitative modeling and moves toward quantitative programming with the BlenX language and the application of more theoretical process calculi.

#simple #Biology #CorradoPriami #DavidGilbert #MonikaHeiner #RainerBreitling #AdelindeMUhrmacher #RobinADonaldson
https://lnkd.in/egppkBWa
-
CAE engineers are a unique breed. We are border riders between physics and IT. For us, HPC is a close friend, and we have to understand the infrastructure to do our day job. Every CAE engineer has a personal library of scripts, whether in Python or Bash (and yes, Bash is a programming language!), for populating jobs, monitoring queues, killing processes matching specific patterns, and aggregating outputs into CSVs. We've always been the masters of our own automation.

Recently, as I've gone deeper into Scientific Machine Learning (SciML), VS Code has become my bread and butter. While I admired the context-awareness of tools like Claude Code, I recently learned during a GitHub Copilot Immersive Session that I could achieve that same "agentic" workflow right inside VS Code using copilot-instructions.md. I defined my bsub wrapper commands and workspace structure (source, data farm, runs) in the instructions file. It took more effort than I expected.

Then yesterday, I finished the instructions file and gave Copilot a small challenge: "Grab this and that data, create a template YAML pointing to them for training, and populate an ablation study with these specific parameters." Without typing a single bsub command, I had 16 jobs active in the queue in seconds. No more "for filename in *.YAML; do vim -e "%s/.../g | wq"; submit_job.sh --config $filename --mode train; done". It then showed a summary of the submitted jobs, their LSF job IDs with descriptions, and the path to each run, which I asked it to save as Markdown and CSV. It is building a complete logbook.

We are moving away from being manual script-writers and toward being Workflow Orchestrators. I'll be spending my weekend with VS Code and Claude, building a few agents to replicate a viral AI assistant I've been following: https://lnkd.in/dEjh-W7i

The engineering grammar is changing. Productivity is skyrocketing. Our engineering mindset needs to evolve to keep pace with the speed of AI.
#SciML #CAE #AgenticAI #HPC #EngineeringAutomation #LLMs #GMRnD #iWorkForGM
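The "populate an ablation study" step described above, done by hand, amounts to rendering one config per parameter combination and handing each to the scheduler. A minimal Python sketch follows; the template fields, paths, and the `bsub` invocation are placeholders, not the author's actual wrapper.

```python
import itertools
import pathlib
import subprocess

# Sketch: render one YAML per (lr, batch) combination, optionally submit
# each to the scheduler. "bsub" usage and all paths are hypothetical.

TEMPLATE = """\
data: {data}
lr: {lr}
batch_size: {batch}
mode: train
"""

def populate(run_dir: str, data: str, lrs, batches, submit=False):
    """Write one config per parameter combination; optionally submit."""
    out = pathlib.Path(run_dir)
    out.mkdir(parents=True, exist_ok=True)
    configs = []
    for lr, batch in itertools.product(lrs, batches):
        cfg = out / f"run_lr{lr}_b{batch}.yaml"
        cfg.write_text(TEMPLATE.format(data=data, lr=lr, batch=batch))
        configs.append(cfg)
        if submit:  # placeholder scheduler call, disabled by default
            subprocess.run(["bsub", "submit_job.sh", "--config", str(cfg)])
    return configs

if __name__ == "__main__":
    cfgs = populate("runs/ablation", "data_farm/set1.h5",
                    lrs=[1e-3, 1e-4], batches=[32, 64])
    print(f"wrote {len(cfgs)} configs")
```

The agentic version in the post generates exactly this kind of loop from a natural-language request, which is why the instructions file (wrapper commands, workspace layout) matters so much.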
-
Today I saw the following video: https://lnkd.in/gduJuuC8

If this idea is "new" to anyone using mathematics in any way, they are severely deficient in differential equations and numerical methods training. I learnt about various predictor-corrector methods for numerically approximating solutions of differential equations and systems of differential equations as a freshman in chemical engineering, in a course taught by an industrial engineer that included an introduction to numerical methods and Fortran programming. By the time I was a senior, such methods, like the Adams-Moulton predictor-corrector scheme, were routinely referenced and used in exercises and examinations.

Here's a hint about hype: if someone claims that something being done in their industry now is so strongly connected to well-developed tools in your own industry that you can recognize the philosophical connections right off the bat to textbook exercises you did as an undergraduate, then their industry is simply lagging behind yours in sophistication, and their report of a fundamental breakthrough in mathematical modeling is severely overblown.

At its base, this idea is philosophically as old as humanity, and mathematicians have used it in numerical modeling for millennia, but the tools available for the prediction step have always needed enlargement, deeper and more extensive development, and consistent, tenacious fine-tuning. It is also very naturally similar to non-monotonic reasoning and the philosophy behind belief revision, which is often discussed in connection with Bayesian inference in statistical analysis. If anyone calls this "new", be aware that the claim is merely hype. What is "new" here is that someone in machine learning had a moment of clarity about the scientific method and its analogs in inference systems, which have been around a very long time.

The notion of "active inference" is not in any way a physics notion, but it gets applied in various forms and leads to very important results in physics, and it will lead to important results in machine learning as well (indeed it already has: in statistical machine learning, Bayesian inference is already "a thing", coupled with belief revision à la Popper, Dempster, and Shafer). So anyone claiming that "active inference" is fundamentally new is cracked. There may be "newish" ways it is being applied, but the most fundamentally "new" thing about it is the rhetoric, in the sense of giving it the (fitting, I'll admit) name "active inference" or "adaptive inference". Other related notions include error-correcting codes, etc.
This physics idea might be the next generation of machine learning
https://www.youtube.com/
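For readers who have not met predictor-corrector methods, here is the textbook two-step Adams-Bashforth/Adams-Moulton scheme the post alludes to, applied to the test equation y' = -y (exact solution e^(-t)). This is the standard construction, not anything from the video.

```python
import math

# Two-step predictor-corrector: predict the next value with the explicit
# Adams-Bashforth 2 formula, then correct it with the implicit
# Adams-Moulton 2 (trapezoidal) formula evaluated at the prediction.

def f(t, y):
    """Right-hand side of the test ODE y' = -y."""
    return -y

def abm2(y0, t0, t1, n):
    """Integrate from t0 to t1 in n steps; Euler bootstrap for step 1."""
    h = (t1 - t0) / n
    t_prev, y_prev = t0, y0
    t, y = t0 + h, y0 + h * f(t0, y0)   # one Euler step to start
    for _ in range(n - 1):
        # Predict (Adams-Bashforth 2):
        y_pred = y + h * (1.5 * f(t, y) - 0.5 * f(t_prev, y_prev))
        # Correct (Adams-Moulton 2, the trapezoidal rule):
        y_corr = y + 0.5 * h * (f(t, y) + f(t + h, y_pred))
        t_prev, y_prev = t, y
        t, y = t + h, y_corr
    return y

if __name__ == "__main__":
    print(abm2(1.0, 0.0, 1.0, 1000), math.exp(-1))
```

The "guess, then revise against the model" structure is exactly the predict/correct split, which is the philosophical connection the post is making.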
-
Dear network,

Attention mechanisms are at the heart of modern language models based on the Transformer architecture, allowing these models to encode information through the pairwise interaction of tokens. Unfortunately, these methods suffer from a computational bottleneck that scales quadratically with sentence length.

I am pleased to share a recent preprint in which we provide an overview, from the point of view of applied mathematics, of approximation methods aimed at reducing this bottleneck, as well as alternative models that do not suffer from the shortcomings of regular attention. The methods we study range from clustering, sparsity, and kernel methods to tensor-based approaches. We introduce a taxonomy of these existing techniques to present these developments within a unified mathematical framework, highlighting opportunities for further contributions from numerical linear algebra to the design of scalable attention mechanisms.

This preprint is the result of a working group, which I had the opportunity to co-lead alongside Laura Grigori (lead) and Alice Cortinovis (co-lead), on Randomized Numerical Linear Algebra for the Transformer architecture, originating from the IPAM "Randomized Numerical Linear Algebra (RNLA)" Research Collaboration Workshop in August 2025. I am proud to share the first outcome of our collaboration:

Attention Mechanisms Through the Lens of Numerical Methods: Approximation Methods and Alternative Formulations
https://lnkd.in/evv7HG58

For an introduction to how attention mechanisms work, you can also find the introductory document I wrote for the IPAM workshop here:
Understanding Transformers and Attention Mechanisms: An Introduction for Applied Mathematicians
https://lnkd.in/euyyiGBK

I would also like to extend a sincere thank you to the organizers of the workshop and to all my co-authors and collaborators who made this work possible: Alice Cortinovis, Yijun Dong, Diana Halikias, Anna Ma, Fabio Matti, Deanna Needell, Katherine J. Pearce, Elizaveta Rebrova, Disha Shur, Rudi Smith, Hai-Xiao Wang, and Laura Grigori.
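To make the quadratic bottleneck and one of the surveyed remedies concrete, here is a small NumPy sketch contrasting standard softmax attention with kernelized "linear" attention, using the elu(x)+1 feature map popularized by linear-transformer work. This is a generic illustration of the kernel-method family, not code from the preprint.

```python
import numpy as np

# Softmax attention forms an n x n score matrix: O(n^2) in sequence
# length n. Replacing exp(q·k) with phi(q)·phi(k) lets associativity
# reorder the product as phi(Q) @ (phi(K)^T V): O(n d^2), no n x n matrix.

def softmax_attention(Q, K, V):
    """Standard attention; materializes the full n x n weight matrix."""
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ V

def phi(X):
    """Positive feature map elu(x) + 1: x + 1 for x > 0, else exp(x)."""
    return np.where(X > 0, X + 1.0, np.exp(X))

def linear_attention(Q, K, V):
    """Kernelized attention; cost is linear in sequence length."""
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                # (d, d_v) summary, independent of n
    Z = Qf @ Kf.sum(axis=0)      # per-query normalizer, shape (n,)
    return (Qf @ KV) / Z[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q, K, V = (rng.standard_normal((6, 4)) for _ in range(3))
    print(np.round(softmax_attention(Q, K, V), 3))
    print(np.round(linear_attention(Q, K, V), 3))
```

Both variants produce row-stochastic weightings of V; they differ in the kernel they implicitly use, which is exactly the approximation-quality question a numerical-methods taxonomy can organize.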
-
We usually think of computation as a single path: input → steps → output. But what if that's only part of the story?

In a new paper, I explore the idea that computation is better understood as a distribution over many possible ways of computing the same result. Some of those "histories" are more coherent than others. By quantifying this with a coherence defect, we can treat computation like a nonequilibrium system over paths.

Two surprising things emerge:
- Different high-level strategies (routes) are not equivalent; they have distinct statistical signatures.
- Once you remove that structural bias, you see genuine fluctuations within computation itself.

This suggests computation has a hidden layer: not just what gets computed, but how many different ways it can happen, and how those ways vary.

Curious what people think, especially across CS, physics, and math.

https://lnkd.in/d4hTNdk8
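The "same result, different histories" idea can be illustrated with a toy example: two routes that compute an identical sum but leave different statistical signatures. The noisy "cost" below is my own stand-in for per-history statistics; the model is illustrative, not the paper's.

```python
import random
from statistics import mean

# Two high-level routes to the same result (a sum). Each elementary
# addition incurs a noisy unit cost, so repeated runs of a route give a
# distribution over histories; the two routes have distinct signatures.

def route_sequential(xs, rng):
    """Left-to-right summation: len(xs) additions."""
    total, cost = 0, 0.0
    for x in xs:
        total += x
        cost += 1.0 + rng.gauss(0, 0.1)
    return total, cost

def route_pairwise(xs, rng):
    """Pairwise (tree) summation: len(xs) - 1 additions, fewer levels."""
    cost = 0.0
    while len(xs) > 1:
        nxt = []
        for i in range(0, len(xs) - 1, 2):
            nxt.append(xs[i] + xs[i + 1])
            cost += 1.0 + rng.gauss(0, 0.1)
        if len(xs) % 2:
            nxt.append(xs[-1])
        xs = nxt
    return xs[0], cost

if __name__ == "__main__":
    rng = random.Random(1)
    data = list(range(16))
    runs_a = [route_sequential(data, rng) for _ in range(200)]
    runs_b = [route_pairwise(list(data), rng) for _ in range(200)]
    # Same answer along every history, distinct cost distributions:
    print(mean(c for _, c in runs_a), mean(c for _, c in runs_b))
```

Separating the between-route difference (a structural bias) from the within-route spread (fluctuations over histories) mirrors the decomposition the post describes.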
-
To my friends: please share this with all those you know. Here is another reviewer's comment.

The mathematical foundation of the VINES Unified Field Theory is presented as rigorous and observationally constrained, distinguishing it from earlier speculative unification attempts like Lisi's E8 theory. The VINES model is formulated as a 5D warped AdS framework derived from Type IIA string theory, with 19 parameters (5 free, 14 fixed) calibrated using Planck 2023, ATLAS/CMS 2023, XENONnT, SNO 2024, and DESI mock data. It generates testable predictions for CMB non-Gaussianity, KK gravitons at 1.6 TeV, the Hubble constant (H₀ = 71.5 ± 0.7 km/s/Mpc), and the neutrino CP phase, with validation expected by 2035 via CMB-S4, the LHC, and LISA. Python simulations using CLASS, GRChombo, and micrOMEGAs confirm internal consistency, and the theory claims to resolve the string landscape to 3 vacua via flux stabilization. In contrast, Garrett Lisi's E8 theory, while mathematically elegant, has been critically challenged, notably by Distler and Garibaldi (2009), who proved it cannot embed the Standard Model without unphysical antigenerations, and it remains largely unaccepted in the mainstream physics community.

Synthesis: VIWDP - A Unified Path Forward

The Vines-Inspired Warp Drive Physics (VIWDP) framework unifies the following components:
- VINES Model: provides the unified field theory foundation with empirical predictions
- E₈ F-Theory: offers geometric structure for force unification and spin emergence
- Nacelle Topology: enables modular, tunable warp field engineering
- Casimir Effect: supplies a laboratory-verified negative energy mechanism
- Fusion Power: delivers a scalable energy source for warp coil activation

This framework transitions warp drive research from purely mathematical speculation to a multi-disciplinary program spanning quantum field theory, general relativity, nanofabrication, and fusion engineering.
-
[PDF] High Performance Computational Science and Engineering: IFIP TC5 Workshop on High Performance Computational Science and Engineering (HPCSE), World ... in Information and Communication Technology)
Michael K. Ng, Andrei Doncescu, Laurence T. Yang, Tau Leng
https://lnkd.in/e7YdEUTJ

Proceedings of the International Symposium on High Performance Computational Science and Engineering 2004 (IFIP World Computer Congress) is an essential reference for both academic and professional researchers in the field of computational science and engineering. Computational science and engineering is increasingly becoming an emerging and promising discipline shaping future research and development activities in academia and industry, ranging from engineering, science, finance, and economics to the arts and humanitarian fields. New challenges lie in the modeling of complex systems, sophisticated algorithms, advanced scientific and engineering computing, and associated (multi-disciplinary) problem-solving environments. The papers presented in this volume were specially selected to address the most up-to-date ideas, results, work-in-progress, and research experience in the area of high-performance computational techniques for science and engineering applications. This state-of-the-art volume presents the proceedings of the International Symposium on High Performance Computational Science and Engineering, held in conjunction with the IFIP World Computer Congress, August 2004, in Toulouse, France. The collection will be important not only for computational science and engineering experts and researchers but for all teachers and administrators interested in high-performance computational techniques.