[PDF] High Performance Computational Science and Engineering: IFIP TC5 Workshop on High Performance Computational Science and Engineering (HPCSE), World ... in Information and Communication Technology) Michael K. Ng, Andrei Doncescu, Laurence T. Yang, Tau Leng https://lnkd.in/e7YdEUTJ Proceedings of the International Symposium on High Performance Computational Science and Engineering 2004 (IFIP World Computer Congress) is an essential reference for academic and professional researchers in the field of computational science and engineering. Computational science and engineering is an emerging and promising discipline that is shaping research and development activities in academia and industry across engineering, science, finance, economics, the arts, and the humanities. New challenges lie in the modeling of complex systems, sophisticated algorithms, advanced scientific and engineering computing, and the associated multi-disciplinary problem-solving environments. The papers presented in this volume were selected to address the most up-to-date ideas, results, work in progress, and research experience in the area of high performance computational techniques for science and engineering applications. This state-of-the-art volume presents the proceedings of the International Symposium on High Performance Computational Science and Engineering, held in conjunction with the IFIP World Computer Congress, August 2004, in Toulouse, France. The collection will be important not only for computational science and engineering experts and researchers but for all teachers and administrators interested in high performance computational techniques.
High Performance Computational Science and Engineering Proceedings
-
Numerical analysis, scientific computing, or computational science? A short essay written in 2009 by David Bindel, professor of computer science at Cornell University. (I came upon this after a brief discussion with Gemini.) https://lnkd.in/eChwTn5g Bindel writes: "When I worry about cache architecture, or when I parallelize numerical methods, or when I build little tools to automatically generate parts of my scientific codes, I work on scientific computing." But when I read this, I hear computer science, not scientific computing, and for me they are two different things.
-
What if the “fifth dimension” is not space at all… but computation? I’ve been working on a research idea that reframes a long-standing concept in physics in a completely different way. Instead of treating higher dimensions as abstract geometric spaces, I model the fifth dimension as a computational space of all possible futures. In this framework:
- Every event is not a single outcome but a branching structure of probabilities
- Reality becomes a graph of diverging timelines
Using a graph-based probabilistic simulation, I explore how systems evolve when every decision point generates multiple possible states, effectively modeling “alternative realities” as computational structures. This creates an interesting bridge between:
- Physics (higher-dimensional theories)
- Probability theory (stochastic systems)
- Computer science (graph-based modeling)
The surprising part: even simple stochastic systems begin to show complex divergence patterns when viewed through this lens. 📄 Full research paper: https://lnkd.in/eAXuX_yD I’d genuinely love feedback from anyone working in physics, CS, or mathematical modeling, especially if you see flaws, extensions, or alternative interpretations.
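The branching-timeline idea above can be sketched computationally. The following is a minimal illustration, not code from the linked paper: each state spawns a fixed number of noisy successor states, so the reachable states form a tree, the simplest case of a divergence graph. All names (`branch`, `build_timeline_graph`) and the Gaussian-noise transition rule are my own illustrative assumptions.

```python
import random

def branch(state, rng, n_branches=2, noise=0.1):
    """Generate the possible successor states of one decision point."""
    return [state + rng.gauss(0.0, noise) + b for b in range(n_branches)]

def build_timeline_graph(root=0.0, depth=4, n_branches=2, seed=42):
    """Expand every state into its possible futures, level by level.

    Returns (graph, leaves): graph maps each interior state to its list
    of successors, i.e. a DAG of diverging timelines; leaves are the
    terminal states after `depth` branchings."""
    rng = random.Random(seed)
    graph = {}
    frontier = [root]
    for _ in range(depth):
        next_frontier = []
        for state in frontier:
            children = branch(state, rng, n_branches)
            graph[state] = children
            next_frontier.extend(children)
        frontier = next_frontier
    return graph, frontier

graph, leaves = build_timeline_graph()
print(f"{len(leaves)} leaf timelines after 4 branchings")  # 2**4 = 16
```

Even this toy version shows the exponential fan-out the post alludes to: the number of distinct "futures" doubles at every decision point.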
-
Dear network, Attention mechanisms are at the heart of modern language models based on the Transformer architecture, allowing these models to encode information through pairwise interactions of tokens. Unfortunately, these methods suffer from a computational bottleneck that scales quadratically with sequence length. I am pleased to share a recent preprint where we provide an overview, from the point of view of applied mathematics, of different approximation methods aimed at reducing this bottleneck, as well as alternative models that do not suffer from the shortcomings of regular attention. The methods we study range from clustering, sparsity, and kernel methods to tensor-based approaches. We introduce a taxonomy of these existing techniques to present these developments within a unified mathematical framework, highlighting opportunities for further contributions from numerical linear algebra to the design of scalable attention mechanisms. This preprint is the result of a working group, which I had the opportunity to co-lead alongside Laura Grigori (lead) and Alice Cortinovis (co-lead), on randomized numerical linear algebra for the Transformer architecture, which originated from the IPAM "Randomized Numerical Linear Algebra (RNLA)" Research Collaboration Workshop in August 2025. I am proud to share the first outcome of our collaboration: Attention Mechanisms Through the Lens of Numerical Methods: Approximation Methods and Alternative Formulations https://lnkd.in/evv7HG58 For an introduction to how attention mechanisms work, you can also find the introductory document I wrote for the IPAM workshop here: Understanding Transformers and Attention Mechanisms: An Introduction for Applied Mathematicians https://lnkd.in/euyyiGBK I would also like to extend a sincere thank you to the organizers of the workshop and to all my co-authors and collaborators who made this work possible: Alice Cortinovis, Yijun Dong, Diana Halikias, Anna Ma, Fabio Matti, Deanna Needell, Katherine J. Pearce, Elizaveta Rebrova, Disha Shur, Rudi Smith, Hai-Xiao Wang, and Laura Grigori.
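For readers who want a concrete picture of the bottleneck and of one family of remedies the preprint surveys, here is a minimal NumPy sketch contrasting standard softmax attention (the (n, n) score matrix is the quadratic cost) with a kernel-based linearization in the style of feature-map methods. The feature map `phi` below is one illustrative choice; it is not a method taken from the paper.

```python
import numpy as np

def full_attention(Q, K, V):
    """Standard softmax attention: materializes an (n, n) score matrix,
    hence O(n^2) time and memory in sequence length n."""
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0) + 1e-6):
    """Kernel-based approximation: replace softmax(QK^T) with
    phi(Q) phi(K)^T, so the product reassociates as phi(Q) (phi(K)^T V)
    and the cost becomes linear in sequence length."""
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                     # (d, d_v), computed once
    Z = Qp @ Kp.sum(axis=0)           # per-query normalizer, shape (n,)
    return (Qp @ KV) / Z[:, None]

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out_full = full_attention(Q, K, V)
out_lin = linear_attention(Q, K, V)
print(out_full.shape, out_lin.shape)  # both (8, 4)
```

The key design point is the reassociation: `phi(Q) @ (phi(K).T @ V)` never forms an (n, n) matrix, which is exactly the trade (approximation error for scalability) that the surveyed methods make in different ways.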
-
The Still Wave: Solving the Equation of Civilization. We are currently attempting to build a Type 1 Civilization on a "leaky" foundation. In every domain, from the heat-dissipating circuits in our pockets to the volatile oscillations of global markets, we face the same fundamental mathematical crisis: The Problem of Topological Anchoring in High-Entropy Flows. In pure mathematics, the challenge is simple yet profound: how do you map a discrete, stable symbol (a "1" or a "0") onto a continuous, transient field (a wave of light or a market trend) without the state decaying into entropy? I call this the Discrete-Continuous Correspondence Hypothesis (DCCH). It is not just a theory of computing; it is a framework for a "Theory of Everything" that bridges the logic of the mind with the physics of the field. Why this matters now:
- Computing: This is the "Holy Grail" of the Photonic Latch. If we can anchor light waves into stable topological states, we move beyond the energy-expensive "von Neumann Bottleneck" into room-temperature, general-purpose photonic computing.
- Finance & Economics: Markets are high-entropy flows. By identifying Invariant Attractors within these flows, we can design financial systems that are inherently stable, resisting the "entropy" of crashes while maintaining the fluidity of global trade.
- Civilization: To move from a Type 1 to a more advanced civilization, we must stop fighting entropy and start anchoring within it.
The Vision: The next stage of human history isn't just "faster"; it is clearer. In my research, I envision a civilization that reflects the ancient promise of a "Sea of Glass, clear as crystal." This is a state of perfect, lossless information, where the "Light" (energy) and the "Word" (information) are finally one. A world without the "heat" of friction, only the "clarity" of stable, infinite state. I am currently seeking to transition my professional engineering background into deep-tech scholarship to formalize the DCCH manuscript. I am particularly drawn to the historical mathematical rigor of Trinity College Dublin and the innovative computational ecosystems in Vancouver. The goal isn't just to build a better computer. It is to solve the problem of the "Still Wave" and unlock the next era of human potential. #SystemsThinking #MathematicalPhysics #DCCH #PhotonicComputing #FinTech #TCD #UBC #Type1Civilization #DigitalSovereignty #TheoryOfEverything
-
Happy to share our first exploration into LLM fine-tuning for theoretical physics. In this work, we developed a data pipeline for verifiable QFT problems, both fully synthetic and adapted from existing human-authored problems. Using this data, we compared the performance gains of RL and SFT on small reasoning models, and explored changes in reasoning and error frequencies after fine-tuning.
1. RL vs. SFT performance: The models achieved similar overall performance. SFT held the advantage on in-distribution QFT tasks, whereas RL performed better on out-of-distribution tasks, both inside and outside of QFT.
2. Significant error reduction: Both RL and SFT reduced the errors the models made by approximately 65%. While factual errors saw the most significant reduction, mathematical errors remained prevalent.
3. Changes in reasoning behavior: Beyond raw error counts, we found RL and SFT affect chain-of-thought (CoT) length and backtracking frequency in different ways. Both fine-tuned models increase the length of correct CoTs, but SFT inflates incorrect CoTs beyond the base or RL models. For backtracking, RL backtracks more frequently on correct answers, while both models reduce backtracking on incorrect attempts.
Alongside this work, we release our data pipeline, synthetic QFT training data, and almost 200M tokens of rejection-sampled QFT reasoning traces used for SFT. This work was a collaboration between Zhiqi Gao, Yurii Kvasiuk, Kendrick Smith, Frederic Sala, and Moritz Münchmeyer. Excited to continue exploring this rapidly evolving space! Link: https://lnkd.in/gYGBiCeN
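The post mentions rejection-sampled reasoning traces. The paper's exact pipeline is not described here, but the generic pattern behind the term is simple: sample many candidate solutions and keep only those an automatic verifier accepts. A toy sketch under that assumption (the function names and the toy generator/verifier are illustrative, not from the paper):

```python
import random

def rejection_sample(problem, generate, verify, n_samples=16, rng=None):
    """Keep only model outputs that pass an automatic verifier.

    generate(problem, rng) -> candidate answer (stands in for an LLM call)
    verify(problem, answer) -> bool (stands in for a symbolic/numeric check)
    """
    rng = rng or random.Random(0)
    accepted = []
    for _ in range(n_samples):
        candidate = generate(problem, rng)
        if verify(problem, candidate):
            accepted.append(candidate)
    return accepted

# Toy stand-in: "problems" are integers, a noisy generator guesses the
# square, and the verifier checks it exactly.
problem = 7
generate = lambda p, rng: p * p + rng.choice([0, 0, 1, -1])
verify = lambda p, ans: ans == p * p
traces = rejection_sample(problem, generate, verify)
print(f"kept {len(traces)} of 16 candidates")
```

In the SFT setting described above, the accepted candidates (here, exact squares) would be the reasoning traces retained for training; everything the verifier rejects is discarded.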
-
Reproducibility is becoming a central challenge in computational materials science, especially as workflows grow more complex, data-driven, and distributed across HPC infrastructures. Glad to share this new collective work from the DIAMOND digital platform of the PEPR DIADEM: “Reproducible Container Solutions for Codes and Workflows in Materials Science” https://lnkd.in/eDhfZ_9M This paper presents robust (and elegant) solutions combining GNU Guix and Apptainer to build fully declarative, reproducible software environments. Beyond the technical advance, we're glad to have been able to integrate complete workflows, from ab initio simulations to machine learning and large-scale data analysis, in a consistent and portable framework. 🚀 ✨ 💪 ⚡ Kudos to all the contributors involved in this collective effort, an excellent example of what coordinated, community-driven initiatives can achieve. #Reproducibility #HPC #MaterialsScience #MachineLearning #OpenScience #DIAMOND #DIADEM Dylan Bissuel, Léo Orveillon, Benjamin Arrondeau, João Paulo Almeida de Mendonça, Jonathan DAUBIN, Irina Piazza, PhD, Martin Uhrin, Etienne Polack, Akshay Krishna Ammothum Kandy, David Martin-Calle, Jonathan C., Aadhityan A, Lorenzo Paulatto, Pierre-Antoine Bouttier, Marie-Ingrid Richard, Thierry Deutsch, David Rodney, A. Marco Saitta, Noel Jakse
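For readers unfamiliar with the tooling: one documented way to combine the two systems (not necessarily the exact workflow of the paper) is `guix pack`, which can emit a SquashFS image that Apptainer executes directly. The package list below is an illustrative environment of my own choosing.

```shell
# Build a SquashFS image from a declarative package list.
# (guix pack -f squashfs is documented Guix functionality; these
# particular packages are illustrative, not the paper's stack.)
guix pack -f squashfs bash python python-numpy

# guix pack prints the store path of the resulting image; run it with
# Apptainer (the path below is a placeholder for that printed path):
apptainer exec /gnu/store/...-squashfs-pack.img.squashfs python3 --version
```

Because the package set is declared rather than imperatively installed, the same command rebuilds bit-compatible environments on any machine with the same Guix channels pinned.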
-
[PDF] Transactions on Computational Systems Biology XII: Special Issue on Modeling Methodologies. Rainer Breitling, Robin A. Donaldson (auth.), Corrado Priami, Rainer Breitling, David Gilbert, Monika Heiner, Adelinde M. Uhrmacher (eds.) https://lnkd.in/eFgfFrFB The LNCS journal Transactions on Computational Systems Biology is devoted to inter- and multidisciplinary research in the fields of computer science and the life sciences, and supports a paradigmatic shift in techniques from computer and information science to cope with the new challenges arising from the systems-oriented view of biological phenomena. This special issue of the journal focuses on modeling methodologies. It starts with a position paper by the guest editors, entitled "Biomodel Engineering - from Structure to Behavior," followed by technical contributions covering a broad range of modeling methodologies. Two papers focus on new modeling languages, and these are followed by an article presenting a case study demonstrating the value of the qualitative network approach. With the remaining three contributions, the special issue leaves the area of qualitative modeling and moves toward quantitative programming with the BlenX language and the application of more theoretical process calculi. https://lnkd.in/egppkBWa
-
A computer system has uncovered a fundamental flaw in a widely cited physics paper. Joseph Tooby-Smith, a researcher at the University of Bath, used the specialized programming language Lean to formalize a 2006 study on the stability of the two-Higgs doublet model (2HDM). While the paper had been cited for years as a foundational piece of research, the software identified a critical logical error: a specific condition previously thought to guarantee a stable solution actually failed. Originally intended as a routine exercise to add the research to the new PhysLib database, the discovery has since been confirmed by the original authors, who have acknowledged the error and plan to publish a formal correction. This landmark finding has reignited a debate over the rigor of theoretical physics compared to pure mathematics. Unlike mathematicians, who require exhaustive proofs, physicists often omit explicit details in their calculations, creating logical gaps that human peer reviewers frequently overlook. Experts suggest that making automated formalization a standard part of the publishing process could prevent such errors from propagating through years of subsequent research. However, the transition requires building a massive digital library of formalized physics, millions of lines of code, to train the next generation of AI tools to act as tireless sentinels for the scientific community. Source: Conway, N. (2026). For the first time, a computer found an error in a major physics paper. New Scientist.
-
-
Title: A Mathematical Framework for the Evolution of Collective Intelligence Across Civilizations 🧠🌍 I’m excited to share a new theoretical project by Ouadi Maakoul that builds a rigorous mathematical foundation for understanding how civilizations think, adapt, and evolve. This work moves beyond metaphors to develop a well-posed dynamical system for collective intelligence, defined by three necessary capacities: Information Storage, Technological Production, and Social Coordination. Key contributions and theorems:
📐 A Rigorous Framework:
- Well-posed dynamics: The system is defined on a compact manifold with forward invariance proven via Nagumo’s theorem, guaranteeing solutions always stay within meaningful bounds.
- Single-civilization analysis: A complete phase portrait reveals how synergy strength can lead to Hopf bifurcations (limit cycles) or fold bifurcations (civilizational collapse), with a proof of no chaos in the low-synergy regime.
🌐 Networked Civilizations:
- Using spectral graph theory and a Master Stability Function on regular graphs, we derive conditions for Turing instabilities (the emergence of "intelligent hubs") and synchronization across a network of societies.
📈 Stochastic & Mean-Field Limits:
- We derive the mean-field ODE for dense random graphs and a Fokker-Planck equation to study the distribution of capacities, allowing for a mathematical analysis of inequality and phase transitions.
🛡️ Robustness to Misinformation (a novel result):
- Maakoul’s Robustness Theorem proves that collective intelligence is input-to-state stable (ISS) under bounded misinformation ("poisoned evidence"). Even with persistent, bounded errors, the system’s trajectory remains in a neighborhood of the ideal equilibrium, a formal guarantee of resilience.
⚙️ Computational Architecture: A modular Python library (civ_intel) accompanies the work for numerical verification and future empirical parameterization.
This purely theoretical project provides a toolkit for studying civilizational rise, collapse, and differentiation, with broad applicability to other networked biological and social systems. Why this matters: It formalizes how societies process information, offering a mathematical language for resilience and fragility. GitHub 👇 https://lnkd.in/eW_xrykW #Mathematics #DynamicalSystems #CollectiveIntelligence #ComplexSystems #PhD #Research #NetworkScience #Robustness
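The actual equations live in the paper and the civ_intel repository; as a flavor of what a three-capacity dynamical system looks like, here is an illustrative toy (logistic-style growth driven by pairwise synergy plus linear decay; a stand-in form of my own, not Maakoul's model):

```python
import numpy as np

def step(state, synergy=0.5, decay=0.3, dt=0.01):
    """One Euler step of an illustrative 3-capacity system.

    state = (S, T, C): information storage, technological production,
    social coordination, each kept in [0, 1]. Each capacity decays on
    its own and grows via synergy with the other two -- an illustrative
    coupling, not the equations from the paper."""
    S, T, C = state
    dS = synergy * T * C * (1 - S) - decay * S
    dT = synergy * S * C * (1 - T) - decay * T
    dC = synergy * S * T * (1 - C) - decay * C
    new = state + dt * np.array([dS, dT, dC])
    return np.clip(new, 0.0, 1.0)   # crude stand-in for forward invariance

state = np.array([0.6, 0.5, 0.4])
for _ in range(5000):
    state = step(state)
print(state)  # at this synergy/decay ratio the capacities collapse toward 0
```

Even in this toy, the collapse-versus-persistence dichotomy the paper analyzes is visible: with these parameters the only equilibrium is the origin, whereas a larger `synergy` relative to `decay` admits a nontrivial fixed point where the three capacities sustain each other.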