Transactions on Computational Systems Biology XII: Special Issue on Modeling Methodologies. Rainer Breitling, Robin A. Donaldson (auth.); Corrado Priami, Rainer Breitling, David Gilbert, Monika Heiner, Adelinde M. Uhrmacher (eds.) https://lnkd.in/eFgfFrFB The LNCS journal Transactions on Computational Systems Biology is devoted to inter- and multidisciplinary research in the fields of computer science and the life sciences, and supports a paradigmatic shift in techniques from computer and information science to cope with the new challenges arising from the systems-oriented view of biological phenomena. This special issue of the journal focuses on modeling methodologies. It starts with a position paper by the guest editors, entitled Biomodel Engineering - from Structure to Behavior, which is followed by technical contributions covering a broad range of modeling methodologies. Two papers focus on new modeling languages; these are followed by an article presenting a case study demonstrating the value of the qualitative network approach. With the remaining three contributions, the special issue leaves the area of qualitative modeling and moves toward quantitative programming with the BlenX language and the application of more theoretical process calculi. #Biology #CorradoPriami #RainerBreitling #DavidGilbert #MonikaHeiner #AdelindeUhrmacher #RobinDonaldson https://lnkd.in/egppkBWa
Transactions on Computational Systems Biology: Modeling Methodologies
What Biology Has Been Missing — and What the Life Protocol Finally Provides For more than a century, biology has been a science of observation. We learned to sequence, to catalog, to classify, to describe. But beneath all of that knowledge, something essential was missing. Biology had data. Biology had tools. Biology had models. Biology had metaphors. What biology did not have was architecture. There was no unified structure that explained how living systems were organized. No protocol that governed how information flows. No layered model that connects molecules to meaning, meaning to behavior, behavior to coordination, coordination to stability. Life was treated as complexity, not structure. As emergence, not design. As exception, not system. This is the gap the Life Protocol fills. The Life Protocol is the first unified architecture of living systems — a 7‑layer conceptual and operational stack that makes life predictable, programmable, and engineerable. And at the center of the architecture is something biology has never had: The Life / Genome Compiler. The Life Compiler transforms biological intent into structured biological behavior. It takes unstructured genomic substrate and compiles it into: • semantics • workflows • agents • orchestration • envelopes • governance It is the architectural function that turns life into a system. The Life Compiler runs in both directions. It can interpret biological substrate upward into the 7‑Layer Life Architecture — and it can also compile architectural intent downward into genomic expression plans. This is what makes life not just understandable, but engineerable. The full architecture will be released this weekend. It’s the second pillar in the canon after The Architecture of Quantum Computing. Biology is becoming an engineered discipline. The architecture is now visible. The future of life begins here.
Reproducibility is becoming a central challenge in computational materials science, especially as workflows grow more complex, data-driven, and distributed across HPC infrastructures. Glad to share this new collective work from the DIAMOND digital platform of the PEPR DIADEM: “Reproducible Container Solutions for Codes and Workflows in Materials Science” https://lnkd.in/eDhfZ_9M This paper presents robust (and elegant) solutions combining GNU Guix and Apptainer to build fully declarative, reproducible software environments. Beyond the technical advance, we're glad to have been able to integrate complete workflows, from ab initio simulations to machine learning and large-scale data analysis, in a consistent and portable framework. 🚀 ✨ 💪 ⚡ Kudos to all the contributors involved in this collective effort, an excellent example of what coordinated, community-driven initiatives can achieve. #Reproducibility #HPC #MaterialsScience #MachineLearning #OpenScience #DIAMOND #DIADEM Dylan Bissuel, Léo Orveillon, Benjamin Arrondeau, João Paulo Almeida de Mendonça, Jonathan DAUBIN, Irina Piazza, PhD, Martin Uhrin, Etienne Polack, Akshay Krishna Ammothum Kandy, David Martin-Calle, Jonathan C., Aadhityan A, Lorenzo Paulatto, Pierre-Antoine Bouttier, Marie-Ingrid Richard, Thierry Deutsch, David Rodney, A. Marco Saitta, Noel Jakse
Automated Response Surface Methodology: Computational Replication and Validation Framework for Optimizing Supercapattery Materials https://lnkd.in/g53NjvyJ By Thiago F. and Simoni Margareti Plentz Meneghetti, from the 1st International Online Conference on Designs. Combining Response Surface Methodology (RSM) with Central Composite Design (CCD) is a powerful statistical approach to optimizing materials in energy storage systems. This study presents an open-source Python (v3.8+) framework that replicates and validates the RSM-based optimization of NiCo2S4–graphene supercapattery materials. We validated the framework by replicating a 20-experiment CCD analyzing graphene/NCS ratios, hydrothermal time, and S/Ni molar ratios. Advanced optimization using the Differential Evolution algorithm was integrated to efficiently search the high-dimensional response surface. The model explained 97.16% of the variance, and comprehensive diagnostic tests confirmed the assumptions of normality and residual independence. This approach provides an open-source methodology that supports reproducible and scalable data-driven material design and facilitates transparent computational materials science studies. #Python #OpenScience #DataDriven
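The two-stage pipeline described above (fit a quadratic response surface to a designed experiment, then locate its optimum with Differential Evolution) can be sketched in a few lines. This is a toy on synthetic two-factor data, not the study's NiCo2S4–graphene dataset, and all names are illustrative:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Toy "experiments": two coded factors in [-1, 1] (e.g. ratio, time)
X = rng.uniform(-1, 1, size=(20, 2))
# Toy response with a known maximum near (0.3, -0.2), plus small noise
y = 10 - (X[:, 0] - 0.3) ** 2 - 2 * (X[:, 1] + 0.2) ** 2 + rng.normal(0, 0.05, 20)

def design_matrix(X):
    """Full second-order RSM model: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Least-squares fit of the response surface coefficients
beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

def predicted_response(x):
    return design_matrix(np.atleast_2d(x)) @ beta

# Differential Evolution maximizes by minimizing the negative response
result = differential_evolution(lambda x: -predicted_response(x)[0],
                                bounds=[(-1, 1), (-1, 1)], seed=0)
print("optimum near:", result.x)  # close to the true optimum (0.3, -0.2)
```

Because the toy response is itself quadratic, the fitted surface recovers the true optimum closely; on real data the same workflow applies, with diagnostics (normality, residual independence) checked on the fit.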
Beyond the Theorem: Mapping the Structural DNA of Schur’s Theorem. Mathematical mastery is rarely found in the result alone; it is found in the ability to navigate the underlying graph of logical dependencies. For researchers and educators, the "Black Box" of complex proofs is often the greatest barrier to deep understanding. At Arkheion, we are deconstructing this complexity using the Math Tree’s Subtree feature. The Architecture of Logic. Instead of a static page, we provide a dynamic, high-fidelity deconstruction of mathematical truth. Our Subtree Builder and Educational Dashboard allow users to isolate specific logical pathways, creating a personalized "map" of a theorem’s lineage. Take Schur’s Theorem as the prime example. By utilizing the subtree feature, we can visually peel back the layers of its derivation: - The Foundation: The subtree begins with the fundamental definition of The Complex Field and Inner Product Spaces. - The Bridge: It tracks the necessity of an Orthonormal Basis. - The Result: It culminates in the formal proof that for an operator, there exists an orthonormal basis such that the matrix representation is upper triangular. Precision via Interactive Visualization. The power of this feature lies in its Interactive Graph Rendering. Users can create a Subtree to focus purely on the relevant axioms and definitions without losing the broader context. Custom Curation: Educators can use the Subtree Builder to build custom pedagogical paths for their students, highlighting only the "ancestor nodes" required for a specific lecture. Logical Scaffolding: Students can use History Tracking and Favorites to bookmark their progress through a complex derivation, ensuring no concept is left misunderstood. We aren't just building a database; we are building a new standard for academic clarity, one node at a time. 
Explore Schur's Theorem Subtree and the future of mathematical discovery at: TheMathTree.net #Mathematics #AcademicResearch #EdTech #LinearAlgebra #ComplexAnalysis #DataVisualization #TheMathTree #Arkheion
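For reference, the result the subtree culminates in is the standard statement from linear algebra (paraphrased here, not quoted from TheMathTree):

```latex
\textbf{Schur's theorem.} Let $T$ be a linear operator on a finite-dimensional
complex inner product space $V$. Then there exists an orthonormal basis
$(e_1, \dots, e_n)$ of $V$ with respect to which the matrix of $T$ is upper
triangular. Equivalently, every $A \in \mathbb{C}^{n \times n}$ admits a
unitary $U$ such that $U^{*} A U$ is upper triangular.
```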
𝐌𝐞𝐭𝐚 𝐅𝐀𝐈𝐑 𝐫𝐞𝐬𝐞𝐚𝐫𝐜𝐡𝐞𝐫𝐬 𝐢𝐧𝐭𝐫𝐨𝐝𝐮𝐜𝐞𝐝 𝐑𝐞𝐚𝐬𝐨𝐧𝐢𝐧𝐠 𝐌𝐞𝐦𝐨𝐫𝐲. A retrieval-augmented generation framework designed to improve language model reasoning. Instead of indexing generic documents, the system builds a datastore of 32 million subquestion-subroutine pairs extracted from existing reasoning trajectories. At inference time, models verbalize their current subquestion to retrieve relevant procedural knowledge, which is then injected into the reasoning trace. This approach consistently outperforms standard RAG and compute-matched baselines across math, science, and coding benchmarks. Reasoning Memory improves accuracy by up to 19.2% over non-retrieval methods and 7.9% over the strongest compute-matched baselines. paper 👉 https://lnkd.in/dPrWMafW code 👉 https://lnkd.in/dXpQXR5D
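As a rough illustration of the retrieval step described above, the sketch below uses bag-of-words cosine similarity and a three-entry toy datastore as stand-ins for a real sentence embedder and the 32M-pair datastore. All names are hypothetical, not the paper's API:

```python
from collections import Counter
import math

# Toy datastore of (subquestion, subroutine) pairs
DATASTORE = [
    ("how to find the roots of a quadratic", "apply x = (-b ± sqrt(b^2-4ac)) / 2a"),
    ("how to compute gcd of two integers", "use the Euclidean algorithm"),
    ("how to sort a list stably", "use merge sort"),
]

def embed(text):
    """Bag-of-words stand-in for a real sentence embedder."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_subroutine(subquestion, k=1):
    """Return the procedural hints most similar to the verbalized subquestion."""
    q = embed(subquestion)
    scored = sorted(DATASTORE, key=lambda p: cosine(q, embed(p[0])), reverse=True)
    return [hint for _, hint in scored[:k]]

# The model verbalizes its current subquestion; the retrieved hint would then
# be injected into its reasoning trace before it continues.
hints = retrieve_subroutine("how to find the roots of x^2 - 5x + 6")
print(hints[0])
```

The key design point is that retrieval is keyed on the model's own intermediate subquestion rather than on the original prompt, so the datastore supplies procedural knowledge matched to the current reasoning step.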
I just published a book. Project Event Horizon: A Complete Theory of Market Microstructure, Topology, and Causal Intelligence. 13 chapters. 80 figures. 473 equations. Seven branches of mathematics unified into a single framework for detecting market collapse before it happens. This is not a blog post collection. This is a full monograph with three parts, a mathematical appendix formalizing every method from first principles, plain language explanations for practitioners, and an implementation guide for anyone who wants to build on it. What it covers: Part I lays the foundations. Why Gaussian assumptions fail at the boundaries. The mathematical toolkit spanning persistent homology, Ricci curvature, Hawkes processes, do-calculus, extreme value theory, transfer entropy, and multifractal analysis. Part II is the signal discovery program. Seven phases of progressive experimentation. Phase I proves most alpha is regime-dependent noise. Phase II proves intervention fails during topological collapse. Phase III proves the arbitrage exists but physics prevents exploitation. Phase IV identifies locally deterministic windows. Phase V maps microstructure contagion across asset classes. Phase VI shows DeFi signals TradFi 35 bars early. Phase VII integrates all 15 signals into a Grand Unified Model with a Sharpe of 2.362. Part III is the synthesis. A unified theory of market phase transitions. A practical implementation guide. Conclusions and future directions. Every equation is derived. Every figure is reproducible. Every claim is grounded in 26 citations spanning Sandhu et al. in Science Advances, Gidea and Katz, Pearl, Hawkes, Granger, and more. Sornette's "Why Stock Markets Crash": $65. Lopez de Prado's "Advances in Financial Machine Learning": $55. This one: Free on Academia. One author. No team. No funding. No institution. A desktop GPU and 18 months of work. 
Read it free here: https://lnkd.in/enJUmG95 #QuantitativeFinance #AlgorithmicTrading #Topology #MachineLearning #Mathematics #Research
#Research #QuantitativeResearchMethods #Optimization #NatureInspiredAlgorithms Mathematical Foundations of Nature-Inspired Algorithms. An insightful book by Xin-She Yang & Xing-Shi He, part of the book series SpringerBriefs in Optimization (BRIEFSOPTI), Springer Cham, 2019: XI+107 pp. As explained in the preface: "This book will attempt to provide a systematic approach to rigorous mathematical analysis of these algorithms based on solid mathematical foundations, and the analyses will be from different angles and perspectives. There are many good textbooks on the detailed introduction of traditional optimization techniques and new algorithms. In contrast, the introduction of algorithms in this book will be relatively brief and will introduce the most relevant algorithms for the purpose of analysing and understanding nature-inspired algorithms. Thus, the introduction of algorithms includes traditional algorithms such as gradient-based methods and new algorithms such as swarm intelligence-based algorithms. The emphasis will be placed on the latest algorithms and their analyses from a wide spectrum of theories and frameworks. Thus, readers can gain greater insight into the main characteristics of different algorithms and understand how and why they work for solving optimization problems in real-world settings. More specifically, in-depth mathematical analyses will be carried out from different perspectives, including complexity theory, fixed-point theory, dynamical system, self-organization, Bayesian framework, Markov chain framework, filter theory, statistical learning and statistical measures."
Keywords: General Formulation of Optimization, Essence of an Algorithm, Unconstrained Optimization, Gradient-Based Optimization Techniques, Gradient-Free Methods and Metaheuristics, Nature-Inspired Algorithms, Stability of an Algorithm, Algorithm Analysis, Markov Chain Monte Carlo, Bayesian Framework, Swarm Intelligence, Ant Colony Optimization, Particle Swarm Optimization, Bees-inspired Algorithms, Bat Algorithm, Firefly Algorithm, Cuckoo Search, Hyper-Optimization, Parameter Tuning and Control, Filter Theory. DOI https://lnkd.in/d4yEVqR2
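To make the class of algorithms the book analyzes concrete, here is a minimal particle swarm optimization loop using the standard velocity/position update; the parameter values are common defaults, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(1)
n_particles, dim, iters = 20, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5  # inertia weight and acceleration coefficients

def f(x):
    """Sphere test function; global minimum 0 at the origin."""
    return np.sum(x**2, axis=-1)

x = rng.uniform(-5, 5, (n_particles, dim))   # positions
v = np.zeros_like(x)                         # velocities
pbest, pbest_val = x.copy(), f(x)            # personal bests
gbest = pbest[np.argmin(pbest_val)]          # global best

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Standard PSO update: inertia + cognitive pull + social pull
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = f(x)
    better = val < pbest_val
    pbest[better], pbest_val[better] = x[better], val[better]
    gbest = pbest[np.argmin(pbest_val)]

print(f(gbest))  # converges toward 0
```

The stability of exactly this kind of update rule, as a function of w, c1, and c2, is one of the questions the book's dynamical-system and Markov chain analyses address.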
Universal Recursion (UR) is a structural framework for understanding how complex systems are generated, organized and persist across scale. It proposes that phenomena in mathematics, physics, biology and computation can be described in terms of recursive processes—simple rules applied repeatedly to produce increasingly complex structure. UR does not replace existing theories, but reframes them within a common generative structure, preserving their validity while clarifying their relationships across scale. At the core of the framework is the recursive kernel ℛ, a minimal operator capable of generating rich pattern spaces through iteration. The kernel is not directly observable; what we encounter instead are resolution-dependent descriptions, where complexity becomes intelligible at different levels of detail. UR organizes these descriptions into five approximation regimes. The Geometric regime captures visible form, including spatial patterns, symmetries and fractal structure. The Field regime describes continuous quantities distributed across space and time, including physical fields and differential equations. The Statistical regime applies where large numbers of interacting components are best understood through probability and macroscopic laws. The Informational regime concerns the encoding and constraints of information. The Algorithmic regime describes rule-based processes and computational procedures that generate or simulate recursive dynamics. Some of the most interesting phenomena arise at the interfaces between these regimes. The approximation regimes are connected by structure-preserving mappings, or functors, which relate descriptions across scale while maintaining persistent structure. Their existence suggests that transitions between levels are not arbitrary, but systematically related. A key component of UR is the logarithmic scale parameter, s = log(R), which governs resolution. 
Moving along this axis corresponds to observing systems at different levels of detail, where different mathematical descriptions are appropriate. Patterns that persist across scale represent structurally significant features, while others resolve and disappear. Support for the UR framework comes from two complementary directions. The Well Bred Fractals Atlas (WBFA), a curated collection of recursively generated forms, provides controlled, repeatable evidence that simple recursive rules generate rich, multi-scale structure. Each image serves as a viewing port into a vast mathematical universe. In the natural world, structurally similar patterns appear across a wide range of systems, including branching forms, diffusion-limited growth, turbulence, biological pattern formation, and scaling laws in complex systems. The recurrence and persistence of these structures across different domains suggest that related generative principles are operating across scale. I welcome comments, questions and objections, particularly from specialists in relevant fields.
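As a toy illustration of the core claim, that a minimal rule applied recursively generates structure at every scale, here is the classic Cantor-set string rewrite (chosen for brevity; it is not drawn from UR or the WBFA):

```python
# L-system style rewrite: every solid segment splits in three and keeps
# its outer thirds; every gap triples. Iterating the same rule produces
# self-similar structure at finer and finer resolution.
RULES = {"#": "# #", " ": "   "}

def step(s):
    """Apply the rewrite rule simultaneously to every symbol."""
    return "".join(RULES.get(c, c) for c in s)

pattern = "#"
for _ in range(3):
    pattern = step(pattern)
print(pattern)  # 27 symbols, 8 of them solid: the depth-3 Cantor set
```

Each additional iteration triples the resolution while preserving the pattern already generated, a one-rule analogue of the resolution axis s = log(R) described in the post.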
We are always looking for ways to optimize self-improvement recursion orchestrations. The Universal Recursion model (#UR) is quite comprehensive. Since many underlying contextual data fragments (DF), intermediate states, plans, rubrics, etc. are managed through geometric and other mathematical functions, this might be the perfect meta-level description of recursion at different levels of granularity. #AI #agenticAI #genAI #recursiveselfimprovement Vince Kellen PhD
Dear network, Attention mechanisms are at the heart of modern language models based on the Transformer architecture, allowing these models to encode information through the pairwise interaction of tokens. Unfortunately, these methods suffer from a computational bottleneck that scales quadratically with sequence length. I am pleased to share a recent preprint in which we provide an overview, from the point of view of applied mathematics, of different approximation methods aimed at reducing this bottleneck, as well as alternative models that do not suffer from the shortcomings of regular attention. The methods we study range from clustering, sparsity, and kernel methods to tensor-based approaches. We introduce a taxonomy of these existing techniques to present these developments within a unified mathematical framework, highlighting opportunities for further contributions from numerical linear algebra to the design of scalable attention mechanisms. This preprint is the result of a working group on Randomized Numerical Linear Algebra for the Transformer architecture, which I had the opportunity to co-lead alongside Laura Grigori (lead) and Alice Cortinovis (co-lead), and which originated from the IPAM "Randomized Numerical Linear Algebra (RNLA)" Research Collaboration Workshop in August 2025. I am proud to share the first outcome of our collaboration: Attention Mechanisms Through the Lens of Numerical Methods: Approximation Methods and Alternative Formulations https://lnkd.in/evv7HG58 For an introduction to how attention mechanisms work, you can also find the introductory document I wrote for the IPAM workshop here: Understanding Transformers and Attention Mechanisms: An Introduction for Applied Mathematicians https://lnkd.in/euyyiGBK I would also like to extend a sincere thank you to the organizers of the workshop and to all my co-authors and collaborators who made this work possible: Alice Cortinovis, Yijun Dong, Diana Halikias, Anna Ma, Fabio Matti, Deanna Needell, Katherine J.
Pearce, Elizaveta Rebrova, Disha Shur, Rudi Smith, Hai-Xiao Wang, and Laura Grigori.
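One family such surveys cover, kernel-based (linearized) attention, can be sketched in NumPy: by passing queries and keys through a feature map and reassociating the matrix products, the O(n²) score matrix is never formed. The elu(x)+1 feature map below follows common practice in the linear-attention literature; it is an illustrative choice, not the preprint's specific method:

```python
import numpy as np

def phi(x):
    """Positive feature map elu(x) + 1, a common linear-attention choice."""
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """Kernel-trick attention: O(n d^2) instead of O(n^2 d).

    Instead of softmax(Q K^T) V, compute phi(Q) (phi(K)^T V), so the
    n-by-n score matrix is never materialized.
    """
    Qp, Kp = phi(Q), phi(K)          # (n, d) feature-mapped queries/keys
    KV = Kp.T @ V                    # (d, d) summary, computed once
    Z = Qp @ Kp.sum(axis=0)          # (n,) row-wise normalizer
    return (Qp @ KV) / Z[:, None]

rng = np.random.default_rng(0)
n, d = 128, 16
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
print(out.shape)
```

The reassociation is the whole trick: (n×d)(d×d) costs O(n d²), which for long sequences (n much larger than d) is far cheaper than the O(n² d) of forming the full attention matrix.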