Today, we're open-sourcing our flagship project: Maat -- a Turing-complete programming language designed from the ground up for writing zero-knowledge proofs (ZKPs). Rust syntax, Rust semantics, ZK-only execution. If you write Rust, you already know Maat. And every construct that's illegal in a zero-knowledge circuit -- floating-point arithmetic, global mutable state, raw pointers, dynamic dispatch, I/O -- is a compile-time error in Maat. There is no "standard mode." Every program that compiles is provable by construction.

We started with a single question: what if a language was Rust-native but ZK-constrained from day one? That question became 11 releases, 14 compiler crates, and a production-quality pipeline:

- A logos DFA lexer and a winnow combinator parser
- Hindley-Milner type inference (Algorithm W) with generics, algebraic data types, and exhaustive pattern matching
- Structs, enums, traits, impl blocks, `Option<T>`, `Result<T, E>`, and the `?` operator
- A file-based module system with dependency resolution, cross-module type checking, and visibility enforcement
- A stack-based bytecode VM with 44 opcodes and deterministic execution
- A standard library with higher-order methods on collections, typed numeric parsing, and comparison utilities
- Security hardening: `#![forbid(unsafe_code)]` across all 14 crates, checked arithmetic on every operation, 7M+ fuzz runs with zero crashes, 9 property-based tests verifying type soundness and execution determinism, and a published threat model

This is where Proof-Driven Development (PDD) begins -- software development where formal verification and mathematical proofs replace trust assumptions.

The compiler frontend is complete. Next is the ZK backend. Version 0.12 will introduce a trace-generating VM that records execution traces suitable for STARK proof generation via the FRI protocol. Same compiled bytecode, dual backends -- one for development and testing, one for proving.
We're targeting the Winterfell library first, with the architecture designed for Plonky3 and Stwo swappability. No trusted setup. Post-quantum secure. Transparent proofs.

Beyond that, the roadmap includes native field element arithmetic (a `Felt` type), linear/affine types to prevent unconstrained witness bugs, an effect system for provable functional purity, STARK-to-SNARK wrapping for compact on-chain verification, dependent types for expressing constraint invariants, and ultimately self-hosting.

Maat is early stage, not yet audited, and not production-ready. But the foundation is solid and the direction is clear.

If you're working in zero-knowledge cryptography, formal verification, or language design -- or if you're simply a Rust developer curious about what provable computation looks like -- we'd love for you to take a look. Star the repo. Try the REPL. Run the example programs or write your own. Break things. Tell us what you think: https://lnkd.in/dyYGQ2gj
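Two of the properties the post names -- a stack-based bytecode VM and checked arithmetic on every operation -- can be illustrated with a minimal sketch. This is not Maat's VM (its opcodes, word size, and error handling here are invented for illustration); it only shows the general shape of a stack machine that traps on overflow instead of wrapping:

```python
# Illustrative sketch only -- NOT Maat's actual VM. Opcode names and the
# word size are assumptions made up for this example.
WORD_LIMIT = 2**64  # hypothetical machine word bound

def run(bytecode):
    """Execute a list of (opcode, operand) pairs on an operand stack."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            result = a + b
            if result >= WORD_LIMIT:                 # checked arithmetic:
                raise OverflowError("add overflow")  # trap, never wrap
            stack.append(result)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            result = a * b
            if result >= WORD_LIMIT:
                raise OverflowError("mul overflow")
            stack.append(result)
        else:
            raise ValueError(f"unknown opcode {op}")
    return stack.pop()

# (2 + 3) * 4 compiled to stack code
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(program))  # 20
```

Determinism here falls out of the design: with no I/O, no clocks, and no unchecked behavior, the same bytecode always produces the same stack trace -- which is exactly what a trace-generating proving backend needs.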
Maat: Rust-based Zero-Knowledge Programming Language
I'm very pleased to share what our team at Maat Labs has been cooking over the past few months... Maat: a Rust-native, Turing-complete programming language for writing zero-knowledge proofs (ZKPs). First milestone achieved: a complete compiler frontend. Next is the ZK backend. We can now confidently build in the open. Star it, try it, break it! Your feedback and contributions are very much welcome.
Introducing Serena: An MCP Toolkit for AI Coding Agents 🚀

Instead of fragile text operations, Serena provides:
• Symbol-level code retrieval across entire codebases
• Intelligent cross-file renames & refactoring
• Support for 40+ programming languages
• Integration with Claude Code, Codex, and more

Problem solved: agents can perform complex refactoring in one atomic operation instead of 8-12 error-prone steps.
Real-world impact: AI agents work faster and more reliably on complex codebases.

Built with: Python, LSP (Language Server Protocol), Model Context Protocol (MCP)
https://lnkd.in/gzF9_UrB
How many programming languages do you know? Most devs say 3-4. The real number is closer to a dozen. SQL is a programming language. So is Regex, your Dockerfile, your Terraform config, your .gitignore. Every one has a grammar, a parser, and a compiler. You've been a polyglot without knowing it.

I spent a few days going deep on how compilers actually work - the machinery underneath every language we use. It turned my mental model of "code runs on machine" into something much richer.

Every compiled language follows the same 7-stage pipeline:
Lexer → Parser → AST → Name Resolution → Type Checker → IR → Code Generation

The magic is Stage 6 - the Intermediate Representation. It decouples "what your code means" from "what platform runs it." That's how Haxe compiles to 8 targets from one source. How Kotlin Multiplatform ships the same code to JVM, browser, and native. How you could compile one business-logic spec into both a Go server binary AND a WASM module in the browser - enforcing the same validation rules on both sides.

The type system determines what gets rejected before your code runs. And type systems aren't binary - they exist on a ladder:
- Level 0: Untyped (Python dict)
- Level 1: Structural (JSON Schema)
- Level 2: Nominal types (Go's type CustomerID int64)
- Level 3: Algebraic types (Rust enums, exhaustive match)
- Level 4: Dimensional types (F# prevents dollars + kilograms at compile time)
- Level 5: Refinement types (negative balances become unrepresentable)
- Level 6: Dependent types (research territory)

Most codebases sit at Level 1 or 2. Each step up catches entire bug classes the level below misses entirely.

The Mars Climate Orbiter lost $327.6M to a unit confusion bug. One team sent thrust in pound-force seconds. The other expected newton-seconds. The spacecraft burned up entering Mars' atmosphere. A dimensional type system would have caught that at compile time.
Every numeric field in your codebase that's just float64 - with no distinction between dollars, kilograms, percentages, days - is a smaller Mars Climate Orbiter waiting to happen.

What I took away:

Types catch categorical bugs - wrong shape, wrong unit, missing case. Tests catch semantic bugs - wrong formula, wrong business logic. You need both. Neither replaces the other.

Parse, don't validate: check untrusted input at the boundary, exactly once. Wrap it in a refined type. Trust it downstream. Defensive checks scattered across 20 files collapse into one smart constructor at the edge.

The type system is already in your language. Use it.

Also posted on my blog: https://lnkd.in/gVJBUpMt

#Programming #Compilers #TypeSafety #SoftwareEngineering #DevTools
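The "parse, don't validate" idea above can be sketched in a few lines of Python. The names here (`Dollars`, `parse_dollars`, `apply_fee`) are illustrative, not from any library: the point is that the single check lives in one smart constructor, and everything downstream trusts the type instead of re-validating.

```python
# Minimal "parse, don't validate" sketch. All names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Dollars:
    """A refined type: constructing one implies the amount is non-negative."""
    amount: float

def parse_dollars(raw: str) -> Dollars:
    """The single boundary check. Everything past here trusts Dollars."""
    value = float(raw)
    if value < 0:
        raise ValueError("balance cannot be negative")
    return Dollars(value)

def apply_fee(balance: Dollars, fee: Dollars) -> Dollars:
    # No defensive re-checking: the parameter types already guarantee validity,
    # and mixing in a bare float (or Kilograms) is a type error under a checker.
    return Dollars(max(balance.amount - fee.amount, 0.0))

print(apply_fee(parse_dollars("100.0"), parse_dollars("2.5")))  # Dollars(amount=97.5)
```

Run under mypy or pyright, this also gives you the Level 2 "nominal types" rung from the ladder above: a function that takes `Dollars` cannot silently receive a raw float.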
Python continues to stand out as a powerful choice for backend development — and for good reason. Its simplicity and readability allow developers to write clean, maintainable code faster, reducing complexity and improving long-term scalability. This makes it an excellent option for both rapid prototyping and large-scale applications.

One of Python’s biggest strengths is its vast ecosystem. With a rich set of libraries and frameworks, developers can build robust systems without reinventing the wheel. From web development to data processing and AI integration, Python provides tools that accelerate development and innovation.

Performance is often discussed, but in many real-world applications, development speed, maintainability, and ecosystem support outweigh raw execution speed — and that’s where Python truly excels.

Additionally, Python’s strong community ensures continuous improvement, extensive documentation, and quick problem-solving, which are invaluable in a production environment. In a world where time-to-market and adaptability matter more than ever, Python empowers teams to deliver reliable backend solutions efficiently.

#Python #BackendDevelopment #SoftwareEngineering #Programming #Tech
Hello connections,

Python is often praised for its simplicity, but its true power lies in advanced features that enable developers to write efficient, scalable, and elegant code. If you're looking to level up, here are some key concepts that define advanced Python programming.

1. Decorators – Writing Smarter Functions
Decorators allow you to modify the behavior of functions without changing their code. They’re widely used for logging, authentication, and performance monitoring.

2. Generators & Iterators – Memory-Efficient Coding
Instead of loading entire datasets into memory, generators yield values one at a time. This is especially useful when working with large data streams.

3. Context Managers – Clean Resource Handling
Using `with` statements ensures proper acquisition and release of resources like files or database connections, making your code safer and cleaner.

4. Multithreading & Multiprocessing – Performance Boost
Python provides powerful libraries to run tasks concurrently. While multithreading is useful for I/O-bound tasks, multiprocessing helps with CPU-bound operations.

5. Async Programming – The Future of Python
With `async` and `await`, Python handles asynchronous operations efficiently, making it ideal for web applications and APIs.

6. Metaclasses – Controlling Class Creation
Metaclasses allow you to customize how classes themselves are created. Though complex, they are powerful tools in frameworks and libraries.

7. Type Hinting – Writing Maintainable Code
Type hints improve code readability and help catch bugs early, especially in large-scale projects.

Advanced Python isn't just about writing complex code — it's about writing *better* code. It improves performance, scalability, and maintainability, making you stand out as a developer. Don’t just learn Python — master it. The deeper you go, the more opportunities you unlock in fields like AI, backend development, and automation.
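Three of the features above fit in one small, runnable sketch — a decorator, a generator, and a context manager working together. All names here are illustrative:

```python
# Illustrative sketch of a decorator, a generator, and a context manager.
import time
from contextlib import contextmanager

def timed(fn):                       # 1. decorator: wraps behavior around fn
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        wrapper.elapsed = time.perf_counter() - start
        return result
    return wrapper

def squares(n):                      # 2. generator: yields lazily, O(1) memory
    for i in range(n):
        yield i * i

@contextmanager
def resource(name):                  # 3. context manager: guaranteed cleanup
    print(f"acquire {name}")
    try:
        yield name
    finally:
        print(f"release {name}")     # runs even if the body raises

@timed
def sum_squares(n):
    return sum(squares(n))           # consumes the generator lazily

with resource("db-connection"):
    print(sum_squares(10))           # 285
```

Note how they compose: the `with` block guarantees release, the generator never materializes the full list, and the decorator adds timing without touching `sum_squares` itself.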
#Python #AdvancedPython #Programming #SoftwareDevelopment #Coding #Learning #Tech #snsinstitutions #snsdesignthinkers #designthinking
💀 Python, C++, and Java are the new Assembly. And you don't need to write them anymore.

Let's be honest, even if this triggers a lot of developers right now. All modern programming languages have finally degraded (or evolved?) to the level of machine code. Today, there is zero difference between manually writing Python or C++ and poking around in Assembly registers. It’s just low-level grunt work.

The only true, genuinely high-level, and efficient way for a creator to communicate with their project is a surgically precise query language for Opus and Sonnet. We are no longer programmers in the traditional sense. We are architects of meaning. AI models are our new compilers, translating pure logic into that syntactic garbage of brackets, indents, and strict typing.

What actually dictates whether you’re a Senior or a fossil today?

Your prompt. If the model doesn’t spit out working code without crutches on the very first try, you simply don't know how to define a task.

Your token greed. We used to fight for CPU cycles; now we fight for context windows. Every extra word is wasted money and a dumbed-down neural network. Cut the fluff. Leave only the pure concentrate of meaning.

Everything else — holy wars over syntactic sugar, framework battles, patterns for the sake of patterns, and manual refactoring — absolutely does not matter anymore. If you’re still proudly smashing your keyboard to manually type out boilerplate, congratulations: you’re punching cards in the quantum computing era.

The future is already here. You either drive the compiler via Opus/Sonnet, or you become the one this compiler is about to replace. 🤷♂️
I switched from n8n to Python + Claude Code mid-project. Best call I made all quarter. Here's the honest comparison.

n8n is not the automation tool you think it is. It's perfect for 3-step workflows. It becomes a debugging nightmare past that. I've built workflows in both — here's the honest breakdown.

n8n wins when:
→ The workflow is small (under 5 nodes)
→ Speed to first result matters more than everything else
→ The person building it isn't a developer

But complexity changes the math fast. A 20-node workflow breaks. You open the visual editor to find the problem. Half your afternoon is gone. And the AI token cost while building medium to large flows? Every tweak, every node adjustment burns more than you'd expect. It compounds quietly.

That's where OpenClaw (or Claude Code) + Python changes everything. For medium to large workflows:
→ Debugging is just reading code — no visual maze
→ Building is faster, with less back-and-forth with AI
→ Token usage drops significantly

The visual layer feels like a feature when you start. It becomes friction when the workflow grows. Code doesn't have that problem.

My rule now:
→ Quick, simple automations → n8n
→ Everything from medium up → Python + Claude Code

(And I am NOT a Python developer! I can just read the generated code. But that's not the point: I only have to specify what I want, and if anything breaks, say what broke and how it's supposed to work. With n8n, by contrast, debugging is a nightmare. Try it out!)

The tool you prototype with isn't always the one you should scale with.

Follow me for more honest takes on AI tooling. What's your experience been? Drop your thoughts below.
I sat through 3 semesters of OOP.

I could recite the four pillars. I could not tell you why they existed.

That's a problem. Because OOP isn't about memorizing Inheritance or Polymorphism — it's about asking: "Who is responsible for making this happen?" That one shift changes how you write code forever.

I just published a full breakdown on Dev.to:
→ What each pillar actually solves
→ When and where to use them
→ Real Python examples you can steal

🔗 https://lnkd.in/dnvBBuj4

Go give it a read and a ❤️ And if you want more no-nonsense engineering content, follow me on Dev.to. I write for developers who want to understand, not just copy-paste.

#Programming #Python #OOP #SoftwareDevelopment #TechCommunity #Developers #CodeNewbie
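The "who is responsible?" shift can be shown in a few lines (my own illustrative example, not taken from the linked article): instead of outside code reaching into raw data and checking invariants everywhere, the object that owns the data is responsible for protecting it.

```python
# Illustrative sketch: responsibility-driven design. The Account owns its
# balance and is the single place responsible for keeping it valid.
class Account:
    def __init__(self, balance: int):
        self._balance = balance          # encapsulation: state lives here

    def withdraw(self, amount: int) -> None:
        # The Account, not its callers, enforces the invariant.
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> int:
        return self._balance

acct = Account(100)
acct.withdraw(30)
print(acct.balance)  # 70
```

The alternative — a bare dict plus `if` checks sprinkled through every caller — is exactly the code you write when you can recite the pillars but not say what they're for.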
I heard a tip to use Rust instead of Python whenever you are coding with AI, due to the speed and, more importantly, the validation: the code won't compile if there are errors. Unlike Python, where you have to do the validation for the AI yourself and go back and forth with prompts to fix it. I'm finding it way faster to generate code. Even though I don't know Rust that well, it will be a great learning experience. Right now I'm using Claude Code, but I might switch back to OpenCode's models again to see if that works. https://lnkd.in/gaDkaHXu
Got it — I’ll create a complete 15-slide Gamma-ready presentation using a clear, student-friendly + professional style (a mix of Outline 1 & 2) so it looks clean and easy to present. You can copy-paste this directly into Gamma AI 👇

🎯 Presentation Title: Parse Tree & Ambiguity

Slide 1: Title Slide
Parse Tree & Ambiguity — Understanding Syntax Trees in Compiler Design
Presented by: Shakil
Course: Compiler Design / Theory of Computation

Slide 2: Introduction
- In programming languages, structure matters
- Syntax defines how code is written
- Parse trees represent that structure
- Important for compilers and interpreters

Slide 3: What is a Grammar?
- A grammar defines the rules of a language using production rules
- Example: E → E + E, E → id
- Known as a Context-Free Grammar (CFG)

Slide 4: What is a Parse Tree?
- A tree representation of a string
- Shows how the grammar generates the string
- Root = start symbol; leaves = final string (terminals)

Slide 5: Components of a Parse Tree
- Root node → start symbol
- Internal nodes → non-terminals
- Leaf nodes → terminals
- Edges → production rules

Slide 6: Derivation Concept
- Step-by-step rule application
- Two types: leftmost derivation and rightmost derivation
- Both can produce the same string

Slide 7: Example Grammar
- Grammar: E → E + E, E → E * E, E → id
- Example string: id + id * id

Slide 8: Constructing a Parse Tree
- Start from the root (E)
- Apply production rules, expanding step by step
- Reach the terminal string

Slide 9: What is Ambiguity?
- A grammar is ambiguous if one string has more than one parse tree
- Leads to multiple meanings
- Causes confusion in compilers

Slide 10: Ambiguous Grammar Example
- Grammar: E → E + E, E → E * E, E → id
- String: id + id * id

Slide 11: Two Parse Trees (Concept)
- First interpretation: (id + id) * id
- Second interpretation: id + (id * id)
- ⚠️ Same string, different meanings!

Slide 12: Problems Caused by Ambiguity
- Confuses the compiler
- Incorrect evaluation, hard-to-predict output
- Not suitable for programming languages

Slide 13: Removing Ambiguity
- Use operator precedence and associativity rules
- Example (fixed grammar): E → E + T | T, T → T * F | F, F → id

Slide 14: Why Ambiguity Matters
- Important in compiler design and parsing algorithms
- Ensures correct program execution
- Helps avoid logical errors

Slide 15: Conclusion
- A parse tree shows the structure of a string
- Ambiguity creates multiple interpretations
- It must be removed for correct parsing
- An essential concept in computer science

✅ Bonus tip for Gamma AI: choose "Presentation" mode, select a modern theme, and add diagrams for the parse tree and the ambiguity example.

If you want the next level, I can also add visual parse tree diagrams, convert this into PowerPoint/PDF, or make it super stylish (design + icons + animations). Just tell me 👍
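The fixed grammar from Slide 13 can be demonstrated with a tiny recursive-descent parser (my own sketch, not part of the slides). It encodes E → E + T | T and T → T * F | F as left-associative loops, so "id + id * id" gets exactly one tree, grouped as id + (id * id):

```python
# Illustrative recursive-descent parser for the disambiguated grammar
#   E -> E + T | T,   T -> T * F | F,   F -> id
# Left recursion is rewritten as iteration; trees are nested tuples.
def tokenize(src):
    return src.split()

def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def factor():            # F -> id
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok
    def term():              # T -> T * F | F  (binds tighter than +)
        nonlocal pos
        node = factor()
        while peek() == "*":
            pos += 1
            node = ("*", node, factor())
        return node
    def expr():              # E -> E + T | T
        nonlocal pos
        node = term()
        while peek() == "+":
            pos += 1
            node = ("+", node, term())
        return node
    return expr()

print(parse(tokenize("id + id * id")))  # ('+', 'id', ('*', 'id', 'id'))
```

Because `term` sits below `expr` in the call structure, `*` is always grouped before `+` — precedence is baked into the grammar's shape, which is exactly how the ambiguity of Slide 11 is removed.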