You write code. But do you know what actually runs? 💻⚙️

Python. Java. C++. All of it eventually turns into something much closer to raw silicon and electricity. And that layer? Most developers never truly understand it.

I’m building something to fix that. Introducing: Silicon to Assembly 🚀

A documentation project that breaks down what really happens inside a CPU — from the lowest level up to assembly language. No shortcuts. No black boxes. No “just accept it”. Just pure logic.

As Charles E. Leiserson once said: "If you really wanna understand something, you want to understand it to a level that’s necessary and then one level below that, because it gives you an insight as to why that layer is what it is and what’s really going on."

📌 What makes it different?
- Explains every term — nothing assumed
- Completely architecture-independent
- Focuses on understanding, not memorizing
- Covers the full journey: CPU internals → Registers & memory → Instruction flow → Assembly

If this sounds interesting: Check it out → https://lnkd.in/gVDr7Jmp 🔗

And if you find it valuable, consider dropping a ⭐ on the repo — it really helps the project grow.

Most people code. Very few understand what their code becomes. Be one of them.

#AssemblyLanguage #LowLevel #ComputerScience #CPU #OpenSource #LearnInPublic #TechEducation
Silicon to Assembly: Understanding CPU Internals
A conversation that happens more often than it should:

Customer: Can we just add a Python script to handle that part?

Embedded Engineer: *stares into the distance* We have 64KB of RAM. The OS is bare metal. There is no file system. Python itself is 30MB.

Customer: So... by when?

If you have lived this conversation, you know the specific silence that follows.

The good news is that Python genuinely belongs in embedded development — just not where everyone tries to put it. Test automation, host-side tooling, data analysis from captured logs — Python is excellent at all of that. Running on the device itself at 48MHz with no MMU? That is a different story.

Use the right tool for the right layer. Always.

Happy Friday to everyone debugging things that should not need debugging. 🙂

#EmbeddedSystems #Python #Firmware #EmbeddedHumour #PandianPosts #Embien
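Host-side tooling is where Python earns its keep in embedded work. A minimal sketch of the kind of log analysis the post mentions, assuming a hypothetical `[timestamp] LEVEL message` line format captured from a device's UART:

```python
import re
from collections import Counter

# Hypothetical log format: "[123.456] ERROR sensor timeout"
LOG_LINE = re.compile(r"\[(?P<ts>\d+\.\d+)\]\s+(?P<level>\w+)\s+(?P<msg>.*)")

def summarize(lines):
    """Count log levels in a captured firmware log, the kind of
    host-side analysis Python is well suited for."""
    levels = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            levels[m.group("level")] += 1
    return dict(levels)

captured = [
    "[0.001] INFO boot complete",
    "[1.250] ERROR sensor timeout",
    "[1.300] INFO retrying",
]
print(summarize(captured))  # {'INFO': 2, 'ERROR': 1}
```

The script runs on the developer's machine against captured output, so the 64KB device never has to carry an interpreter.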
Programming languages are not all the same! They are categorized into three main levels:

1. Machine Language – Binary code directly understood by the CPU
2. Assembly Language – Low-level, human-readable code using mnemonics
3. High-Level Language – Easy-to-read languages like Python, C++, Java, and JavaScript

High-level languages are widely used today because they save time, reduce errors, and are easy to learn.

#ProgrammingBasics #FutureSkills #Python #Cplusplus #Java #JavaScript #CodingEducation #LearnProgramming
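One way to peek a level below a high-level language is Python's standard `dis` module, which disassembles a function into interpreter bytecode. Bytecode is not machine language, but it illustrates the same idea of lowering readable source into individual instructions:

```python
import dis

def add(a, b):
    return a + b

# dis.dis prints the instructions CPython actually executes for this
# function (loads, an add operation, a return): one layer below the
# source you wrote, though still above real machine code.
dis.dis(add)
```

The exact opcodes vary between Python versions, but the load/operate/return shape of the listing is the same.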
Day 56 - DSA Practice

Solved the Isomorphic Strings problem, which checks if two strings can be transformed into each other by consistently mapping characters. Each character in the first string must map to exactly one character in the second string without conflicts.

A mapping structure is used to ensure that no two characters map to the same character incorrectly. The approach validates both forward mapping and uniqueness of mapped values. This problem strengthens understanding of hash-based mapping and character relationships in strings.

#LeetCode #DataStructures #Algorithms #Java #CodingPractice #ProblemSolving #TechSkills #Programming
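The two-map approach described above can be sketched in Python (the post is tagged Java, but the logic is identical): one dictionary validates the forward mapping, the other ensures no target character is claimed by two different source characters:

```python
def is_isomorphic(s: str, t: str) -> bool:
    """Return True if s can be transformed into t by a consistent,
    conflict-free character mapping."""
    if len(s) != len(t):
        return False
    fwd, used = {}, {}
    for a, b in zip(s, t):
        if fwd.setdefault(a, b) != b:    # a already maps to a different char
            return False
        if used.setdefault(b, a) != a:   # b already taken by another char
            return False
    return True

print(is_isomorphic("egg", "add"))   # True  (e->a, g->d)
print(is_isomorphic("foo", "bar"))   # False (o would need two targets)
```

Both checks run in one pass, giving O(n) time with O(k) extra space for the alphabet.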
In reality, threads don’t actually run in parallel in Python. Here’s what really happens:

- They just act like they are running in parallel.
- Thread A starts working like it owns the system. Thread B is ready to jump in, but Python says, “Wait for your turn.” So B waits. Then A pauses, B runs, then C gets a chance. The switching is so fast that it looks like everyone is working together.
- It’s basically one laptop shared by multiple users: everyone looks busy, but only one is typing at a time.
- This happens because of the Global Interpreter Lock (GIL), which allows only one thread to execute Python bytecode at a time.
- However, for I/O-bound tasks like API calls or file handling, threads still improve performance, because while one thread waits, another runs.

And here’s the key: in languages like Java, C++, and Go, threads can truly run in parallel across multiple CPU cores.

#python #threads #multithreading #java #c++ #coding
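A small experiment makes the I/O-bound point concrete. This sketch launches five threads that each simulate a 0.2-second wait; because the GIL is released during blocking calls like `time.sleep`, the waits overlap instead of running back to back:

```python
import threading
import time

def io_task():
    # Stand-in for an API call or file read; the GIL is released
    # while the thread blocks, so other threads can run.
    time.sleep(0.2)

start = time.perf_counter()
threads = [threading.Thread(target=io_task) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

# Five 0.2s waits overlap (~0.2s total) instead of summing to ~1.0s,
# which is why threads help for I/O-bound work despite the GIL.
print(f"elapsed ~{elapsed:.2f}s")
```

Swap the sleep for a pure CPU loop and the speedup disappears, since only one thread can hold the GIL while executing bytecode.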
Scoping is a fundamental concept in any programming language. Continuing the series, I have tried to cover some basics of scope. Read “JS scopes” by Utkarsh Sahay on Medium: https://lnkd.in/gC2Q8EVh
A couple of years ago, I wrote that "The Builder pattern is a finite state machine!". A state machine consists of states and transitions between them. As a developer, I want to make illegal states unrepresentable, i.e., users of my API can’t create non-existent transitions. My hypothesis is that only a static typing system allows this at compile-time. Dynamic typing systems rely on runtime validation. In this blog post, I will show that it holds true, with a caveat. If your model has many combinations, you also need generics and other niceties to avoid too much boilerplate code. My exploration will use #Python, #Java, #Kotlin, #Rust, and #Gleam. With that background in mind, let’s move on to the model itself. #builderpattern #finiteStateMachine #illegalState #programming #coding (link in the comments)
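The blog post itself is linked in the comments, but the core idea, one class per builder state so that illegal transitions have no method to call, can be sketched even in Python with a hypothetical two-step connection builder. A static checker such as mypy rejects `NeedsHost().port(...)` at check time, because that method simply does not exist on that state:

```python
from dataclasses import dataclass

@dataclass
class Connection:
    host: str
    port: int

class NeedsPort:
    """State: host is set; only setting a port is allowed next."""
    def __init__(self, host: str) -> None:
        self._host = host

    def port(self, port: int) -> Connection:
        return Connection(self._host, port)

class NeedsHost:
    """Initial state: the only legal transition is setting a host."""
    def host(self, host: str) -> NeedsPort:
        return NeedsPort(host)

# The only path the types permit: host first, then port.
conn = NeedsHost().host("db.local").port(5432)
print(conn)
```

This is the caveat the post mentions in miniature: with many optional fields, the number of state classes explodes, which is where generics and language niceties come in.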
**"C and C++ are dead" is the ultimate "cutting the branch you're sitting on" moment in programming.**

Python, JavaScript, Go, Rust… all the shiny high-level languages you love? They’re literally **sitting on top of C/C++**.

- CPython = written in C
- NumPy, pandas, PyTorch, TensorFlow, OpenCV = blazing-fast C/C++ cores
- Your browser, OS kernel, game engines, self-driving code… still C/C++

Scripting languages didn’t replace C/C++. They **grew on top of them** and made the foundation invisible. Saying we should ditch C/C++ is like a bird claiming the tree is useless while chilling in the nest built on its branches.

The low-level world still carries everything. And it’s not going anywhere.

#Cpp #C #Python #Programming
I just hit a massive milestone with the Chuks programming language compiler. Our AOT (Ahead-of-Time) compiler now builds a 50,000 file project into a native binary in 30 seconds. For context, this same build took 22 minutes just days ago. That's a 43x speedup.

How I got there:
- Parallelized module transpilation across all CPU cores
- Replaced 16M+ per-build regex compilations with pre-compiled patterns and fast string matching
- Used single-pass string replacement instead of sequential passes
- Kept deterministic output ordering despite concurrent execution

How this compares for native binary compilation at 50K files:
- Chuks: ~30s
- Rust: 10-30+ min
- C#/NativeAOT: 5-20 min
- Java/GraalVM: 10-30+ min
- C++: 10-60+ min

274/274 tests passing. Zero regressions. This will be added in the next release of Chuks.

- Follow ChuksLang on X for more updates: https://x.com/Chukslang
- Join Chuks community on Cleset: https://lnkd.in/exgQCvrK

#ProgrammingLanguages #CompilerDesign #Performance #Chuks #SoftwareEngineering #AOT #chukslang
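The Chuks compiler itself isn't shown here, but two of the listed techniques, pre-compiled regex patterns and order-preserving parallel mapping, can be illustrated with a hypothetical one-pattern transpile step:

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Compile the pattern once at module load instead of inside the
# per-module function: the fix for repeated per-build compilations.
LET_KEYWORD = re.compile(r"\blet\b")

def transpile(source: str) -> str:
    """Hypothetical transpile step: rewrite one keyword in a module."""
    return LET_KEYWORD.sub("var", source)

modules = [f"let x{i} = {i}" for i in range(8)]

# executor.map yields results in input order even though workers run
# concurrently, so the build output stays deterministic.
with ThreadPoolExecutor() as pool:
    out = list(pool.map(transpile, modules))

print(out[0])  # "var x0 = 0"
```

A real compiler would use process-level parallelism for CPU-bound transpilation, but the map-preserves-order pattern is the same.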
Some people downplay the benefits of agentic coding based on the fear that developers are losing their capacity to understand the code. IMO this is like saying people will lose their capacity to understand math if they start using a calculator. To get some perspective, it's important to think about the primitives we are working with. Instead of thinking in terms of syntax and control flow we are now thinking in terms of systems and results. This is a natural evolution since we've been abstracting away these details with high level programming languages and frameworks anyway. Perhaps it's true we've generally lost the nuances of how CPUs work when we transitioned from assembly to C/C++, or the nuances of memory allocation when we transitioned to Java and Python. We are now transitioning to a phase where the nuances of the programming language are *mostly* unimportant. And yet we are still building useful things. Isn't that the point?
A more apt analogy would be asking someone else to do the calculations for you. The output of large language models is not deterministic like programming languages or mathematics. If I create a source file with some lines of code, then I know exactly what will happen when it is compiled every single time, as the result is invariant. (Of course, holding constant the operating system and any number of parameters taken into account during compilation.) A calculator is analogous, as it takes in a finite number of deterministic calculations and yields the same result every time.

A large language model doesn't work this way. It tokenizes your input and passes that back and forth through however many layers and matrix multiplications to produce the output that you end up using. You cannot always reason about how the model came to those conclusions. Chain-of-thought can lie; mathematical interpretability is difficult and potentially post-hoc fallacious with SHAP and other models; memory caching and other conditions can cause outputs to be non-deterministic in many cases; and so on. In many instances, the end user does not have the expertise required to even understand what changes need to be made to iterate upon what the model produced.

Compilers and calculators take in a finite set of inputs formatted with a strictly typed language and produce a deterministic output that can be reliably traced back to your inputs. A large language model can take multi-modal inputs and will generally give a probabilistic output that is difficult to trace back to your inputs, and that might not even give you a sufficient answer for your prompt! It's a fundamentally different paradigm: you go from knowing what will happen next (mathematics, strictly typed programming languages) to rolling the dice (probabilistic model outputs).