Becoming Quantum Computing Something* (Week 87, 2026-04-24)

I think this series reached its natural ending last week. Not because quantum stopped mattering to me. Quite the opposite. It became part of how I think.

I love the field, the people around it, and even some of the math. Especially when it is trying to kill my pride ;). But it is not the only thing.

For most of my life, I felt pressure to pick one thing. Product. Teaching. Coding. Community. People love putting other people into boxes, I guess :)

Now I feel almost the opposite. Thanks to AI, switching between roles and topics is manageable. I can think about a new product, work through code with Claude, and read things that would have scared me two years ago. It is not perfect, but it is more than "good enough". And honestly, there is so much joy in building stuff again.

So the next chapter is not about becoming only "quantum something". It is about finding out what one person can build end-to-end when powered by today's tools. Quantum stays. But the center of gravity is shifting toward product building and exploring the limits of an individual amplified by technology.

That sounds like a good next experiment, right? ;)

#QuantumComputing #ProductBuilding #FullStack #ContinuousLearning
The AI hardware space moves fast. And for a long time, I was drowning in it.

There are new announcements every week. Research papers piling up. Newsletters I subscribed to but never read. A saved-articles folder that became a graveyard. I felt behind even when I wasn't.

So at some point I just… stopped. And rebuilt the whole thing from scratch. Here is exactly what I do now.

1. Three sources. Nothing more.
-> IEEE Spectrum, SemiAnalysis, and Hacker News filtered for hardware. That's it. I don't chase every publication.
-> I pick three I trust, and I show up for them consistently.

2. One paper a week.
-> Not one a day. One a week. I pick something relevant from arXiv or IEEE, and I don't even read it fully.
-> Abstract, problem, conclusion. That's enough to stay sharp without losing hours I don't have.

3. I follow people, not platforms.
-> There are maybe 15 people whose thinking I actually trust in this space. When they post something, I read it.
-> When they share something, I check it.

4. I teach what I'm still learning.
-> When I have to explain something to a room of students, I have to truly understand it first.
-> If I can't make it simple, I don't know it well enough yet. Teaching is the most honest filter I've found.

That's the whole system. It took me way too long to figure out that doing less, better, was the answer.

#AIHardware #Semiconductors #ChipDesign #STEMEducation #StemAChip #Learning #Engineering
A recent Yahoo Finance article reports that computer science enrollment in the United States has fallen more than that of any other major over the past six years. AI has taken some of the mystique out of software development, but the reality is that most engineers have spent decades on tedious maintenance and wrestling with clunky frameworks rather than producing breakthroughs like those of Steve Wozniak or Linus Torvalds. The field still needs innovators of that stature, perhaps even more now, but such rare talents will always be few.
Ex Nihilo - Simulating how matter emerges from "nothing"

Recent results published in Nature Portfolio, inspired by experiments at CERN and Brookhaven National Laboratory, suggest something fascinating: under extreme conditions, particles can emerge directly from quantum vacuum fluctuations.

This weekend, I tried to understand that mechanism the only way I know how: I coded it.

Important disclaimer: This is not a physically accurate simulation. A real model would require quantum field theory, experimental data (e.g., from CERN), and heavy computation. This is a conceptual model, designed to capture the logic, not the precision.

How the model works (in simple terms)

I represent the vacuum as a dynamic energy field:
→ At each timestep, random fluctuations are added
→ A damping factor prevents energy from diverging

    quantum_field += noise
    quantum_field *= 0.9

1. Energy threshold → matter appears
Particles are only created when local energy exceeds a threshold:

    quantum_field > energy_threshold

This is a simplified way to encode:
no energy → no matter
enough energy → conversion becomes possible

2. Matter is always created in pairs
When a particle appears, its opposite appears with it: +1 (matter) and -1 (antimatter).
This enforces a fundamental constraint: total charge remains zero.
No "single" creation. Only balanced systems.

3. Local correlation (shared origin)
Pairs are generated in neighboring cells:

    neighbor_offset = random(-1, 1)

Meaning: both particles come from the same fluctuation; they are intrinsically linked at birth.

4. Lifetime depends on energy
The more energy a pair "borrows", the shorter it lives:

    lifespan ∝ 1 / energy

Inspired by the uncertainty principle:
high energy → fast annihilation
low energy → longer persistence

5. Motion + annihilation
Particles move across the grid (np.roll(...)), then eventually disappear: particles → 0, energy → returned to the field. A consolidated sketch of the whole loop follows below.

What this really is

Not a physics simulator. A translation layer between abstract theory and algorithmic logic. The same mindset I apply in business or law:
→ identify constraints
→ model them
→ observe emergent behavior

#Python #Simulation #QuantumPhysics #Tech #ConceptModel #kaggle
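For readers who want to run the idea end-to-end, here is a minimal NumPy sketch stitching the five rules above into one loop. The grid size, threshold, and lifespan constant are illustrative choices of mine, not values from the original code.

    import numpy as np

    rng = np.random.default_rng(42)

    GRID = 64                # 1-D lattice of vacuum cells
    STEPS = 500
    THRESHOLD = 2.5          # local energy needed to create a pair
    DAMPING = 0.9            # keeps the field from diverging

    field = np.zeros(GRID)       # vacuum energy field
    particles = np.zeros(GRID)   # +1 matter, -1 antimatter, 0 empty
    lifetimes = np.zeros(GRID, dtype=int)

    for step in range(STEPS):
        # Fluctuations plus damping
        field += rng.normal(0.0, 1.0, GRID)
        field *= DAMPING

        # Rules 1-4: pair creation above the threshold, in neighboring
        # cells, with lifespan inversely proportional to borrowed energy
        for i in np.flatnonzero(field > THRESHOLD):
            j = (i + rng.choice([-1, 1])) % GRID
            if particles[i] == 0 and particles[j] == 0:
                particles[i], particles[j] = 1.0, -1.0
                lifetimes[i] = lifetimes[j] = max(1, int(10.0 / field[i]))
                field[i] -= THRESHOLD      # energy is "borrowed"

        # Rule 5: motion (everything drifts one cell)...
        particles = np.roll(particles, 1)
        lifetimes = np.roll(lifetimes, 1)

        # ...and annihilation: expired pairs return energy to the field
        lifetimes = np.maximum(lifetimes - 1, 0)
        expired = (particles != 0) & (lifetimes == 0)
        field[expired] += 1.0
        particles[expired] = 0.0

    print("net charge:", particles.sum())  # ~0 by construction: pairs only

Because pair members share a lifetime and drift together, every annihilation removes a balanced +1/-1 pair, so total charge stays zero, exactly the constraint the post describes.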
-- Quantum ML Series 2 | Post 03 of 11 #QFD2 --

I spent this week trying to understand Quantum Neural Networks. Not from a physics textbook. From a developer's perspective. Here is what actually clicked for me.

A QNN is a composition of parameterized functions optimized iteratively. Sound familiar? It should. That is also what a classical neural network is. The difference is in the physics; the optimization philosophy is the same. In a classical NN you learn weights. In a quantum NN you learn rotation angles. Both are just numbers updated during training. That single connection made QNNs click for me as a developer.

The architecture has 4 components:
Data encoding -> classical data converted into quantum states
Variational layer -> parameterized quantum gates that actually learn
Measurement -> quantum state collapses to classical output
Classical postprocessing -> final prediction layer

Most real QML systems today are hybrid. The quantum component handles one specific step. Everything else is classical.

Training uses something called the parameter-shift rule instead of backpropagation. Run the circuit twice, with the parameter shifted forward and backward. Take the difference. That is your exact gradient. No approximation. PennyLane handles this by default (see the sketch below).

One honest note before I close. A reader left a comment on my last post that I am carrying into every article going forward. (These insightful comments from experts are learning gold 😊) Most of what gets described as quantum advantage is still theoretical or exploratory. NISQ hardware today is noisy, shallow and constrained. The gap between what quantum circuits can do in theory and what they can do on real hardware right now is real and significant. I will keep that line explicit in every post from here.

Full breakdown of QNN architecture, the parameter-shift rule and barren plateaus in the article: https://lnkd.in/g8wmFZdZ

Article link in first comment 👇

#QuantumComputing #QuantumML #MachineLearning #LearnInPublic #Developer #PennyLane #QML #Coding #FutureTech #QuantumMLSeries
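To make the four components concrete, here is a minimal PennyLane sketch of the hybrid loop: encoding, one variational layer, measurement, and a classical cost. The two-qubit circuit, data, and hyperparameters are my illustrative assumptions, not code from the article.

    import pennylane as qml
    from pennylane import numpy as np   # autograd-aware NumPy

    dev = qml.device("default.qubit", wires=2)

    @qml.qnode(dev, diff_method="parameter-shift")  # exact gradients
    def qnn(x, weights):
        # Data encoding: classical features -> rotation angles
        qml.RY(x[0], wires=0)
        qml.RY(x[1], wires=1)
        # Variational layer: the trainable parameters are also angles
        qml.RX(weights[0], wires=0)
        qml.RX(weights[1], wires=1)
        qml.CNOT(wires=[0, 1])
        # Measurement: collapse to a classical expectation value
        return qml.expval(qml.PauliZ(0))

    x = np.array([0.3, 0.7], requires_grad=False)
    weights = np.array([0.1, 0.2], requires_grad=True)
    target = 0.5

    opt = qml.GradientDescentOptimizer(stepsize=0.2)
    for step in range(50):
        # Classical postprocessing: squared error against a target
        weights = opt.step(lambda w: (qnn(x, w) - target) ** 2, weights)

    print("trained output:", qnn(x, weights))

With diff_method="parameter-shift", each gradient entry comes from two extra circuit evaluations (parameter shifted forward and backward), exactly the rule described above.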
🚀 "Is Quantum Machine Learning actually useful, or just hype?"

Over the past few years, this has been the most common question I've been asked. The honest answer is nuanced:
👉 Not every ML problem benefits from quantum speedups
👉 But there are regimes where quantum models offer fundamentally different representations, especially through quantum kernels and variational circuits (see the sketch below)

This is exactly what I focus on in my course: Quantum Machine Learning for ML Engineers

⚙️ What makes this course different
This is not a physics-first or theory-heavy program. It is designed for ML practitioners. All sessions are live + recorded, with full implementation notebooks.

🎯 Why does this matter?

For industry
• Quantum is emerging in finance, optimization, and materials
• Practical QML talent is still rare → strong signaling advantage

For research
• Access to high-dimensional Hilbert space representations
• New kernel constructions beyond classical limits
• Hybrid quantum-classical learning systems

For long-term positioning
• QML sits at the intersection of AI + physics + HPC
• Early movers benefit disproportionately

Prerequisite: solid ML background (SVMs, kernels, NNs, NumPy/sklearn). No prior quantum knowledge required.

⏳ Early Bird Discount (30% off, ends April 30)
📅 Cohort: May 27 – June 19, 2026
👉 Explore details & enroll: https://lnkd.in/gWUMHG4y

If you're still evaluating, feel free to DM; happy to help you assess whether this aligns with your goals.
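For readers wondering what a quantum kernel looks like in practice, here is a minimal PennyLane sketch: embed two data points, compute their state overlap as the kernel value, and hand the Gram matrix to a classical SVM. The feature map and toy data are my own assumptions, not course material.

    import numpy as np
    import pennylane as qml
    from sklearn.svm import SVC

    dev = qml.device("default.qubit", wires=2)

    @qml.qnode(dev)
    def kernel_circuit(x1, x2):
        # Embed x1, then apply the inverse embedding of x2;
        # the probability of |00> is the overlap k(x1, x2).
        qml.AngleEmbedding(x1, wires=[0, 1])
        qml.adjoint(qml.AngleEmbedding)(x2, wires=[0, 1])
        return qml.probs(wires=[0, 1])

    def kernel(x1, x2):
        return kernel_circuit(x1, x2)[0]   # probability of |00>

    X = np.array([[0.1, 0.2], [0.4, 0.9], [2.5, 2.8], [2.9, 2.2]])
    y = np.array([0, 0, 1, 1])

    # Precompute the Gram matrix, then train a classical SVM on it
    gram = np.array([[kernel(a, b) for b in X] for a in X])
    clf = SVC(kernel="precomputed").fit(gram, y)
    print(clf.predict(gram))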
The Backbone of Modern Technologies 🌟

In this fast-paced era of innovation, one essential discipline stands tall as the foundation of groundbreaking technologies: Mathematics!

🔑 Why Mathematics Matters 🔑
Mathematics isn't merely a school subject; it's the universal language that drives our world. From artificial intelligence to data analytics, math shapes the tools that enhance our everyday lives.

💡 Key Areas Where Math Fuels Innovation 💡

1. 🤖 Artificial Intelligence & Machine Learning 🤖
Complex algorithms depend on statistical models and mathematical logic to decipher data patterns, enabling machines to learn and adapt.

2. 🔐 Cryptography 🔐
Secure communications rely on mathematical theories, such as number theory, to protect sensitive information through encryption.

3. ⚙️ Physics & Engineering ⚙️
The principles of physics and engineering are deeply rooted in mathematical concepts, helping to design everything from bridges to spacecraft.

4. 📊 Data Science 📊
Big data analytics employs mathematical formulas to derive insights, guiding businesses in making informed decisions.

🌍 Conclusion 🌍
Mathematics is not just about numbers; it's about empowering the innovation and technology that shape our future.

Join the conversation! What role do you think mathematics plays in the technologies we rely on today? Comment below!

#️⃣ #Mathematics #Innovation #Technology #DataScience #ArtificialIntelligence #MachineLearning #Cryptography #Engineering #mathtutor
A Simple Introduction to Quantum Machine Learning (QML)

If you have worked with Machine Learning and heard about Quantum Computing, you might have come across this term: Quantum Machine Learning (QML) 🚀

Here is the simplest way to understand it:
--> Machine Learning = models that learn patterns from data
--> Quantum Computing = a new way of computing using qubits instead of bits

In classical computing, a bit is either 0 or 1. In quantum computing, a qubit can exist in a superposition of both at the same time. This opens up new possibilities for how we process and learn from data.

So when we combine both, we get QML, where quantum systems are used to build or enhance machine learning models 😬

Right now, it is still an emerging field, but people are already experimenting with:
• classification problems
• optimization tasks
• hybrid models using classical and quantum systems

You do not need a quantum computer to start; Python libraries let you simulate and build basic quantum models (see the sketch below). And honestly, learning in public sounds fun, so I will be sharing what I learn along the way. More on this soon 👽

If you want to collaborate on a project, or just have a fun learning session, let's connect!

#QuantumComputing #MachineLearning #QML #AI #LearningInPublic #TechExploration #hybridmodels
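As a taste of "simulate a qubit on your laptop", here is a minimal sketch using PennyLane's built-in simulator; the quantum coin-flip example is my own illustration, not from the post.

    import pennylane as qml

    dev = qml.device("default.qubit", wires=1, shots=1000)

    @qml.qnode(dev)
    def coin():
        qml.Hadamard(wires=0)             # superposition of 0 and 1
        return qml.sample(qml.PauliZ(0))  # measuring collapses it

    samples = coin()
    print("fraction measured as 0:", (samples == 1).mean())  # ~0.5

Each measurement forces the qubit to pick a side, so over many shots you see roughly half 0s and half 1s, superposition made visible in a dozen lines.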
The new 2026–2027 NOVA catalog is out.

Included: Career Studies Certificate in Computer Science: Algorithms and Artificial Intelligence

A 20-credit, stackable credential focused on:
• programming
• data structures
• algorithmic thinking
• applied AI

Not awareness. Capability. This is the direction.

Link in comments.

#AIinEducation #HigherEd #ComputerScience #WorkforceDevelopment #AI
Most people associate Computer Science with coding. But recently, I've been exploring something deeper: how computation itself is defined.

While reading Introduction to the Theory of Computation by Michael Sipser, one idea stood out:
👉 Not everything that can be asked can be computed.

This completely shifts how you think about technology. We often focus on:
building faster systems
scaling data pipelines
optimizing algorithms

But theoretical computer science asks a more fundamental question: "Is this problem even solvable by a machine?"

Concepts like automata, computability, and complexity aren't just academic; they shape everything from modern AI systems to distributed architectures. The classic example is the halting problem, sketched below.

For me, this has been a powerful reminder:
➡️ Before optimizing a solution, understand the limits of computation.
➡️ Before building systems, question whether the problem itself is tractable.

In a world driven by AI and large-scale systems, grounding yourself in first principles is not optional; it's a competitive advantage.

Curious: how many of us in industry actually revisit the fundamentals behind what we build?

#ComputerScience #TheoryOfComputation #AI #Learning #FirstPrinciples #Engineering
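To make "not everything can be computed" concrete, here is Turing's diagonalization argument sketched in Python. The halts stub is hypothetical by construction; the whole point of the proof is that no correct implementation of it can exist.

    def halts(program_source: str, arg: str) -> bool:
        """Hypothetical decider: would this program halt on this input?
        Turing proved no total, correct version can exist; this stub
        only marks the spot in the argument."""
        raise NotImplementedError("uncomputable")

    def paradox(source: str) -> None:
        # If halts existed, paradox could do the opposite of its verdict:
        if halts(source, source):
            while True:   # predicted to halt -> loop forever
                pass
        # predicted to loop -> halt immediately

    # Running paradox on its own source contradicts halts either way:
    # if it halts, halts said it loops; if it loops, halts said it halts.
    # Hence the question "does this program halt?" is undecidable.

The same diagonalization pattern underlies many of the limits Sipser covers, which is why it is worth internalizing before reaching for more compute.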