🚀 Not Just Websites Anymore: Building Real-Time 3D Systems in the Browser

I built a mini real-time 3D system inspired by Robot Bobby's tutorial (Bobby Roe), where objects interact with physics, respond to mouse input, and generate dynamic visual effects, all in the browser.

🧠 What's behind the code?
- A physics-driven system where objects move based on forces (not predefined animations)
- A kinematic body controlled by the mouse, enabling real interaction with the scene
- A render loop that syncs physics, input, and visuals in real time
- Procedural visual layers (sprites + postprocessing) to enhance depth and atmosphere

💡 Use cases:
- Interactive UI/UX: move beyond static interfaces
- 3D product visualization: e-commerce in real time
- Data visualization: spatial, physics-based insights
- Creative development & branding: memorable landing pages & portfolios
- Gamified applications: game mechanics in web platforms

⚠️ Big takeaway: understanding how to structure physics + rendering + interaction opens the door to next-gen web experiences.

Let's connect if you're into WebGL or creative development 🤝

#ThreeJS #WebGL #CreativeCoding #JavaScript #Frontend #3D #WebDevelopment
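The "render loop that syncs physics, input, and visuals" described above usually hinges on a fixed-timestep accumulator. The sketch below is an illustrative stand-in, not the tutorial's actual code: plain JavaScript with a toy gravity integrator in place of a real physics engine (Rapier, cannon-es, etc.), and `GRAVITY`, `STEP`, and the ground check are assumed values.

```javascript
// Classic fixed-timestep accumulator: physics advances in deterministic
// steps regardless of how fast the browser delivers frames.
const GRAVITY = -9.81;   // m/s^2, assumed
const STEP = 1 / 60;     // fixed physics step in seconds

function createWorld() {
  // Stand-in for a real engine's world: one body dropped from y = 10.
  return { bodies: [{ y: 10, vy: 0 }] };
}

function stepPhysics(world, dt) {
  for (const body of world.bodies) {
    body.vy += GRAVITY * dt; // integrate velocity (semi-implicit Euler)
    body.y += body.vy * dt;  // integrate position
    if (body.y < 0) { body.y = 0; body.vy = 0; } // crude ground plane
  }
}

// The render loop feeds elapsed wall time into an accumulator and drains
// it in fixed steps, so physics never depends on the display refresh rate.
function makeLoop(world) {
  let accumulator = 0;
  return function tick(elapsed) {
    accumulator += elapsed;
    while (accumulator >= STEP) {
      stepPhysics(world, STEP);
      accumulator -= STEP;
    }
    // renderer.render(scene, camera) would go here, after copying each
    // body's transform onto its mesh.
  };
}

const world = createWorld();
const tick = makeLoop(world);
tick(0.5); // simulate half a second of wall time
```

In a real Three.js scene the commented render call would run every `requestAnimationFrame`, while the kinematic mouse body would have its position set directly from pointer coordinates before each physics step.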
A Gesture-Controlled 3D Particle System. The goal was to build a highly performant, visually reactive experience using purely Web APIs and JavaScript, without any heavy backend processing.

🛠️ The Tech Stack:
• Three.js / WebGL: rendering 15,000+ particles with custom color mapping, additive blending, and smooth matrix transformations to ensure a buttery 60 FPS.
• Google MediaPipe (Hand Landmarker): inferring 21 3D landmarks per hand in real time, straight from the webcam stream.
• Vanilla JS & math: mapping 2D camera coordinates into a rich 3D space and calculating volumetric transformations on the fly.

The hardest (and most rewarding) part was tuning the gesture thresholds and smoothing the Three.js camera/group lerping to make the hand movements feel genuinely "haptic" and responsive. Closing your fist collapses the particles into a dense sphere, while opening two hands splats them into a massive rotating galaxy. 🌌

I'm constantly amazed by how powerful the modern web has become. Try it out yourself: https://lnkd.in/dRvv8wRE

#ThreeJS #JavaScript #MediaPipe #WebGL #WebDevelopment #Frontend #Engineering #ComputerVision
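For readers curious how a "gesture threshold" like the fist-close detection might work: one common heuristic over MediaPipe's 21-landmark layout (wrist at index 0, non-thumb fingertips at 8/12/16/20) is to compare the average fingertip-to-wrist distance against a tuned cutoff. The threshold value and lerp factor below are illustrative assumptions, not the project's tuned numbers.

```javascript
// Heuristic "fist closed?" check over MediaPipe-style hand landmarks.
// Indices follow the MediaPipe Hands convention; FIST_THRESHOLD is an
// assumed value you would tune per camera setup.
const FINGERTIPS = [8, 12, 16, 20]; // index, middle, ring, pinky tips
const FIST_THRESHOLD = 0.25;        // in normalized-coordinate units

function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, (a.z || 0) - (b.z || 0));
}

function isFistClosed(landmarks) {
  const wrist = landmarks[0];
  const avg = FINGERTIPS
    .map((i) => dist(landmarks[i], wrist))
    .reduce((sum, d) => sum + d, 0) / FINGERTIPS.length;
  return avg < FIST_THRESHOLD; // tips pulled in toward the wrist
}

// Frame-to-frame smoothing ("lerping") that makes tracked motion feel
// stable instead of jittery: move a fraction of the way each frame.
function lerp(current, target, alpha) {
  return current + (target - current) * alpha;
}
```

In the render loop, something like `group.rotation.y = lerp(group.rotation.y, handX * Math.PI, 0.1)` is what turns raw, noisy landmark positions into the smooth camera/group motion the post describes.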
🚀 Built an AI-Powered Air Drawing System using Hand Tracking ✋🎨

Just finished developing a real-time drawing app where you can draw in the air using your fingers: no mouse, no touch!

💡 How it works: using computer vision, the system tracks hand landmarks and converts finger movement into drawing strokes on a canvas.

🔧 Tech Stack:
• React + Next.js
• MediaPipe Hands
• HTML5 Canvas

✨ Key Features:
• ✍️ Draw using your index finger
• 🎨 Change brush colors (red, blue, green, black)
• ✌️ Gesture-based eraser (two fingers up)
• 🧽 Clear canvas
• 💾 Save drawing as image
• ⚡ Smooth real-time rendering

🧠 This project helped me explore:
• Real-time hand tracking
• Gesture recognition
• Canvas-based drawing systems
• Interactive UI with React

Excited to keep improving this with more advanced gesture controls and features 🚀

#React #NextJS #ComputerVision #MediaPipe #JavaScript #WebDevelopment #AI #OpenCV #FrontendDevelopment
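A plausible sketch of the gesture logic described above ("index finger draws, two fingers erase"), using the MediaPipe Hands landmark layout where image y grows downward, so a fingertip counts as "up" when its y is smaller than its PIP joint's. The exact rules and indices the author used may differ; this is an assumption for illustration.

```javascript
// Classify the drawing gesture from MediaPipe-style hand landmarks.
// Indices follow the MediaPipe Hands layout (8/6 = index tip/PIP,
// 12/10 = middle tip/PIP); the rules are illustrative.
const FINGERS = {
  index: { tip: 8, pip: 6 },
  middle: { tip: 12, pip: 10 },
};

function isUp(landmarks, finger) {
  // Image y grows downward, so "up" means a smaller y than the joint.
  return landmarks[finger.tip].y < landmarks[finger.pip].y;
}

function classifyGesture(landmarks) {
  const index = isUp(landmarks, FINGERS.index);
  const middle = isUp(landmarks, FINGERS.middle);
  if (index && middle) return "erase"; // two fingers up
  if (index) return "draw";            // index finger only
  return "idle";
}

// Map a normalized (0..1) landmark onto canvas pixels, mirroring x so
// strokes match what the user sees in a selfie-view webcam feed.
function toCanvas(point, width, height) {
  return { x: (1 - point.x) * width, y: point.y * height };
}
```

Each frame, a "draw" classification would append `toCanvas(landmarks[8], w, h)` to the current stroke and render it with `ctx.lineTo`, while "erase" would switch the context to `destination-out` compositing.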
🚀 Just Launched: FOX-Verse, Pushing the Limits of the Modern Web 🦊✨

For the past few weeks, I've been obsessed with one question: how far can we push the modern browser? So I stopped building standard components and started building an immersive digital experience: a portfolio that doesn't just display content; it reacts, responds, and feels alive.

🔗 Live Experience: https://lnkd.in/gF9zHgFW
💻 Source Code: https://lnkd.in/ghjAJWPS

⚙️ What's happening under the hood (60 FPS ⚡):
⚛️ Foundation: built with React 19 + Vite for speed and modularity.
🧊 3D Core: a living 3D system powered by Three.js + React Three Fiber that dynamically responds to cursor movement in a volumetric space.
⚡ Scroll Physics: integrated GSAP + Lenis to override native scrolling with velocity-based motion, kinetic transitions, and fluid navigation.
💧 WebGL Shaders: replaced basic CSS effects with custom GLSL shaders for liquid ripple distortions and dynamic transitions.
🧲 Custom Cursor System: rebuilt the pointer using GSAP spring physics, with magnetic snapping and blend-mode interactions.

🧠 What I learned: this wasn't just frontend development. It was a mix of UI/UX design, real-time systems thinking, physics-based interactions, and 3D rendering & performance optimization.

🐾 Hidden detail: there's a small Easter egg in the experience 👀 Try clicking the 3D dog 3 times quickly and watch what happens.

The web is shifting from static interfaces to interactive environments. FOX-Verse is my attempt to explore that direction. Would love your thoughts on the scroll physics and liquid shaders 👇

#WebDevelopment #ThreeJS #ReactJS #GSAP #CreativeCoding #Awwwards #FrontendDevelopment #WebGL #UIUX #CreativeTechnology #BuildInPublic
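One common recipe for the "magnetic snapping" cursor effect mentioned above (an illustrative sketch, not the FOX-Verse source): inside an attraction radius around an interactive element, pull the rendered cursor toward the element's center, with the pull strength fading out toward the edge of the radius. The `strength` default is an assumed tuning value.

```javascript
// Compute where the rendered cursor should sit given the real pointer
// position and a magnetic target. Outside the radius the cursor is
// untouched; inside, it is pulled toward the target center with a
// linear falloff (full strength at the center, zero at the edge).
function magneticOffset(cursor, target, radius, strength = 0.4) {
  const dx = target.x - cursor.x;
  const dy = target.y - cursor.y;
  const d = Math.hypot(dx, dy);
  if (d >= radius) return { x: cursor.x, y: cursor.y }; // out of range
  const pull = strength * (1 - d / radius);
  return { x: cursor.x + dx * pull, y: cursor.y + dy * pull };
}
```

In a GSAP setup, the returned point would typically be fed through `gsap.quickTo` or a spring tween rather than applied directly, which is what gives the snap its kinetic, physical feel.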
Pushing the Limits of the Browser: 20,000 Images, One Canvas 🚀

I've always been fascinated by how much we can demand from the frontend before it pushes back. In my latest project, I decided to find out. I built a high-performance visual experience using React and Canvas, where I'm rendering and animating between 10,000 and 20,000 images simultaneously.

The Technical Challenge: rendering that many assets in the DOM would instantly freeze any browser. By moving the heavy lifting to the Canvas API, I was able to manage the draw calls efficiently while using GSAP to orchestrate complex rotations and transitions.

Key Features:
🌀 High-Density Canvas Rotation: managing a massive array of image data points, allowing them to rotate and shift in perfect synchronization.
🏔️ Layered Parallax: integrated a multi-layered parallax effect to create a sense of depth and scale that feels truly immersive.
⏱️ GSAP Orchestration: used GSAP timelines to handle the motion logic for thousands of elements without dropping frames.
🧶 Interactive Details: added a custom vibrating string effect on the cursor and high-precision 2D element pinning to anchor the experience.

This project was a masterclass in performance optimization, specifically handling timeline picture motion and complex pinning logic. It's currently a desktop-optimized experience, as pushing this many assets requires the full power of a dedicated GPU.

📍 Check out the build here: gsap-canvas.vercel.app

#GSAP #GreenSock #ScrollTrigger #CreativeCoding #WebAnimation #MotionDesign #InteractiveDesign #UXDesign #FrontendDeveloper #UIAnimation #MicroInteractions #WebDevelopment #JavaScript #ReactJS #Frontend #Coding #Programming #WebDesign #UIUX #SoftwareEngineering #MERNStack #FullStack #ModernWeb #CSS3 #HTML5 #AdvancedCSS #JavaScriptFrameworks #WebPerformance #CleanCode #ResponsiveDesign #ThreeJS #Spline3D #BuildInPublic #SoftwareDeveloper #WebDev #PortfolioProject #FrontendEngineering #IndieDev #TechCommunity #Innovation #CanvasAPI #HTML5Canvas #BigDataVisualization #GenerativeArt #GraphicsProgramming #GPUAcceleration #Optimization #SmoothScroll #ParallaxEffect
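The key to animating tens of thousands of images on one canvas is keeping per-image state in flat typed arrays and updating it in a single tight loop, instead of touching 20,000 DOM nodes. This sketch shows the update half of that pattern with assumed counts and speeds; the draw half (one `drawImage` per entry) is indicated in a comment since it needs a browser canvas.

```javascript
// Flat typed-array storage for per-image transforms: cache-friendly,
// garbage-free, and updated in one loop per frame.
const COUNT = 20000;
const x = new Float32Array(COUNT);
const y = new Float32Array(COUNT);
const angle = new Float32Array(COUNT);
const speed = new Float32Array(COUNT);

for (let i = 0; i < COUNT; i++) {
  angle[i] = (i / COUNT) * Math.PI * 2; // spread images around a circle
  speed[i] = 0.5 + (i % 10) * 0.05;     // varied angular speeds (assumed)
}

function update(dt, radius = 300) {
  for (let i = 0; i < COUNT; i++) {
    angle[i] += speed[i] * dt;          // rotate every image in sync
    x[i] = Math.cos(angle[i]) * radius; // polar -> screen coordinates
    y[i] = Math.sin(angle[i]) * radius;
  }
}

// In the real render loop, each frame would then do roughly:
//   for (let i = 0; i < COUNT; i++) ctx.drawImage(img, x[i], y[i]);
update(1 / 60);
```

A GSAP timeline can then drive a single scalar (e.g. a global rotation offset or the `radius`) instead of tweening 20,000 individual targets, which is how this stays at 60 FPS.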
WebAssembly is no longer just a "cool browser feature": it's becoming a serious tool for building compute-heavy web apps that feel native.

Where it shines:
- Image/video processing in the browser
- CAD, 3D, and visualization tools
- Audio editing and real-time effects
- Scientific simulations and data analysis
- Running existing C/C++/Rust code on the web
- On-device ML inference with better performance

Why teams are adopting it:
- Near-native performance for CPU-intensive workloads
- Faster load times than rewriting everything as a desktop app
- Safer sandboxed execution in the browser
- Reuse of proven native libraries
- Better user experience without forcing installs

A few real-world patterns:
- Photo editors applying filters locally instead of round-tripping to a server
- Browser-based IDEs compiling code client-side
- Figma-style design tools handling complex rendering smoothly
- Financial and engineering apps running heavy calculations interactively
- Media platforms doing transcoding, waveform generation, or compression in-browser

Important caveat: WebAssembly is not a replacement for JavaScript. It works best when JavaScript handles the UI and app logic, while WebAssembly powers the hot paths.

The takeaway: if your web app is hitting performance limits because of compute-heavy tasks, WebAssembly is worth evaluating, especially when responsiveness, offline capability, or code reuse matters.

Curious: where do you see the biggest opportunity for WebAssembly in production web apps today?

#WebAssembly #WebDev #Performance #Frontend #SoftwareEngineering #BrowserTech #Rust #JavaScript #ProductEngineering #WebDevelopment #TypeScript
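The "JS handles the UI, Wasm powers the hot paths" split looks like this at its smallest. Below is a tiny hand-assembled WebAssembly module exporting `add(i32, i32) -> i32`, instantiated and called from JavaScript; real hot paths would of course be compiled from Rust/C/C++ via wasm-bindgen or Emscripten rather than written as raw bytes.

```javascript
// A minimal, complete Wasm binary: "\0asm" magic, then type, function,
// export, and code sections for a single exported i32 add function.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

// Synchronous instantiation is fine for tiny modules; in the browser you
// would normally use WebAssembly.instantiateStreaming(fetch(url)) so
// compilation overlaps the download.
const module = new WebAssembly.Module(bytes);
const { add } = new WebAssembly.Instance(module).exports;

console.log(add(2, 3)); // prints 5
```

The boundary is the design decision: JavaScript owns events, state, and the DOM, and crosses into Wasm only for the compute-heavy calls, so you pay the JS↔Wasm call overhead where the work amortizes it.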
UI is still the biggest bottleneck for agents. But I think we can solve this by adding just around 20 new classes to Tailwind CSS.

I got about 80% of the way there last year, but that final 20% was the hardest part. To make it work, I needed to iterate incredibly fast, and the AI models back then just couldn't keep up. But GPT-5.5 completely unblocked me. In just a week, I managed to spin up a custom CSS editor tailored specifically for this new framework.

Working with the agent to iterate on both the framework and the editor in the exact same project has been an unreal workflow. Do you remember what it felt like moving from Photoshop to Figma? Going from a general graphic tool to something built specifically for UI? Building a dedicated tool for your own framework feels like that next massive leap.

More updates coming soon.
🚀 Built something fun using web technologies!

I created Pinch Particle 3D, an interactive particle simulation where particles respond dynamically to user input in real time. While building this, I explored and took reference from Google Anti-Gravity to understand how interactive physics-like behavior can be implemented on the web.

🛠️ Tech Stack:
• HTML
• CSS
• JavaScript (Canvas & DOM)

💡 What I learned:
• Handling real-time user interactions
• Working with animations using Canvas
• Improving performance for smooth visuals

🎥 Check out the interaction in the video 👇
🔗 Live Demo: https://lnkd.in/dDKdSryq
💻 GitHub: https://lnkd.in/dx4yFS4e

Would love your feedback!

#JavaScript #WebDevelopment #Frontend #HTML #CSS #Coding #Projects
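At the heart of most pointer-driven particle toys is a per-particle update like the one below: accelerate toward the pointer, damp the velocity so motion settles instead of oscillating forever. This is an illustrative sketch, not the project's source; `stiffness` and `damping` are assumed tuning values.

```javascript
// One particle step: spring-like attraction toward the pointer plus
// velocity damping, integrated with semi-implicit Euler.
function stepParticle(p, pointer, dt, stiffness = 20, damping = 0.9) {
  p.vx = (p.vx + (pointer.x - p.x) * stiffness * dt) * damping;
  p.vy = (p.vy + (pointer.y - p.y) * stiffness * dt) * damping;
  p.x += p.vx * dt;
  p.y += p.vy * dt;
  return p;
}

// Run one particle toward a fixed pointer for a simulated second at 60fps.
const particle = { x: 0, y: 0, vx: 0, vy: 0 };
const pointer = { x: 100, y: 50 };
for (let i = 0; i < 60; i++) stepParticle(particle, pointer, 1 / 60);
```

In the full simulation this runs for every particle each `requestAnimationFrame`, with `pointer` updated from `pointermove` events; varying `stiffness` per particle is a cheap way to get organic, non-uniform motion.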
Bridging the Gap Between Modern Tech and Retro Aesthetics: PixelGlyph

I've recently found myself fascinated by ASCII art: the art of using text characters to represent complex imagery. There is something uniquely compelling about the constraints of a character-based grid and how it can still convey depth, texture, and emotion. What started as a personal curiosity turned into this weekend's engineering challenge: PixelGlyph, a high-performance, real-time ASCII generator built entirely for the browser.

Visit: https://lnkd.in/gdCYMEx4

The Motivation
My goal was to take the vintage charm of 1970s terminal graphics and merge it with modern web capabilities. I wanted to build a tool that felt less like a utility and more like an "illegal terminal" experience from a cyberpunk noir, focusing on low-latency processing and high-impact visual design.

Technical Implementation
• Real-time image processing: using the Canvas API, the engine calculates pixel luminance and maps grayscale values to character density strings in real time.
• WebRTC integration: live camera support, allowing users to transform their surroundings into a dynamic ASCII stream instantly.
• Custom rendering engine: beyond just display, a secondary rendering pipeline lets users export their creations as high-quality .png files or raw .txt data.
• UX/UI design: a responsive, dual-pane layout using Tailwind CSS, featuring custom scanline overlays and glitch-style animations to reinforce the retro-futurist aesthetic.

Reflections
Working on "just for fun" projects like this is a great reminder of why I love development. It's an opportunity to experiment with client-side performance and creative UI patterns that we don't always get to use in traditional enterprise applications. I'm currently exploring adding color-sampled ASCII support and perhaps video file conversion next.

How do you approach creative coding? I'd love to hear about your recent side projects in the comments!
#SoftwareEngineering #creativecoding #webdevelopment #asciiart #javascript #tailwindcss #sideproject #InnovationInAction
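The luminance-to-character-density mapping at the core of a tool like this can be sketched in a few lines. The Rec. 709 luminance weights are standard; the ramp string below is a common choice for sparse-to-dense glyphs, not necessarily the one PixelGlyph uses.

```javascript
// Core of a canvas-to-ASCII pipeline: per-pixel luminance mapped onto a
// ramp of characters ordered from visually sparse to dense.
const RAMP = " .:-=+*#%@";

function luminance(r, g, b) {
  // Rec. 709 weights over 0..255 channel values
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function toGlyph(r, g, b) {
  const t = luminance(r, g, b) / 255; // normalize to 0..1
  const i = Math.min(RAMP.length - 1, Math.floor(t * RAMP.length));
  return RAMP[i];
}

// Convert one row of RGBA pixel data (shaped like ctx.getImageData's
// .data array: 4 bytes per pixel) into a line of text.
function rowToAscii(rgba) {
  let out = "";
  for (let p = 0; p < rgba.length; p += 4) {
    out += toGlyph(rgba[p], rgba[p + 1], rgba[p + 2]);
  }
  return out;
}
```

For live video, the frame is first drawn to a small offscreen canvas (one pixel per output character cell) so `getImageData` stays cheap, then each row runs through `rowToAscii`.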
Building "Bloom": How to turn a static UI into a living, breathing ecosystem. 🌿 I recently wrapped up the frontend development for Bloom, an AI plant and floral design platform. The goal wasn't just to make it look good—it had to feel tactile, cinematic, and premium. Here is how we pushed it to the next level: ✨ Liquid Glassmorphism: Built a custom 2-tier glass effect using strict grayscale HSL values, SVG noise textures, and complex gradient masking. 🚀 The Botanical Boot Sequence: Replaced the standard loading bar with an immersive HUD that tracks latency/memory, ending in an iridescent SVG portal wipe that reveals the looping video background. 🧲 Fluid Physics: Built a custom, physics-based cursor that intelligently morphs and blends using Framer Motion. The tech stack: React, Tailwind CSS, and Framer Motion. Check out the Live website below to see the micro-interactions and staggered entry animations in action. Let me know what you think of the preloader! 👇 https://lnkd.in/dXq2Mj8Y #FrontendDevelopment #UIUX #WebDesign #React #FramerMotion #CreativeCoding #WebDevelopment Contra React Framer
"In my initial days of working with Three.js…" 😅

Mistakes I made while building real-world 3D systems (so you don't have to). After working on dashboards and real-time environments, I realized that most issues weren't visual; they were architectural 👇

❌ Treating Three.js like a UI library
👉 Result: messy structure, hard-to-scale code
👉 Instead: separate scene logic, rendering, and state management

❌ Uncontrolled re-renders with React
👉 Result: FPS drops, unnecessary updates
👉 Instead: use refs and isolate Three.js from the React render cycle

❌ No optimization strategy
👉 Result: performance bottlenecks at scale
👉 Instead: instancing, frustum culling, geometry merging

❌ Ignoring the camera as UX
👉 Result: confusing navigation
👉 Instead: design the camera as an interaction system (OrbitControls, transitions, focus states)

❌ Heavy real-time updates without control
👉 Result: lag in dashboards / live systems
👉 Instead: throttle updates and batch state changes

❌ Poor resource management
👉 Result: memory leaks in long sessions
👉 Instead: dispose of geometries, materials, and textures properly

❌ No abstraction layer
👉 Result: repeated logic, hard maintenance
👉 Instead: create reusable hooks / controllers for scene logic

💡 Biggest lesson: Three.js is not about rendering objects; it's about managing systems in real time. The challenge is no longer visuals: it's performance, architecture, and experience.

💬 What challenge did YOU face while scaling Three.js apps?

#threejs #webgl #frontenddevelopment #3dweb #javascript #realtime #softwarearchitecture #reactjs
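The "throttle updates + batch state changes" fix for laggy real-time dashboards can be sketched as a small batcher: incoming data is queued, and at most once per interval the whole batch is flushed to the scene in one pass. This is an illustrative pattern, not code from the post; the interval and the injected clock are assumptions for testability.

```javascript
// Queue incoming updates and flush them in batches, at most once per
// interval. `applyBatch` is where scene writes happen (e.g. setting
// matrices on an InstancedMesh in a single pass).
function createBatcher(applyBatch, intervalMs = 100, now = Date.now) {
  let queue = [];
  let lastFlush = 0;
  return {
    push(update) {
      queue.push(update); // cheap: no scene work on the hot data path
    },
    // Call once per frame; only flushes when the interval has elapsed.
    tick() {
      const t = now();
      if (queue.length === 0 || t - lastFlush < intervalMs) return false;
      applyBatch(queue);
      queue = [];
      lastFlush = t;
      return true;
    },
  };
}
```

Data sources (WebSockets, polling) call `push` as fast as they like; the render loop calls `tick` each frame, so the scene graph is touched at a bounded rate regardless of how bursty the feed is.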
Real-time 3D in the browser is such an underexplored space. What's the rendering approach here, raw Three.js or React Three Fiber?