TypeScript Generics: The 'All or Nothing' Rule!

One of the most confusing moments in TypeScript is when you try to be helpful to the compiler, and it suddenly stops being helpful to you.

We often run into a scenario where we want to manually specify one type for a generic function but let TypeScript figure out the rest automatically. Logic suggests: "I'll give you a hint for part A, you guess part B." TypeScript says: "Nope. If you tell me part A, I'm stopping the guesswork entirely."

This is the 'partial inference' problem. TypeScript's inference engine is currently all-or-nothing: you either explicitly supply every type argument, or you let it infer all of them. You can't mix and match.

This limitation is why you often see 'curried' or 'factory' function patterns in popular libraries. By splitting a function into two steps, developers bypass the restriction: lock in the explicit types in step one, and restore automatic inference in step two. It's a small structural shift that brings back full type safety where the compiler would otherwise fall back to unknown.

#TypeScript #SoftwareArchitecture #WebDev #Engineering #TypeSafety
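A minimal sketch of the two-step workaround described above (all names here are illustrative, not from any particular library):

```typescript
// Single generic function: specifying TOut kills inference of TIn.
function mapValue<TOut, TIn>(input: TIn, fn: (x: TIn) => TOut): TOut {
  return fn(input);
}
// mapValue<string>(42, n => n.toFixed(2));
//   ^ error: "Expected 2 type arguments, but got 1" — all or nothing.

// Curried factory: lock in TOut in step one, let TIn be inferred in step two.
function makeMapper<TOut>() {
  return function <TIn>(input: TIn, fn: (x: TIn) => TOut): TOut {
    return fn(input);
  };
}

const toStr = makeMapper<string>();          // step one: explicit TOut
const result = toStr(42, (n) => n.toFixed(2)); // step two: TIn inferred as number
```

Each call in step two gets full inference again, because the inner function's type parameters are resolved independently of the outer one.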
TypeScript Generics: Bypassing Partial Inference Limitations
More Relevant Posts
Rust is faster than TypeScript. Except when it isn't.

A team at OpenUI swapped their Rust WASM parser for a TypeScript rewrite and got 2.2x to 4.6x faster performance. The Rust code stayed the same. It just no longer ran.

Here's what actually happened: Rust was never the bottleneck. Every call crossed the WASM-JavaScript boundary, which meant copying strings between memory spaces, serializing to JSON, copying results back, and deserializing in V8. That overhead dwarfed any gains from compiled code. When they moved to TypeScript, V8's JIT compiler handled the computation fast enough that Rust's native speed advantage stopped mattering.

Then they fixed the real problem: an O(N²) algorithm hiding in the caching layer. Switching to O(N) caching gave the largest speedup by far, not the language change.

This is the pattern I see constantly in engineering decisions. Teams spend months optimizing the wrong layer. They pick the "faster" technology, skip the profiling, and end up slower than a naive implementation in the "slower" one.

Measure first. Profile before you choose your stack. The bottleneck is almost never where you think it is. The engineers who compound fastest aren't the ones who know the most languages; they're the ones who know where the actual cost is.

What's the most counterintuitive performance lesson you've learned building software?

#SoftwareEngineering #Performance #WebDevelopment #Rust #JavaScript #Engineering #TechLeadership

Join Agentic Engineering Club → t.me/villson_hub
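A hedged sketch of the kind of accidental O(N²) caching the post describes; the names are illustrative, not from the OpenUI codebase. The slow cache does a linear scan per lookup, so N lookups cost O(N²) overall; a Map makes each lookup O(1):

```typescript
type Entry = { key: string; value: number };

// O(N) per lookup: scans every stored entry, so N lookups are O(N^2).
function lookupSlow(cache: Entry[], key: string, compute: (k: string) => number): number {
  for (const e of cache) {
    if (e.key === key) return e.value; // linear scan on every call
  }
  const value = compute(key);
  cache.push({ key, value });
  return value;
}

// O(1) per lookup: hash-based access, O(N) total for N lookups.
function lookupFast(cache: Map<string, number>, key: string, compute: (k: string) => number): number {
  const hit = cache.get(key);
  if (hit !== undefined) return hit;
  const value = compute(key);
  cache.set(key, value);
  return value;
}

const fast = new Map<string, number>();
const len = (k: string) => k.length;
lookupFast(fast, "parser", len);               // miss: computes and stores 6
const again = lookupFast(fast, "parser", len); // hit: returns cached 6
```

Both functions return the same values; only the cost curve differs, which is exactly why this class of bug survives profiling-free rewrites.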
Mastering TypeScript 5.x: Advanced Patterns for Full-Stack Developers

TypeScript is a powerful tool that surfaces complex bugs right in your IDE, before they ever reach production. You can build resilient codebases that scale.

Generics let you create reusable components that work with many types while maintaining type safety:
- Use constraints to ensure type safety
- Create new types based on existing ones
- Enforce naming conventions or routing patterns

You can use mapped types to create API wrappers or state managers, template literal types to enforce naming conventions, and conditional types to extract types from complex structures.

Advanced TypeScript moves the burden of validation from runtime to the compiler. You can reduce unit tests and improve the developer experience.

What's your favorite TypeScript pattern?

Source: https://lnkd.in/g83W8M39
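A small sketch combining two of the patterns above (the `State` shape and handler names are illustrative): a mapped type derives a handler object from a state shape, and a template literal type enforces an `on...Change` naming convention:

```typescript
type State = { count: number; user: string };

// Mapped type + template literal key remapping:
// produces { onCountChange: (next: number) => void; onUserChange: (next: string) => void }
type ChangeHandlers<T> = {
  [K in keyof T & string as `on${Capitalize<K>}Change`]: (next: T[K]) => void;
};

let lastCount = 0;
let lastUser = "";

const handlers: ChangeHandlers<State> = {
  onCountChange: (next) => { lastCount = next; }, // next inferred as number
  onUserChange: (next) => { lastUser = next; },   // next inferred as string
};

handlers.onCountChange(42);
handlers.onUserChange("ada");
```

Misnaming a handler (say, `onCountChanged`) or passing the wrong value type fails at compile time, which is the runtime-to-compiler shift the post describes.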
TypeScript 6 changes how global types are loaded, and it may break existing projects.

The compiler no longer auto-includes everything from node_modules/@types. This means you need to explicitly list global/ambient type packages via "types" in compilerOptions in your tsconfig.app.json. For example:

"types": ["node", "jasmine"]

But don't just dump every @types/* package in there. Here's the key distinction!

Add to the array: packages that inject globals (e.g., @types/node for process, Buffer, fs; @types/jasmine for describe, it, etc.)

Skip: packages you explicitly import; those still resolve automatically. For example, @types/lodash resolves on its own when you import lodash.

Why the TS team made this change: the old "include everything" default was slow and pulled in unrelated type packages. The explicit list scopes what the compiler sees in node_modules/@types, reducing noise and improving compiler performance.

Quick mental model: if it provides globals, list it. If you import it, skip it.

#TypeScript #TypeScript6 #WebDevelopment #FrontendDevelopment #DeveloperExperience
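A minimal sketch of what that mental model looks like in a tsconfig, assuming a Node + Jasmine project (the package list is illustrative):

```jsonc
// tsconfig.app.json (sketch)
{
  "compilerOptions": {
    // Global-injecting packages must now be listed explicitly:
    "types": ["node", "jasmine"]
    // @types/lodash is deliberately NOT listed:
    // it resolves automatically via `import ... from "lodash"`.
  }
}
```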
Most TypeScript projects ship with a hidden landmine: strict mode off.

Without it, null and undefined sneak into every type. A variable you think is a string? It could be null. That button handler? Maybe undefined. The compiler won't tell you until production breaks.

Enabling strict mode is like packing a first-aid kit before a hike. You might not need it, but when you do, you will be grateful you prepared.

The two heavy hitters:
- noImplicitAny: no more mystery types
- strictNullChecks: null becomes its own problem to solve, not a silent failure

New projects should start strict. Existing ones can migrate incrementally. Either way, the quality jump is immediate.

Are you building with strict mode on, or are you still finding null bugs in prod?

#TypeScript #WebDevelopment #FrontEndDev #CodeQuality #SoftwareEngineering
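A minimal sketch of the class of bug strictNullChecks catches; the function and data are illustrative. `Map.get()` returns `string | undefined`, and with strictNullChecks on, handing that back as a plain `string` is a compile error until the undefined case is handled:

```typescript
function findLabel(labels: Map<string, string>, key: string): string {
  const label = labels.get(key); // type: string | undefined
  // return label;               // error under strictNullChecks:
  //                             // "string | undefined is not assignable to string"
  return label ?? "unknown";     // compiler-enforced fallback
}

const labels = new Map([["save", "Save changes"]]);
const hit = findLabel(labels, "save");    // found in the map
const miss = findLabel(labels, "delete"); // forced down the fallback path
```

With strict mode off, the commented-out version compiles fine and returns undefined at runtime, which is exactly the "null bug in prod" the post warns about.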
If you're still manually writing useMemo and useCallback everywhere, you might be optimizing for a problem that no longer exists.

The React Compiler hit v1.0 in late 2025. What that means in practice: memoization is now handled at build time, automatically. The compiler analyzes your component tree and decides what to cache. You don't have to.

Which means a lot of the useMemo(...) scattered across codebases today is either redundant, or worse, masking component design issues that should have been fixed instead.

I've been thinking about how much time I've spent on manual memoization over the years. Not just writing it, but debugging stale closures, wrong dependency arrays, and explaining to every new dev why wrapping a click handler in useCallback probably isn't helping.

If you're starting a new React project right now: test with the compiler first. Optimize manually only where the profiler tells you to.

The shift isn't just about tooling. It's about what "thinking in React" actually means now.

What's your team's current stance on the React Compiler in production?

#React #TypeScript #FrontendDevelopment #WebDev #JavaScript
React is introducing something big: React Compiler.

Its goal is simple: automatically optimize React rendering.

Today, developers often add manual optimizations like:
• useMemo
• useCallback
• React.memo

Example:
const memoizedValue = useMemo(() => compute(data), [data])

These help avoid unnecessary re-renders. But they also add extra complexity to the code.

React Compiler aims to handle this automatically. You write simple code:
const value = compute(data)

The compiler analyzes the component during build time and optimizes re-renders automatically.

No manual memoization needed. Less optimization code. Simpler React components.

Source: React documentation (React Compiler) in comments

#React #ReactCompiler #FrontendDevelopment 🚀
Excited for TypeScript 7, which is coming this summer, and it's 10x faster.

The compiler is being completely rewritten in Go (codenamed "Project Corsa"), and early benchmarks are wild:
• VS Code codebase: 77.8s → 7.5s (10.4x speedup)
• Playwright: 11.1s → 1.1s (10.1x faster)
• TypeORM: 17.5s → 1.3s (13.5x faster)

What this actually means:
✅ Editor startup drops from ~10 seconds to ~1 second
✅ Auto-imports, go-to-definition, rename: all instant
✅ CI builds that actually finish before your coffee gets cold
✅ Memory usage roughly cut in half

Why Go and not Rust? The TypeScript team chose Go because its patterns closely mirror existing TypeScript code, making the port line-for-line compatible. Plus, goroutines handle the heavy AST traversal and type-checking parallelism that JavaScript's single-threaded event loop can't.

After years of "TypeScript is slow" complaints in large monorepos, this changes everything. The language service was the last bottleneck; now it's about to become the fastest part of your stack.

#TypeScript #JavaScript #WebDevelopment #DeveloperExperience #Programming #TechNews #Microsoft #GoLang
🤯 TypeScript 7 will be written in Go. Yes, you read that right.

Microsoft announced that TypeScript 6.0 will likely be the last version built on the current JavaScript codebase. The next-generation TypeScript 7 compiler is being rewritten in Go to unlock:
⚡ Much faster compilation
⚡ Native multi-threading
⚡ Better scalability for large codebases
⚡ Deterministic type checking

This means the future TypeScript compiler will behave more like a native toolchain (similar to Rust / Go tooling) instead of a Node.js tool. Meanwhile, TypeScript 6.0 RC acts as the bridge between TS 5 and TS 7.

Some notable additions in TS 6:
• "RegExp.escape()" support
• New "Map.getOrInsert()" methods
• ES2025 target support
• Temporal API types
• "--stableTypeOrdering" flag to prepare for TS 7

Install the RC: npm install -D typescript@rc

The TypeScript ecosystem is evolving fast. If you're building with React, Next.js, Node, or full-stack TypeScript, this transition will be very interesting to watch. I'm currently experimenting with TypeScript + Next.js + tRPC tooling to understand these changes deeper.

What do you think about TypeScript moving toward a native compiler?

#typescript #javascript #webdev #nextjs #programming
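The post lists Map.getOrInsert() among the new additions; the method is still a TC39 proposal, so older runtimes won't have it. This standalone helper sketches the described semantics (insert a default on miss, then return the stored value) without relying on the built-in:

```typescript
// Illustrative helper mirroring the proposed Map.getOrInsert() behavior.
function getOrInsert<K, V>(map: Map<K, V>, key: K, defaultValue: V): V {
  if (!map.has(key)) {
    map.set(key, defaultValue); // miss: store the default first
  }
  return map.get(key)!;         // return existing or freshly inserted value
}

const counts = new Map<string, number>([["js", 10]]);
const existing = getOrInsert(counts, "js", 0); // existing value wins
const inserted = getOrInsert(counts, "ts", 0); // default gets inserted
```

This replaces the common `map.get(k) ?? (map.set(k, d), d)` dance with a single atomic-looking call, which is the ergonomic point of the proposal.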
TypeScript 7.0 is being rewritten in Go. Here's why that matters to you today.

TypeScript 6.0 RC dropped last week. It's the last JavaScript-based release. Ever. Starting with 7.0, the compiler is native Go. The early benchmarks show 10x faster type checking.

But 6.0 is the release you should care about right now. It introduces deprecations and behavioral changes designed to prepare your codebase for the Go-based compiler. Think of it as the migration guide disguised as a release.

What this means practically: your tsc today takes 45 seconds on a large project. With tsgo, that becomes 4-5 seconds. Incremental builds, project references, --build mode: all ported and working.

But the catch: TypeScript 7.0 already passes 19,926 out of 20,000 compiler test cases. Those 74 edge cases that don't match? If your codebase relies on one of them, 6.0 is your window to find out.

What to do now:
Install the preview: npm i @typescript/native-preview
Run tsgo alongside your current tsc. Compare the output. If something breaks, fix it while 6.0 is still current, not after 7.0 ships and the old compiler is gone.

The teams that test now avoid the scramble later.

Have you tried tsgo yet? How much faster is it on your codebase?

#TypeScript #SoftwareEngineering #WebDev #BackendDevelopment
Your React codebase is full of useMemo and useCallback you didn't need to write. 👇

Most devs wrap functions and values in memo hooks by default, thinking it makes things faster. It doesn't. Wrong dependency arrays cause stale bugs. Unnecessary memoization adds overhead. And nobody on your team can read it easily.

❌ Manual useMemo/useCallback
More lines, wrong dependency arrays, stale closures, harder to read, and you're still guessing if it actually helped performance

✅ React Compiler (v1.0, October 2025)
Analyzes your components at build time, inserts stable refs automatically, and skips re-renders without you touching a single dependency array

Write plain, readable code; the compiler inserts memoization where it's actually needed. Only keep manual useMemo/useCallback for third-party library interop or truly expensive calculations. That's maybe 10% of your current usage.

Simple code + smart compiler = cleaner codebase and faster UI. That's the move now.

⚡ When to still use useMemo / useCallback manually
→ External libs that compare by identity (e.g. maps, virtualization)
→ Truly expensive calculations with unstable inputs
→ Third-party subscriptions that need a stable function reference
→ Everything else? Let the compiler handle it.

#ReactJS #ReactCompiler #JavaScript #WebDevelopment #FrontendDevelopment #Programming #useMemo #useCallback #React19 #CleanCode #WebDev #FrontendDeveloper #SoftwareEngineering #100DaysOfCode #JavaScriptDeveloper #ReactDeveloper #CodeQuality #Performance #TechCommunity #SoftwareDevelopment