TypeScript ships with 20+ utility types. Most developers use 3. Here are the ones I actually reach for in production:

─────────────────────────────
Partial<T>
When you want optional updates without a new interface.

type UpdateUser = Partial<User>
// All fields optional — perfect for PATCH requests

─────────────────────────────
Pick<T, K>
Expose only what the consumer needs.

type UserPreview = Pick<User, 'id' | 'name' | 'avatar'>
// No accidental exposure of sensitive fields

─────────────────────────────
ReturnType<T>
Infer the type from a function — not the other way around.

type ApiResponse = ReturnType<typeof fetchUser>
// Single source of truth: the function itself

─────────────────────────────
NonNullable<T>
Strip null/undefined before passing down.

type SafeId = NonNullable<User['id']>
// No optional chaining hell downstream

─────────────────────────────
Parameters<T>
Extract function arguments as a tuple.

type LogArgs = Parameters<typeof logger>
// Wrap functions without redefining their signatures

─────────────────────────────
Awaited<T>
Unwrap Promise types cleanly.

type UserData = Awaited<ReturnType<typeof fetchUser>>
// No more Promise<Promise<...>> nesting

─────────────────────────────
The underrated combo:

type SafePartialUpdate = Partial<Pick<User, 'name' | 'email'>>

Composing utility types is where TypeScript really pays off.

Which utility type do you reach for most?

#TypeScript #React #JavaScript #WebDevelopment #SoftwareArchitecture #DeveloperProductivity #NodeJS
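The Partial-based combo above can be sketched end to end. This is a minimal, runnable illustration — the User shape and patchUser helper are hypothetical, not from the post:

```typescript
// Hypothetical User shape for illustration
interface User {
  id: string;
  name: string;
  email: string;
  avatar?: string;
}

// Partial<T>: every field becomes optional — a PATCH-style update type
type UpdateUser = Partial<User>;

// Merge only the fields the caller actually provided
function patchUser(user: User, update: UpdateUser): User {
  return { ...user, ...update };
}

const base: User = { id: "u1", name: "Ada", email: "ada@example.com" };

// Only name changes; id and email are carried over untouched
const patched = patchUser(base, { name: "Ada Lovelace" });
```

The same pattern composes with Pick: `Partial<Pick<User, 'name' | 'email'>>` would reject an update touching any other field at compile time.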
Alex Rogov’s Post
🚀 TypeScript: any vs unknown — why one line of code can save you from bugs

While working with TypeScript, I realized something important:
👉 Code that works is not always safe.

🔹 The problem with any

let data: any = ["a", "b", "c"];
console.log(data[0].toUpperCase()); // ✅ works (but risky)

Yes — this works. But only because the data happens to be correct. Now imagine this:

let data: any = [1, 2, 3];
console.log(data[0].toUpperCase()); // 💥 runtime error

👉 TypeScript didn’t warn you.
👉 Your app crashes at runtime.

🔹 A safer approach with unknown

function isStringArray(arr: unknown): arr is string[] {
  return Array.isArray(arr) && arr.every(item => typeof item === "string");
}

Usage:

let data: unknown = ["a", "b", "c"];
if (isStringArray(data)) {
  console.log(data[0].toUpperCase()); // ✅ safe — narrowed to string[]
}

⚖️ Key idea
any → no safety ❌
unknown → forces validation ✅

🎯 Why this matters
In real-world projects:
• API data is unpredictable
• User input is unreliable
• Small mistakes can break production

Using unknown + type guards:
👉 prevents hidden bugs
👉 improves code reliability
👉 shows strong TypeScript fundamentals

📌 My takeaway
💡 “any works… until it doesn’t. unknown makes sure it always works safely.”

#TypeScript #WebDevelopment #SoftwareEngineering #CleanCode #MERN #LearningInPublic
Zod is the best thing to happen to TypeScript APIs since TypeScript itself.

I spent 3 years writing manual validation logic in Node.js APIs. Checking if req.body.email is a string. Checking if it's actually an email. Checking if req.body.age is a number and not negative. Writing the error message manually. Remembering to do this on every route.

Then I found Zod. I genuinely don't know how I shipped APIs without it.

WHAT ZOD DOES

Zod lets you define a schema once. That schema does three things:
1. Validates the data at runtime
2. Infers the TypeScript type automatically
3. Produces clean, structured error messages

// One schema. Three things at once.
import { z } from 'zod'

const CreateOrderSchema = z.object({
  userId: z.string().uuid(),
  items: z.array(z.object({
    productId: z.string().uuid(),
    quantity: z.number().int().min(1).max(100)
  })).min(1, 'Order must have at least one item'),
  deliveryDate: z.string().datetime().optional(),
  promoCode: z.string().toUpperCase().optional()
})

// TypeScript type — inferred automatically, no duplication
type CreateOrder = z.infer<typeof CreateOrderSchema>

USING IT IN AN EXPRESS / NESTJS API

const result = CreateOrderSchema.safeParse(req.body)

if (!result.success) {
  return res.status(422).json({ errors: result.error.flatten().fieldErrors })
  // Returns exactly which field failed and why
  // { items: ['Order must have at least one item'] }
}

// result.data is now fully typed — no casting, no assertions
const order = await orderService.create(result.data)

3 ZOD PATTERNS I USE ON EVERY PROJECT

1. Built-in transforms like .trim() and .toLowerCase() — sanitise on parse, not separately
z.string().trim().toLowerCase().email()

2. .refine() — custom logic the type system can't express
z.string().refine(s => isValidIBAN(s), 'Invalid IBAN')

3. Shared schemas between frontend and backend
One package, one source of truth, zero API contract drift

Zod replaced about 400 lines of manual validation in the last codebase I cleaned up. 400 lines that were inconsistent, untested, and spread across 30 files.

One Zod schema file. Consistent everywhere.

#TypeScript #NodeJS #WebDevelopment #BackendDevelopment #SoftwareEngineering
Most React devs handle API state like this: isLoading, isError, data — three separate fields that can contradict each other. Loading = true AND error = true at the same time? Shouldn't happen, but nothing prevents it.

Discriminated unions fix this cleanly:

type AsyncState<T> =
  | { status: 'idle' }
  | { status: 'loading' }
  | { status: 'success'; data: T }
  | { status: 'error'; error: Error }

Now TypeScript enforces that you handle each case. Inside the 'success' branch, state.data is always defined. No optional chaining, no null checks.

The real win: when you switch on state.status, TypeScript narrows automatically. No more wondering "is data null because it's loading, or because it failed?"

It's a small change that eliminates a whole class of bugs and makes component logic self-documenting.

What state patterns are you reaching for in your React projects right now?

#TypeScript #React #WebDevelopment
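A minimal sketch of the narrowing in action — the switch is exhaustive, and each branch can only see its own fields (the describe helper is illustrative):

```typescript
type AsyncState<T> =
  | { status: "idle" }
  | { status: "loading" }
  | { status: "success"; data: T }
  | { status: "error"; error: Error };

// Switching on the discriminant narrows the type in every branch
function describe<T>(state: AsyncState<T>): string {
  switch (state.status) {
    case "idle":
      return "Not started";
    case "loading":
      return "Loading...";
    case "success":
      // state.data only exists here — no null checks needed
      return `Loaded: ${JSON.stringify(state.data)}`;
    case "error":
      // state.error only exists here
      return `Failed: ${state.error.message}`;
  }
}

const ok = describe({ status: "success", data: [1, 2] });
const bad = describe({ status: "error", error: new Error("timeout") });
```

Accessing `state.data` outside the 'success' branch is a compile error, which is exactly the contradiction-proofing the boolean-flag version lacks.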
🚀 𝗗𝗮𝘆 𝟮: 𝗧𝘆𝗽𝗲𝗦𝗰𝗿𝗶𝗽𝘁 𝗣𝗼𝘄𝗲𝗿 𝗖𝗼𝗻𝗰𝗲𝗽𝘁𝘀 👉 Write Scalable Code

This is where TypeScript stops being “nice to have”… and becomes essential for real-world applications.

🔹 1. Generics (The Game Changer)
Generics let you write reusable code without losing type safety.

function wrap<T>(value: T): T {
  return value;
}

👉 The same function works for string, number, objects… anything.

🔹 2. Real-world Usage of Generics
Instead of writing separate types for every API:

interface ApiResponse<T> {
  data: T;
  success: boolean;
}

👉 One structure → works for all APIs
✔ Cleaner code
✔ Less duplication

🔹 3. Utility Types (Built-in Superpowers)
TypeScript gives you powerful helpers:
• Partial<User> → make fields optional
• Pick<User, "id"> → select specific fields
• Omit<User, "password"> → remove fields
• Readonly<User> → prevent modification

👉 These reduce repetitive code a lot.

🔹 4. Type Narrowing (Avoid Runtime Bugs)

function print(value: string | number) {
  if (typeof value === "string") {
    console.log(value.toUpperCase());
  }
}

👉 TypeScript understands your checks
✔ Ensures correct operations
✔ Prevents unexpected crashes

🔹 5. Immutability (Critical for Scaling)
❌ Don’t mutate objects:
user.name = "New";

✔ Create new state instead:
const updatedUser = { ...user, name: "New" };

👉 Required for:
• Predictable state
• Better debugging
• Frameworks like Angular/React

🔹 6. Async + Type Safety

async function getData(): Promise<User[]> {
  return [];
}

👉 Combine async logic with strong typing
✔ Safer API handling
✔ Clear return expectations

💡 Final Thought
Intermediate TypeScript isn’t about syntax… It’s about writing code that is:
✔ Clean
✔ Reusable
✔ Scalable

⏭️ Tomorrow: Real-world architecture & how TypeScript fits into large applications 🚀

#TypeScript #SoftwareEngineering #CleanArchitecture #Angular #Frontend
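The ApiResponse<T> wrapper from point 2 can be fleshed out into a self-contained sketch — the unwrap helper and sample data below are illustrative additions, not part of the original post:

```typescript
interface ApiResponse<T> {
  data: T;
  success: boolean;
}

// One wrapper shape works for any payload type
interface User {
  id: number;
  name: string;
}

const userRes: ApiResponse<User> = { data: { id: 1, name: "Ada" }, success: true };
const idsRes: ApiResponse<number[]> = { data: [1, 2, 3], success: true };

// Generic helper: the caller gets back T, not any
function unwrap<T>(res: ApiResponse<T>): T {
  if (!res.success) throw new Error("request failed");
  return res.data;
}

const user = unwrap(userRes); // typed as User
const ids = unwrap(idsRes);   // typed as number[]
```

One generic interface replaces a separate UserResponse, IdsResponse, and so on for every endpoint.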
I avoided TypeScript generics for a long time in my early days with TypeScript. I'd copy one from Stack Overflow, it worked, and I moved on without fully understanding why. That's a problem, because generics are not just advanced TypeScript; they're a core part of what makes TypeScript powerful.

A generic is just a type that takes another type as a parameter. That's it. Instead of writing a function that accepts a string and returns a string, you write one that accepts a T and returns a T, where T is whatever the caller passes in. The type is preserved and reused instead of being hardcoded.

function identity<T>(value: T): T {
  return value;
}

No magic. Just a placeholder that gets filled in at the call site.

Without generics, you often end up with two bad options: hardcode a type and lose flexibility, or use any and lose safety entirely. Generics give you both. A function that wraps an API response, a utility that transforms an array, a service that handles paginated results: all of these can be written once, typed correctly, and reused across every data shape in your codebase.

Constraints are where it gets useful. Raw generics accept anything. Constraints narrow that down.

function getProperty<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key];
}

Now K isn't just any string. It has to be an actual key of T. TypeScript will catch invalid keys at compile time. That's the kind of safety that pays for itself the first time it catches a typo before it reaches production.

Stop reading <T> as intimidating syntax. Read it as a question: "What type is the caller going to pass in?" The generic is just your way of saying "I don't know yet, but I'll be consistent about it".

Once that clicks, generics stop being something you avoid and start being something you reach for.
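A quick usage sketch of the constrained getProperty, redefined here so the snippet stands alone (the config object is made up for illustration):

```typescript
// K must be an actual key of T; the return type is looked up as T[K]
function getProperty<T, K extends keyof T>(obj: T, key: K): T[K] {
  return obj[key];
}

const config = { retries: 3, verbose: true };

const retries = getProperty(config, "retries"); // inferred as number
const verbose = getProperty(config, "verbose"); // inferred as boolean

// getProperty(config, "timeout");
// ^ compile-time error: '"timeout"' is not a key of typeof config
```

The commented-out line is the payoff: a typo'd key never survives to runtime.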
🚀 Day 38 – Node.js Core Modules Deep Dive (fs & http)

Today I explored the core building blocks of Node.js by working directly with the File System (fs) and HTTP (http) modules — without using any frameworks. This helped me understand how backend systems actually work behind the scenes.

📁 fs – File System Module
Worked with both asynchronous and synchronous operations.

🔹 Implemented:
• Read, write, append, and delete files
• Create and remove directories
• Sync vs async execution
• Callbacks vs promises (fs.promises)
• Error handling in file operations
• Streams (createReadStream) for large files

🔹 Key Insight: Streams process data in chunks, improving performance and memory efficiency.

Real-world use cases:
• Logging systems
• File upload/download
• Config management
• Data processing (CSV/JSON)

🌐 http – Server Creation from Scratch
Built a server using the native http module to understand the request-response lifecycle.

🔹 Explored:
• http.createServer()
• req & res objects
• Manual routing using req.url
• Handling GET & POST methods
• Sending JSON responses
• Setting headers & status codes
• Handling the request body using streams

🔹 Key Insight: Frameworks like Express are built on top of this.

⚡ Core Concepts Strengthened
✔ Non-blocking I/O → no waiting on file/network operations
✔ Event loop → efficient handling of concurrent requests
✔ Single-threaded architecture with async capabilities
✔ Streaming & buffering → performance optimization

Real-World Understanding
• How client requests are processed
• How Node.js handles multiple requests
• What happens behind APIs
• Better debugging of backend issues

Challenges Faced
• Managing async flow
• Handling request body streams
• Writing scalable routing without frameworks

🚀 Mini Implementation
✔ File handling using fs
✔ Basic HTTP server
✔ Routing (/home, /about)
✔ JSON response handling

Interview Takeaways
• Sync vs async in fs
• Streams in Node.js
• Event loop concept
• req & res usage

#NodeJS #BackendDevelopment #JavaScript #LearningJourney #WebDevelopment #TechGrowth 🚀
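The fs.promises workflow described above can be sketched in a few lines. This is a minimal illustration using only Node built-ins — the file name is arbitrary and the demo function is not from the post:

```typescript
import { promises as fs } from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Write, append, read back, and delete a file — the core fs.promises cycle
async function demo(): Promise<string> {
  const file = path.join(os.tmpdir(), "day38-fs-demo.txt");

  await fs.writeFile(file, "hello");      // create / overwrite
  await fs.appendFile(file, " world");    // append without rewriting
  const content = await fs.readFile(file, "utf8"); // read as a string

  await fs.unlink(file);                  // clean up the temp file
  return content;
}
```

For large files the same read would use fs.createReadStream instead of readFile, so data arrives in chunks rather than one buffer.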
𝗝𝗮𝘃𝗮𝗦𝗰𝗿𝗶𝗽𝘁 𝗶𝘀 𝘀𝗶𝗻𝗴𝗹𝗲-𝘁𝗵𝗿𝗲𝗮𝗱𝗲𝗱. But your async code doesn't run in the order you think.

Most developers get this wrong — including seniors.

What does this print?

console.log('1')
setTimeout(() => console.log('2'), 0)
Promise.resolve().then(() => console.log('3'))
console.log('4')

Take 10 seconds. Write your answer. Then keep reading.

━━━━━━━━━━━━━━━━━━━━━━━

𝗧𝗵𝗲 𝗮𝗻𝘀𝘄𝗲𝗿: 1 → 4 → 3 → 2

Most people predict 1 → 4 → 2 → 3. They're wrong. Here's exactly why.

━━━━━━━━━━━━━━━━━━━━━━━

𝗧𝗵𝗲 𝗲𝘃𝗲𝗻𝘁 𝗹𝗼𝗼𝗽 𝗵𝗮𝘀 𝘁𝘄𝗼 𝗾𝘂𝗲𝘂𝗲𝘀, 𝗻𝗼𝘁 𝗼𝗻𝗲.

𝗠𝗶𝗰𝗿𝗼𝘁𝗮𝘀𝗸 𝗾𝘂𝗲𝘂𝗲 → Promises, queueMicrotask(), MutationObserver
𝗠𝗮𝗰𝗿𝗼𝘁𝗮𝘀𝗸 𝗾𝘂𝗲𝘂𝗲 → setTimeout, setInterval, I/O, UI events

The rule nobody tells you: after every task, the engine drains the ENTIRE microtask queue before picking the next macrotask. Not one microtask. ALL of them.

━━━━━━━━━━━━━━━━━━━━━━━

𝗧𝗵𝗲 𝗲𝘅𝗮𝗰𝘁 𝗲𝘅𝗲𝗰𝘂𝘁𝗶𝗼𝗻 𝗼𝗿𝗱𝗲𝗿:

Step 1 — Run the call stack (synchronous code first)
→ prints '1', queues the setTimeout callback, queues the Promise callback, prints '4'

Step 2 — Call stack is empty. Check the microtask queue.
→ The Promise.then callback is there → prints '3' → microtask queue now empty.

Step 3 — Now pick the next macrotask.
→ The setTimeout callback runs → prints '2'

setTimeout(fn, 0) does NOT mean "run immediately." It means "run after all microtasks are done."

━━━━━━━━━━━━━━━━━━━━━━━

𝗧𝗵𝗲 𝗽𝗿𝗮𝗰𝘁𝗶𝗰𝗮𝗹 𝗶𝗺𝗽𝗮𝗰𝘁:

This is why React state updates inside Promises resolve before a setTimeout that was queued at the same time.

This is why async/await in Node.js doesn't block I/O — I/O callbacks are macrotasks, but .then() chains are microtasks that run between them.

And this is the trap: if you keep creating microtasks inside microtasks, you can starve the macrotask queue permanently. setTimeout never fires. The UI never updates.

━━━━━━━━━━━━━━━━━━━━━━━

𝗡𝗼𝘄 𝘁𝗵𝗲 𝗵𝗮𝗿𝗱𝗲𝗿 𝗼𝗻𝗲:

console.log('start')
setTimeout(() => console.log('timeout'), 0)
Promise.resolve()
  .then(() => {
    console.log('promise 1')
    return Promise.resolve()
  })
  .then(() => console.log('promise 2'))
console.log('end')

What's the output?

Drop your answer below — I'll reply with the explanation.

#JavaScript #FrontendDevelopment #ReactJS #NodeJS #SoftwareEngineering #ImmediateJoiner #OpenToWork #FrontendDeveloper #React #ReactDeveloper
⚠️ STOP using any in TypeScript. It’s not flexibility… it’s silent technical debt.

You’ve written this 👇

let data: any;

Looks harmless. Feels fast. But here’s what you actually did:

🧨 You lose type safety — TypeScript stops protecting you.
🧩 You lose IntelliSense — your IDE can’t autocomplete or infer.
🕳️ You lose refactor confidence — renames and changes won’t propagate.
🐛 You gain runtime bugs — errors sneak past compile-time checks.

Using any is like:
🚗 Driving without a seatbelt
🧪 Skipping tests “just for now”
🧱 Removing guardrails from your code

It works… until it doesn’t.

🧠 Why We Still Use It
Be honest 👇
• “I’ll fix types later”
• “The API response is messy”
• “This is just a quick hack”

👉 That “temporary” any? It never gets removed.

✅ What Senior Devs Do Instead
They don’t avoid flexibility. They use safe flexibility 👇
✔️ unknown → forces validation
✔️ generics <T> → reusable + type-safe
✔️ interface / type → clear data contracts
✔️ Partial<T> → safe optional updates
✔️ Record<K, V> → structured dynamic objects

⚡ Angular Reality Check

❌ Bad:
getTickets(): any {
  return this.http.get('/api/tickets');
}

✅ Good:
getTickets(): Observable<Ticket[]> {
  return this.http.get<Ticket[]>('/api/tickets');
}

Now your:
✨ IDE helps you
✨ Compiler protects you
✨ Bugs get caught early

🧩 Golden Rule
👉 If you use any, you’re telling TypeScript: “Don’t help me. I’ll debug in production.”

Where have you seen any create real problems? (Or are you still using it 👀)
👇 Drop your experience
🔖 Save this before your next refactor

#Angular #TypeScript #Frontend #CleanCode #WebDevelopment #SoftwareEngineering #JavaScript #DevCommunity
Many developers struggle with maintaining type safety across a full-stack TypeScript application using tRPC. Here's how you can master it.

1. Use tRPC to connect your client and server without REST or GraphQL. This cuts your boilerplate code dramatically and keeps types in sync.

2. Build your API procedures in a way that leverages TypeScript's powerful type inference. Less manual type annotation means fewer errors.

3. Avoid the common pitfall of skipping input validation. Even with TypeScript, validate inputs to catch runtime errors early.

4. Try using vibe coding to rapidly prototype your tRPC endpoints. This method keeps you in the flow and speeds up development.

5. Experiment with advanced TypeScript features like mapped types and conditional types for even more robust type safety.

6. Integrate AI-assisted development into your workflow to automate repetitive tasks. I've found this significantly increases my productivity.

7. Keep data transfer lean by defining precise types for your API responses. This optimizes both performance and clarity.

How do you ensure type safety across your full-stack applications? Share your approach below!

```typescript
import { initTRPC } from '@trpc/server';

const t = initTRPC.create();

const appRouter = t.router({
  getUser: t.procedure.query(() => {
    return { id: 1, name: 'John Doe' };
  }),
});

type AppRouter = typeof appRouter;
```

#WebDevelopment #TypeScript #Frontend #JavaScript
𝐄𝐯𝐞𝐫 𝐟𝐨𝐮𝐧𝐝 𝐲𝐨𝐮𝐫𝐬𝐞𝐥𝐟 𝐦𝐚𝐧𝐮𝐚𝐥𝐥𝐲 𝐨𝐯𝐞𝐫𝐥𝐨𝐚𝐝𝐢𝐧𝐠 𝐟𝐮𝐧𝐜𝐭𝐢𝐨𝐧 𝐭𝐲𝐩𝐞𝐬 𝐟𝐨𝐫 𝐝𝐢𝐟𝐟𝐞𝐫𝐞𝐧𝐭 𝐫𝐞𝐭𝐮𝐫𝐧 𝐯𝐚𝐥𝐮𝐞𝐬?

TypeScript's conditional types, combined with generics, unlock an incredibly powerful way to ensure type safety when your function's return signature depends directly on its input parameters. It means less boilerplate and more robust type inference.

Consider a function that formats a message. You might want a simple string for some cases, but a detailed object (with a timestamp, for instance) for others. Instead of multiple function overloads, you can do this:

```typescript
type MessageOutput<T extends boolean> = T extends true
  ? { message: string; timestamp: number }
  : string;

function getFormattedMessage<IncludeDetails extends boolean>(
  msg: string,
  includeDetails: IncludeDetails
): MessageOutput<IncludeDetails> {
  if (includeDetails) {
    return { message: msg, timestamp: Date.now() } as MessageOutput<IncludeDetails>;
  }
  return msg as MessageOutput<IncludeDetails>;
}

// TypeScript now infers the correct type at each call site:
const simpleLog = getFormattedMessage('User login successful.', false);
// Type: string

const detailedLog = getFormattedMessage('User login successful.', true);
// Type: { message: string; timestamp: number }
```

This pattern keeps your type definitions DRY and your codebase robust, reducing errors from incorrect type assertions. It's a fundamental technique for building flexible APIs and libraries.

How are you leveraging advanced TypeScript features in your daily work? Drop a comment or DM me if you've got a cool use case!

#TypeScript #Generics #FrontendDevelopment #SoftwareEngineering #WebDev