Gabriele Ferreri’s Post

Optional chaining (?.) cost me 300ms per interaction.

Every ?. is a runtime null check: user?.profile?.settings?.theme = 3 null checks × 1000 renders = 3,000 operations.

I profiled our app (transpiled to ES6):
• 1,247 optional chains per render
• 5 renders per interaction
• 6,235 null checks total
• 300ms wasted on checks for data we KNOW exists

The problem: defensive coding everywhere.

❌ Bad:
const Profile = ({ user }) => <h1>{user?.name}</h1>;
If user can be null, this component shouldn't render.

✅ Better:
const Profile = ({ user }: { user: User }) => <h1>{user.name}</h1>;
// Caller guards:
{user && <Profile user={user} />}
Now TypeScript knows user is defined. No runtime checks needed.

When to use ?.:
• API data (genuinely uncertain)
• Optional props
• Early returns

When NOT to:
• Data you KNOW exists
• Deep chains (>2 levels)
• Inside loops

Example loop issue:
items.map(item => item?.data?.value) // 1000 items × 2 checks = 2,000 checks
Better:
items.filter(item => item?.data).map(item => item.data.value) // Filter once, access directly

My refactor:
• Removed ?. from guaranteed props
• Added type guards at boundaries
• Kept ?. only for uncertain data

Results:
• 300ms faster interactions
• 40% fewer runtime checks
• Clearer contracts

Important: native ?. in modern browsers is fast. The overhead comes from:
1. Transpiling to older JavaScript (ES6/ES5)
2. Overuse on known data
3. Deep chaining

Use TypeScript to encode what you KNOW. Use ?. only for what you DON'T.

#TypeScript #JavaScript #Performance #WebDevelopment
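The filter-then-map advice from the post can be sketched like this; the `Item` shape and sample data are hypothetical, used only to show the pattern:

```typescript
type Item = { data?: { value: number } };

// Hypothetical sample: one item lacks the nested `data` object.
const items: Item[] = [
  { data: { value: 1 } },
  {},
  { data: { value: 2 } },
];

// Naive version: two optional-chain checks per item, and the result
// contains `undefined` holes that every caller must handle later.
const naive = items.map(item => item?.data?.value);

// Filter-then-map: check once, then access directly. A type predicate
// lets TypeScript narrow the element type after the filter.
const values = items
  .filter((item): item is { data: { value: number } } => item.data !== undefined)
  .map(item => item.data.value);

console.log(naive);  // [1, undefined, 2]
console.log(values); // [1, 2]
```

Whether this is faster is engine-dependent; the clearer win is that downstream code no longer deals with `undefined`.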


Performance rage-bait. Setting aside the post's many other issues: in modern JS engines (V8), a simple null check typically takes nanoseconds. 3,000 × ~0.000001ms ≈ 0.003ms, so the claim is off by a factor of roughly 100,000x compared to raw execution time. When you transpile ?. to older JS, it becomes a nested ternary or a series of if statements. If the transpiler is inefficient and the engine can't optimize the path, that adds up, but still nowhere near 300ms for 6,000 checks. TypeScript is erased at runtime: whether you write user.name or user?.name, the underlying JavaScript still has to access that memory address. If one interaction costs you 300ms, you have much bigger problems than optional chaining.
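The nanosecond claim is easy to check with a quick sketch. The check count below is taken from the post; timings will vary by machine and engine, so treat this as a reproducible experiment rather than a fixed result:

```typescript
// Micro-benchmark sketch: how long do a few thousand optional-chain
// checks actually take on the current engine?
type User = { profile?: { settings?: { theme?: string } } };

const user: User = { profile: { settings: { theme: "dark" } } };

const CHECKS = 6_000; // roughly the count claimed in the post
let hits = 0;

const start = performance.now();
for (let i = 0; i < CHECKS; i++) {
  if (user?.profile?.settings?.theme !== undefined) hits++;
}
const elapsed = performance.now() - start;

console.log(`${CHECKS} optional-chain checks took ${elapsed.toFixed(3)}ms`);
// On a modern engine this is typically well under 1ms, nowhere near 300ms.
```

A rigorous benchmark would need warm-up runs and a harness that defeats dead-code elimination, but even this naive loop is orders of magnitude below the post's figure.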

It's not defensive coding. It's a FLAKY DATA MODEL. Also, there are far bigger problems here than performance: generally speaking, you want as few optional properties as possible. Moreover, 1,000 renders sounds like A LOT. How long did those 1,000 renders take? I'd be concerned about the 1,000 renders first.

I saw the viral chart claiming optional chaining adds 300ms to your interaction time. That sounded impossible, so I decided to audit the code myself.

I built a benchmark replicating the exact scenario with over 6,000 runtime checks. I tested both native modern JavaScript and older transpiled versions. Here is what I found:
• The Claim: 300.00ms latency.
• The Reality: 0.35ms latency.

The original claim was off by a factor of 850x. Modern V8 engines optimize these checks down to nanoseconds. Even when transpiled to ES5, it is just a simple, lightweight check.

If your application is logging 300ms delays, it is not because of syntax. It is likely due to re-rendering heavy lists or architectural bottlenecks. Write clean, safe code. Do not let performance myths scare you away from useful features.


Optional chaining is perfectly fine when used correctly. You can scale anything into an issue if you push the numbers high enough. Some people go mad when they see an optional; I see a precautionary programmer who doesn't want to watch the world burn. If the data should be there, fail hard instead; if it is genuinely optional, ?. is fine.
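The fail-hard idea can be sketched with a small helper. `required` is a hypothetical name, not a standard API; the point is to throw loudly on data that should always be present, and keep ?. (or a default) for data that is genuinely optional:

```typescript
// Fail-fast sketch: assert that required data exists instead of
// silently threading ?. through every access.
function required<T>(value: T | null | undefined, name: string): T {
  if (value === null || value === undefined) {
    throw new Error(`Expected ${name} to be present`);
  }
  return value;
}

const settings: { theme: string; avatar?: string } = { theme: "dark" };

// Should always exist: fail hard if it doesn't.
const theme = required(settings.theme, "settings.theme");

// Genuinely optional: a default (or ?.) is the right tool here.
const avatar = settings.avatar ?? "default.png";

console.log(theme, avatar);
```

Failing at the boundary surfaces broken data immediately instead of letting `undefined` propagate deep into rendering code.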

Worth being careful with the conclusion here. Optional chaining itself is just a null/undefined check. Even thousands of those per interaction are typically well under 1ms on modern CPUs, including transpiled ES5/ES6 output. It's very unlikely that 6k checks account for ~300ms by themselves.

What likely changed with the refactor is render behavior:
• fewer or shallower React re-renders
• simpler component boundaries
• better guarding higher in the tree
• more stable props during reconciliation

Profilers show where time is spent, not necessarily the root cause. Using TypeScript to encode guarantees is definitely a win, mainly for correctness and clearer contracts. But ?. itself is generally not a meaningful performance bottleneck. Optional chaining is best used where data is genuinely uncertain. If performance improved, it's probably due to reduced render work rather than the cost of the null checks themselves.

This is a great reminder that 'defensive coding' isn't free. As a Backend System Engineer, I see a direct parallel here with how we handle API contracts. If the backend guarantees a field via schema validation, the frontend shouldn't have to 'guess' with optional chaining at every level. Overusing ?. is often a sign of weak data contracts between services. By enforcing strict type guards at the boundaries (where data enters the app), we not only avoid transpilation overhead but also make the code much more predictable. Don't just hide nulls; handle them where they belong.
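A minimal sketch of the boundary type guard this comment describes, assuming a hypothetical `User` shape: validate once where data enters the app, then use plain property access inside. Note that ?. appears exactly where it belongs, on the still-unvalidated input:

```typescript
type User = { name: string; profile: { settings: { theme: string } } };

// User-defined type guard: checks the raw value at the boundary and
// tells TypeScript it is a `User` if the checks pass.
function isUser(value: unknown): value is User {
  const v = value as Partial<User> | null;
  return (
    typeof v === "object" &&
    v !== null &&
    typeof v.name === "string" &&
    typeof v.profile?.settings?.theme === "string" // ?. is right here: data is uncertain
  );
}

const raw: unknown = JSON.parse(
  '{"name":"Ada","profile":{"settings":{"theme":"dark"}}}'
);

if (isUser(raw)) {
  // Inside this branch, no optional chaining is needed.
  console.log(raw.profile.settings.theme); // "dark"
}
```

A schema library like Zod can replace the hand-written guard, but the principle is the same: one check at the boundary instead of a ?. on every access.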


I tried to understand the post, but I got a bit confused by some parts. In general, the issue is not the optional chaining or the null checks but the TypeScript type definitions or the type-validation library schemas (e.g. Zod).

> user?.profile?.settings?.theme

Here you are defining profile as optional and settings as optional too, while most of the time a user has a profile and each profile has mandatory settings (IMHO). We may still have optional properties inside the profile or settings objects, for example an avatar (where we fall back to a default image) or two-factor auth not activated in settings, etc.

> {user && <Profile user={user} />}

It would be better to return early when the user is null/undefined, before you reach the JSX.

> items.map(item => item?.data?.value) // 1000 items × 2 checks = 2000 checks

The issue here is not the null checks but rendering 1,000 items at once. Using a virtualization technique, where we render only what's in the viewport, is preferable (akin to "import on visibility" in JS).
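The early-return suggestion above can be sketched as a plain render function. Strings stand in for JSX here so no React setup is assumed; the structure is the same in a component:

```typescript
type User = { name: string };

// Guard once at the top; everything below can assume `user` exists,
// so no ?. is needed in the body.
function renderProfile(user: User | null | undefined): string | null {
  if (!user) return null; // early return before any "rendering"
  return `<h1>${user.name}</h1>`;
}

console.log(renderProfile({ name: "Ada" })); // "<h1>Ada</h1>"
console.log(renderProfile(null));            // null
```

In a real React component the early `return null` plays the same role as the caller-side `{user && <Profile user={user} />}` guard, but keeps the check in one place.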


how about this?

function tryGetProp(obj, fn) {
  try {
    return fn(obj);
  } catch (e) {
    if (e.name === 'TypeError') {
      console.log(`${e.name} - ${e.message}`);
      return null;
    }
    // re-throw the error or take any other action
    throw e;
  }
}

// Caller
const user = {
  profile: {
    username: 'velociraptor',
    settings: {
      theme: 'dark'
    }
  }
};

const user1 = {
  profile: {
    username: 'stegosauras'
  }
};

function UserProfile() {
  const myTheme = tryGetProp(user1, (u) => u.profile.settings.theme);
  if (myTheme === null) return null;
  return <User theme={myTheme} />;
}


