⚡ Stop using .filter() and .map() together. Your users will thank you.

I've seen this pattern in 100+ React codebases this year:

const users = data
  .filter(user => user.active)
  .map(user => <UserCard {...user} />);

Looks clean. Feels right. But here's the problem: your code loops through that array TWICE, and the filter allocates an intermediate array that gets thrown away immediately.

━━━━━━━━━━━━━━━━━━━━━━━━━
The Issue:
❌ Two iterations over the same array
❌ Creates an intermediate array in memory
❌ Slows down large datasets
━━━━━━━━━━━━━━━━━━━━━━━━━
The Better Way:
✅ Use reduce() → single loop
✅ One callback, no intermediate allocation
✅ Better edge-case handling
━━━━━━━━━━━━━━━━━━━━━━━━━
Why This Matters:
• Performance: only loops once
• Memory: no throwaway intermediate array
• Scalability: handles 10K+ items smoothly

Master the basics before reaching for optimization libraries.
━━━━━━━━━━━━━━━━━━━━━━━━━
💬 Question for you:
Do you prefer readability (.filter().map()) or performance (reduce())? Is there a middle ground I'm missing?

Let's debate in the comments. 👇

#JavaScript #ReactJS #WebDevelopment #CodingTips #Performance #FrontendDevelopment #BestPractices #Coding #TechTips #DeveloperCommunity
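For concreteness, here is a sketch of the single-pass version the post advocates, side by side with the two-pass original. Plain data objects stand in for the JSX so it runs anywhere; the sample `data` array is invented for illustration.

```javascript
// Hypothetical sample data, invented for this sketch.
const data = [
  { id: 1, name: 'Ada', active: true },
  { id: 2, name: 'Bob', active: false },
  { id: 3, name: 'Cat', active: true },
];

// Two passes: filter allocates an intermediate array, then map allocates the result.
const twoPass = data
  .filter(user => user.active)
  .map(user => ({ id: user.id, label: user.name.toUpperCase() }));

// One pass: reduce walks the array once and only allocates the result.
const onePass = data.reduce((acc, user) => {
  if (user.active) acc.push({ id: user.id, label: user.name.toUpperCase() });
  return acc;
}, []);

console.log(twoPass); // same contents as onePass
```

Both produce identical results; the reduce version just never materializes the filtered intermediate array.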
Did you measure the difference in how long it takes to process 10k items? Is it worth the hit to readability?
Both are correct, but I prefer reduce for readability AND performance. .filter().map() isn't automatically more readable: instead of one accumulator and a current value, I now have to read the filter callback, work out what it does, and then do the same for the map callback. And I've seen real inconsistencies in naming, typing, and use. A case I actually ran into:

const authors: Author[] = []
persons
  .filter((p: Author) => p && p.name !== null)
  .map((item: Person, index) => {
    if (item && item.name) {
      authors.push(item as unknown as Author)
    }
  })

True story. These things emerge; don't ask why or how. Hence why reduce is better: you get one function that handles whatever you're trying to achieve.

Another thing people forget when they ask what the win is for 10 entries vs 10k: it accumulates. The more you use reduce, the better your code performs at scale, because you minimize compute time everywhere.

I haven't verified it yet, but here's a hypothesis: using reduce over a multidimensional array to mimic n different functions, with entry counts in the range x - y, filtering and mapping something, will be faster than the same criteria with .filter().map(). Maybe that could be a fun article 👀
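For comparison, here is a sketch of what that snippet might look like collapsed into a single reduce. The `persons` data and the Author shape are invented for illustration, and the TypeScript annotations are dropped so it runs as plain JS.

```javascript
// Hypothetical data: 'persons' and the author shape are illustrative only.
const persons = [
  { name: 'Margaret Atwood' },
  { name: null },
  null,
  { name: 'Ursula K. Le Guin' },
];

// One pass, one accumulator: filter, transform, and collect in a single
// callback, instead of a filter whose check is silently redone inside a
// side-effecting map.
const authors = persons.reduce((acc, p) => {
  if (p && p.name) acc.push({ name: p.name });
  return acc;
}, []);

console.log(authors.length); // 2
```

The null checks happen exactly once, and there is no cast laundering an item from one type to another.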
filter().map() is just more readable: you see it and immediately understand it. Also, I doubt you'd be rendering 100+ user cards at once, so it doesn't affect performance much.
filter + map is fine for readability. If perf matters, most of the time a plain for loop wins anyway.
flatMap
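For readers who haven't met it: Array.prototype.flatMap can fuse the filter and the map into one pass by returning an empty array for rejected items. A sketch with made-up data:

```javascript
const users = [
  { name: 'Ada', active: true },
  { name: 'Bob', active: false },
  { name: 'Cat', active: true },
];

// flatMap visits each item once: return [] to drop it, [value] to keep it.
const activeNames = users.flatMap(u => (u.active ? [u.name] : []));

console.log(activeNames); // ['Ada', 'Cat']
```

Caveat: flatMap still allocates a tiny per-item array for each kept element, so whether it beats filter+map or reduce depends on the engine; its win is a single pass with no side effects.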
There is no "pro" way. I don't understand relating everything to speed. If speed is that important to you, and not to the project, just write it in machine code.
Well, not to be that guy, but these code snippets don't even imply the same semantics. If you're really dealing with a huge amount of data and aggregation performance is a key concern, try using Streams instead. https://developer.mozilla.org/en-US/docs/Web/API/Streams_API/Concepts
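A minimal sketch of that idea with the web Streams API (global in modern browsers and Node 18+). The numeric data and the "keep evens, double them" transform are invented for illustration; the point is that items flow through one at a time, so no full intermediate array is ever materialized.

```javascript
// Filter + map as a single streaming transform.
async function doubleEvens(numbers) {
  const source = new ReadableStream({
    start(controller) {
      numbers.forEach(n => controller.enqueue(n));
      controller.close();
    },
  });

  const transform = new TransformStream({
    transform(n, controller) {
      if (n % 2 === 0) controller.enqueue(n * 2); // filter and map in one step
    },
  });

  const out = [];
  const reader = source.pipeThrough(transform).getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out.push(value);
  }
  return out;
}

doubleEvens([1, 2, 3, 4]).then(result => console.log(result)); // [4, 8]
```

For an in-memory array this is overkill; Streams pay off when the data arrives incrementally (network, file) and you want backpressure for free.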
Both are correct, I prefer filter + map for readability. I would use reduce only if the context justifies it in terms of performance or composition.
.filter().map() is absolutely fine for your 50-item list in the frontend and very clear to understand. Only reach for .reduce() when the performance really differs, which is at many thousands of items.
We use .filter().map() for readability, not performance. If we wanted to go faster, a simple for (let i = 0; ...; i++) loop is much faster than .reduce(), but also less readable.
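For completeness, here is a sketch of the plain loop several comments mention, over the same kind of made-up data as the post. Whether it actually beats reduce depends on the engine and the callback, but it is the only version with no callbacks and no intermediate arrays at all.

```javascript
// Hypothetical sample data, invented for this sketch.
const data = [
  { name: 'Ada', active: true },
  { name: 'Bob', active: false },
  { name: 'Cat', active: true },
];

// Single pass, no callback invocations, no intermediate arrays.
const activeNames = [];
for (let i = 0; i < data.length; i++) {
  const user = data[i];
  if (user.active) activeNames.push(user.name);
}

console.log(activeNames); // ['Ada', 'Cat']
```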