Base 36 in JavaScript: Compression and Decompression

Day 16 of me reading random and basic but important coding facts. Today I read about Base 36 in JS numbers.

The JS engine can natively handle Base 36, a numbering system that uses the digits 0-9 and the letters a-z, for a total of 36 symbols. This lets us pack a lot more value into fewer characters, because "z" in Base 36 equals 35 in decimal.

Here is how it does compression and decompression:

Compression: use the .toString() method with a radix.
const myNumber = 46656;
const compressed = myNumber.toString(36);
console.log(compressed); // Output: "1000" (it shrank 5 digits down to 4 characters)

Decompression: use parseInt with the second argument.
const id = "1000";
const original = parseInt(id, 36);
console.log(original); // Output: 46656

Now, how is it implemented internally? It's quite simple, by the way. The engine doesn't just look up values, it uses optimized math:
1. Divide the number by 36.
2. Take the remainder (0-35).
3. Map that remainder to a character.
4. Repeat until the number hits 0.

The engine doesn't use a slow dictionary object like { 10: 'a' }. It uses ASCII codes:
If the remainder is 0-9: add 48 (the code for '0').
If the remainder is 10-35: subtract 10 and add 97 (the code for 'a').

There are lots of real-world use cases of this feature, and the best ones I read were:

URL shortening: if you have a database ID like 1048576, converting it to Base 36 gives you "mh34". This is perfect for cleaner, shorter URLs (like YouTube video IDs).

Generating random IDs: a quick way to generate a random alphanumeric string:
Math.random().toString(36).substring(2);

But with every advantage there exist some disadvantages:
Not human friendly: "1ka" is harder for a human to type or remember than "2026".
Collision risk: if we use the Math.random() trick above, we can get duplicates.

Keep Learning!!!!! #JavaScript #Coding #WebDevelopment #SoftwareEngineering #FrontEndDev
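The divide-and-remainder steps above can be sketched as a hand-rolled encoder. A minimal illustration only (toBase36 is a made-up name; real engines do the same math in optimized native code):

```javascript
// Hand-rolled Base 36 encoder mirroring the divide/remainder steps above.
function toBase36(n) {
  if (n === 0) return "0";
  let result = "";
  while (n > 0) {
    const remainder = n % 36;
    // ASCII mapping: 0-9 -> codes 48-57 ('0'-'9'), 10-35 -> codes 97-122 ('a'-'z')
    const code = remainder < 10 ? 48 + remainder : 97 + (remainder - 10);
    result = String.fromCharCode(code) + result; // prepend: digits come out low-to-high
    n = Math.floor(n / 36);
  }
  return result;
}

console.log(toBase36(46656));                  // "1000"
console.log((46656).toString(36));             // "1000" (matches the built-in)
console.log(parseInt(toBase36(1048576), 36));  // 1048576 (round-trips cleanly)
```

Note the prepend inside the loop: the algorithm produces the least significant digit first, so each new character goes on the front of the string.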
More Relevant Posts
🚀 Is CSS Becoming a Programming Language? Not exactly, but it's getting a major "brain" upgrade!

The introduction of the if() function is a significant milestone because it brings CSS closer to the logic we usually see in languages like JavaScript or Python, without losing its declarative nature.

Traditionally, logic in CSS (like media queries) requires jumping between different blocks of code. It works, but it can feel fragmented and repetitive. That's about to change with the new CSS if() function.

💡 What's changing? The if() function allows you to write inline conditional logic. Instead of writing an entire media query block to change one value, you can handle it directly inside the property declaration.

#programming #css #coding #Csslanguage
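A sketch of what the inline syntax looks like, based on the current draft spec. Note that if() is still experimental (shipping only in recent Chromium builds at the time of writing), so the exact syntax may change; the selector and property values here are illustrative:

```css
.card {
  /* Inline media condition instead of a separate @media block */
  padding: if(media(width > 700px): 2rem; else: 1rem);

  /* React to a custom property set elsewhere (e.g. --theme: dark on :root) */
  color: if(style(--theme: dark): white; else: #222);
}
```

The win is locality: the condition lives next to the one value it controls, instead of in a distant media query block that repeats the selector.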
I built a web browser (kind of) from scratch. And yes, it runs on Python.

We stare at web browsers all day, but rarely stop to think about the sheer engineering magic happening behind that URL bar. How does a string of HTML turn into pixels? How does display: flex actually know where to put things? I decided to stop wondering and start building.

I've been deep in the trenches building a custom browser engine from the ground up in Python. No Chromium wrapper. No WebKit. Just raw code handling everything from the network layer to the final pixel paint.

But here's the twist: it doesn't use JavaScript. I wanted to explore a "what if" scenario: what if Python were the first-class language of the web? In this prototype, instead of <script>, you write <script type="text/python">. You manipulate the DOM, handle click events, and animate styles, all using native Python code running directly in the browser.

Some of the engineering wins:
✅ Custom Layout Engine: I had to write my own Flexbox and Grid implementation. (Respect to browser engineers; as always, for me math is hard.)
✅ Time Travel Debugging: Since the engine tracks every state change, I built a slider that lets you "rewind" your user session to see exactly how the UI looked seconds ago.
✅ Hot Reload: Because waiting for refreshes is not cool (sometimes it is).

It's not going to replace Chrome tomorrow. It's strictly a research prototype. But seeing my own engine parse HTML and execute Python to render a live, interactive page? That's a feeling hard to beat. It reminded me why I love engineering in the first place: taking the "magic" apart until it's just logic.

Repo: https://lnkd.in/gXzBgcNn

#Python #SoftwareEngineering #WebDevelopment #BrowserEngine #OpenSource #Coding
Revisiting DSA through JavaScript – Day 1/30

I've solved plenty of DSA problems before, but mostly in Python or Java, just to "get the answer". This time, I want it to actually improve how I write production JavaScript. So I'm spending the next 30 days brushing up on Data Structures & Algorithms entirely in JS, as a MERN developer. No more switching languages just to pass the test.

Here are 3 things that hit me today while revisiting Big O and the JS engine:

1. .shift() is way more expensive than I remembered. Quick test: pushing 50k items took ~16ms; shifting them all out one by one took ~184ms. Reason: every shift re-indexes the entire array. If you're frequently removing from the front, a proper queue (or a different structure) makes a huge difference.

2. O(n²) sneaks up fast. Nested loops feel fine with small datasets, but at ~10k items, performance drops hard. The fix that often saves the day: swap .find() inside a loop for a Set/Object with O(1) lookups. Slow becomes instant.

3. V8 loves predictable objects. Dynamically adding properties later breaks hidden class optimizations. Simple habit: initialize all expected properties upfront.

Big O always felt a bit theoretical before. Seeing it directly impact real JS performance? That's when it finally clicked, especially when you're building apps for actual users and traffic.

Day 2 (Arrays & two-pointer technique) coming tomorrow 🚀

#DSA #JavaScript #MERN #WebDev #Coding
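The .find()-in-a-loop vs Set swap from point 2 can be sketched like this (users and wanted are made-up sample data):

```javascript
const users = [{ id: 1 }, { id: 2 }, { id: 3 }];
const wanted = [2, 3, 4];

// O(n * m): .find() rescans the whole users array for every lookup
const slow = wanted.filter(id => users.find(u => u.id === id));

// O(n + m): build a Set once, then each .has() is O(1) on average
const ids = new Set(users.map(u => u.id));
const fast = wanted.filter(id => ids.has(id));

console.log(slow); // [2, 3]
console.log(fast); // [2, 3] (same result, very different scaling)
```

At three items the difference is invisible; at tens of thousands, the Set version is the one that stays fast.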
Coming from JavaScript, polymorphism always felt natural: the same function behaving differently depending on the object. It's one of the reasons modern web apps feel smooth and intuitive.

Exploring it in Python has reinforced why this concept matters. Polymorphism lets us design around behavior, not rigid structures. Same interface, different outcomes: cleaner code for developers and consistent experiences for users.

Think of a render() method handling buttons, modals, or cards. The UI stays predictable while the logic stays flexible.

It's one of those quiet tools that helps software evolve without breaking, and that's exactly what clients value.

Do you use polymorphism often?
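The render() idea can be sketched in JavaScript (Button, Modal, and draw are hypothetical names for illustration):

```javascript
// Same interface (render), different behavior per class: polymorphism.
class Button {
  render() { return "<button>Click me</button>"; }
}
class Modal {
  render() { return "<div class='modal'>Hello</div>"; }
}

// The caller never checks concrete types; it just trusts the interface.
function draw(components) {
  return components.map(c => c.render()).join("\n");
}

console.log(draw([new Button(), new Modal()]));
```

The payoff is that adding a Card later means writing one new class with a render() method; draw() never changes.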
Day 20 of me reading random and basic but important coding facts.

Today I read about Destructuring Assignment in JS. It sounds fancy, but it's basically just a cleaner way to unpack values from arrays, or properties from objects, into distinct variables. Instead of accessing data like arr[0] or obj.name, we can extract it directly during assignment.

Arrays: let [first, second] = ["Apple", "Orange"];
Objects: let { height, width } = { height: 100, width: 200 };

What I found amusing is the swap trick, just like Python. We can easily swap two variables in one line without a temporary variable:
[guest, admin] = [admin, guest];

Instead of strictly ordered arguments, we can pass an object and destructure it right in the function signature:
function createMenu({ title = "Untitled", width = 200, height = 100 }) { }
// we now have title, width, and height as variables

We can also dig through complex objects in a single statement:
let options = { size: { width: 100, height: 200 } };
let { size: { width } } = options; // Extracts just 100

It is perfect for handling JSON data where we only need a few specific fields from a large response object.

We just have to be careful: if we try to destructure into existing variables (without let or const), JS thinks the leading { starts a code block. We have to wrap the whole line in parentheses.
{ a, b } = obj; // Error
The correct way is:
({ a, b } = obj);

Keep Learning...... #JavaScript #Development #Factsss #Coding
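Putting the pieces above into one runnable snippet (the variable names mirror the post's examples):

```javascript
// Array destructuring + the one-line swap
let [guest, admin] = ["Alice", "Bob"];
[guest, admin] = [admin, guest];
console.log(guest, admin); // Bob Alice

// Object destructuring with defaults, right in the function signature
function createMenu({ title = "Untitled", width = 200, height = 100 } = {}) {
  return `${title}: ${width}x${height}`;
}
console.log(createMenu({ title: "Settings", width: 300 })); // "Settings: 300x100"

// Nested destructuring: pull one field out of a deep object
const options = { size: { width: 100, height: 200 } };
const { size: { width } } = options;
console.log(width); // 100

// Assigning to existing variables needs parentheses
let a, b;
({ a, b } = { a: 1, b: 2 });
console.log(a + b); // 3
```

The `= {}` default on createMenu's parameter is a small extra safety net so calling it with no argument at all doesn't throw.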
Day 19 of me reading random and basic but important coding facts.

Today I learnt about a very underrated topic: WeakMap and WeakSet.

We all use Map everywhere, but few know that Map has a problem: keys are strongly held. This means if we use an object as a key, like map.set(obj, "data"), that object now stays in memory. Even if we set obj = null everywhere else in the code, the object still exists as long as the Map exists. The garbage collector cannot touch it because the Map is still holding it. This leads to memory leaks.

The solution is simple: WeakMap & WeakSet. These structures hold weak references to their keys (objects only). If the key object becomes unreachable elsewhere in your code, the garbage collector can reclaim it; the entry disappears from the WeakMap and the memory gets freed.

We use WeakMap for:
1. Caching/Memoization: to store the result of a heavy calculation on an object, e.g., cache.set(obj, result). If obj is later dropped by the app, the cached result is automatically wiped from memory.
2. DOM Node Tracking: associating data with a DOM element. When the element is removed from the DOM, the data vanishes. It is a very important use case: Vue.js uses it to track reactivity without leaks, and Angular uses it to link metadata to components. It is the industry standard for associating data with objects you don't own.

But if WeakMap is so good, why do we still use Map? The answer: Map keys can be primitives as well as objects, but WeakMap keys can only be objects. Also, we can iterate over a Map, but we can't iterate over a WeakMap.

So, the rule of thumb is: use Map when you need a resilient data store that you need to count or iterate over. Use WeakMap when you are adding secondary data to objects whose lifecycle is managed by something else.

Keep Learning!!!!!!! #JavaScript #WebDevelopment #Coding #MemoryManagement #SoftwareEngineering #FrontendDev
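The caching use case from point 1, sketched as a tiny memoizer (heavyCalc and the multiply-by-1000 "work" are made up for illustration):

```javascript
// Cache results keyed by the input object. Because the cache is a WeakMap,
// an entry becomes collectible once its key object is unreachable.
const cache = new WeakMap();

function heavyCalc(obj) {
  if (cache.has(obj)) return cache.get(obj);     // O(1) hit, no recomputation
  const result = Object.keys(obj).length * 1000; // stand-in for expensive work
  cache.set(obj, result);
  return result;
}

const config = { retries: 3, timeout: 5000 };
console.log(heavyCalc(config)); // computed: 2000
console.log(heavyCalc(config)); // cached:   2000
// When `config` becomes unreachable, its cache entry can be collected too,
// with no manual cache.delete() bookkeeping required.
```

With a plain Map here, every object ever passed to heavyCalc would be pinned in memory by the cache forever; the WeakMap makes the cache's lifetime follow the keys' lifetimes.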
🚀 Leveling up my JavaScript logic: The Fisher-Yates Shuffle!

I spent some time today diving into how online examination systems handle fairness and randomization. To solve the problem of generating unique question orders for every student, I implemented the Fisher-Yates Shuffle Algorithm.

💡 Why this approach? While many might use array.sort(() => Math.random() - 0.5), that method is actually biased and doesn't provide a truly uniform distribution. The Fisher-Yates (or Knuth) Shuffle is the gold standard because:
Efficiency: it runs in O(n) time complexity.
True randomness: every permutation of the array is equally likely (given an unbiased random source).
In-place potential: it can be done without extra memory (though I used the spread operator here to keep the original data immutable! 🛡️).

🛠️ The Implementation: I wrapped the logic in a Higher-Order Function. This lets me lock in the original array and generate a fresh, shuffled version whenever getPaper() is called.

Key takeaway for the day: functional programming isn't just about cleaner code; it's about creating predictable, reusable tools for complex problems.

#JavaScript #WebDevelopment #CodingLife #Algorithms #ProblemSolving #FisherYates #LearningToCode #SoftwareEngineering

How do you handle randomization in your projects?
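Since the original snippet was shared as an image, here is a minimal sketch of the approach described. The names makeShuffler and getPaper are inferred from the post, not the author's actual code:

```javascript
// Higher-order function: locks in the original questions and returns a
// generator of fresh shuffled copies. The spread keeps the source untouched.
function makeShuffler(questions) {
  return function getPaper() {
    const deck = [...questions]; // copy, so `questions` stays immutable
    // Fisher-Yates: walk backwards, swapping each slot with a random
    // index from the unshuffled prefix (including itself)
    for (let i = deck.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [deck[i], deck[j]] = [deck[j], deck[i]];
    }
    return deck;
  };
}

const getPaper = makeShuffler(["Q1", "Q2", "Q3", "Q4"]);
console.log(getPaper()); // a fresh permutation on every call
```

The `(i + 1)` bound is the detail that makes the distribution uniform; drawing j from the full array length on every pass is a classic off-by-one that reintroduces bias.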
In the Scrapy community, there's a common trope: "Pro devs only use scrapy.Spider." While it's true that a standard Spider gives you control, CrawlSpider (which uses LinkExtractors) is a powerhouse for repetitive, wide-scale tasks.

𝗧𝗵𝗲 𝗣𝗿𝗼𝗽𝗲𝗿 𝗨𝘀𝗲 𝗖𝗮𝘀𝗲𝘀:

𝗨𝘀𝗲 𝗮 𝗦𝘁𝗮𝗻𝗱𝗮𝗿𝗱 𝗦𝗽𝗶𝗱𝗲𝗿 𝘄𝗵𝗲𝗻: you are scraping a specific list of 1,000 URLs, or the site navigation is highly unconventional and requires complex logic to find the "Next" button.

𝗨𝘀𝗲 𝗖𝗿𝗮𝘄𝗹𝗦𝗽𝗶𝗱𝗲𝗿 + 𝗟𝗶𝗻𝗸𝗘𝘅𝘁𝗿𝗮𝗰𝘁𝗼𝗿 𝘄𝗵𝗲𝗻: you need to "map" a whole site. Think e-commerce catalogs, news archives, or documentation sites.

𝗜𝗻𝘀𝗶𝗴𝗵𝘁: if the site has a predictable structure (like /category/item), a CrawlSpider lets you write 5 lines of rules instead of 50 lines of recursive functions. It's cleaner, easier to maintain, and far more readable for your team.

Below is a Python code snippet using Scrapy: a CrawlSpider class with a LinkExtractor rule that parses through approximately 150k pages, using Playwright to load JavaScript on pages where needed.
New to JavaScript? Discover the power of the Math object with simple, hands-on examples. Learn how to round numbers, pick random values, and more—click to start coding!
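A quick taste of the Math helpers the post refers to (randomInt is a common hand-rolled helper, not a built-in):

```javascript
// Rounding
console.log(Math.round(4.5)); // 5
console.log(Math.floor(4.9)); // 4
console.log(Math.ceil(4.1));  // 5

// Random integer in [min, max], inclusive on both ends
function randomInt(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}
console.log(randomInt(1, 6)); // an integer from 1 to 6, like a die roll

// Other everyday helpers
console.log(Math.max(3, 7, 2)); // 7
console.log(Math.abs(-8));      // 8
```

Note that Math.random() alone returns a float in [0, 1); the floor-and-scale pattern above is how you turn it into a usable integer range.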
Day 4 of #100DaysOfCode: Array Logic & The Magic of Truthy/Falsy in JS ⚡

Today was about mastering the fundamentals that make JavaScript unique. I split my time between solving algorithmic problems and digging deep into JS engine mechanics.

1️⃣ Data Structures & Algorithms: I tackled the "Concatenation of Array" problem (LeetCode 1929). The goal was to create an array of length 2n by concatenating the nums array with itself. Key takeaway: it's a great exercise in understanding array indexing and memory allocation. It's simple on the surface, but efficient array manipulation is core to everything.

2️⃣ Web Development Concepts: I dove into Truthy & Falsy values. Coming from other languages, JS's handling of conditionals is fascinating. I learned that false, 0, -0, 0n, "", null, undefined, and NaN are the only falsy values. Everything else, including empty arrays [] and empty objects {}, is truthy.

#JavaScript #DSA #WebDevelopment #MERNStack #CodingJourney #100DaysOfChallenge
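LeetCode 1929 from point 1 fits in a few lines, and the falsy list from point 2 is easy to spot-check in a REPL:

```javascript
// LeetCode 1929: build ans of length 2n where ans[i] === ans[i + n] === nums[i]
function getConcatenation(nums) {
  const n = nums.length;
  const ans = new Array(2 * n);
  for (let i = 0; i < n; i++) {
    ans[i] = nums[i];
    ans[i + n] = nums[i];
  }
  return ans;
}
console.log(getConcatenation([1, 2, 1])); // [1, 2, 1, 1, 2, 1]

// Falsy spot-check: Boolean() coerces exactly like an if-condition does
console.log([false, 0, "", null, undefined, NaN].some(Boolean)); // false: all falsy
console.log(Boolean([]) && Boolean({}));                         // true: both truthy
```

(`getConcatenation([1, 2, 1])` could also be written as `[...nums, ...nums]`; the explicit loop just makes the indexing visible.)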