I've been studying Grokking Algorithms, and in the chapter about Big O notation I had some trouble understanding the different notations. By coincidence, Augusto Galego (a YouTuber I really like) released a video explaining Big O, and it helped me a lot. Here's what I learned:

Time complexity refers to how long an algorithm takes to run; space complexity refers to how much memory it uses.

O(1): constant complexity. The algorithm performs a fixed number of operations, no matter the size of the input. For example, accessing an element in an array by its index is O(1) because you know exactly where the element is, so the operation happens once.

O(n): the running time grows linearly with the input size n. If you have to go through an entire list to find an item, the number of operations depends on how many elements are in the list.

O(log n): the classic case is an algorithm like binary search. Even as the input grows, the running time increases much more slowly, in proportion to the logarithm of the input size, because the problem is cut in half at each step.

O(n log n): typical of efficient sorting algorithms like Merge Sort, Heap Sort, and Quick Sort. They divide the problem into smaller parts (log n levels) and process all n elements at each level.

O(n²): appears in algorithms with two nested loops, such as Bubble Sort. Each element is compared with every other element, so the number of operations grows quadratically with the input size. Because of that, these algorithms are recommended only for small lists.

I've just finished the first chapter of the book, and it's been a great start to understanding how algorithms really work.

#BigO #AlgorithmComplexity #Programming #SoftwareDevelopment #Tech #ComputerScience
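To make three of these classes concrete, here is a minimal Python sketch (the function names are my own, not from the book or the video) contrasting O(1) indexing, O(n) linear search, and O(log n) binary search on the same sorted list:

```python
from bisect import bisect_left

data = list(range(1, 1_000_001))  # a sorted list of one million integers

# O(1): indexing jumps straight to the element, no matter how long the list is.
def constant_access(items, i):
    return items[i]

# O(n): in the worst case we inspect every element before finding the target.
def linear_search(items, target):
    for idx, value in enumerate(items):
        if value == target:
            return idx
    return -1

# O(log n): binary search halves the remaining range on every comparison.
def binary_search(items, target):
    idx = bisect_left(items, target)
    return idx if idx < len(items) and items[idx] == target else -1

print(constant_access(data, 999_999))   # 1000000
print(linear_search(data, 1_000_000))   # 999999
print(binary_search(data, 1_000_000))   # 999999
```

For the worst-case target (the last element), the linear search makes a million comparisons while the binary search makes about twenty.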
Wesley Silva’s Post
When you reduce your algorithm's time complexity from O(n) to O(1)... you feel like a GOD. 🤯 You see that feeling in the meme? That's the face of a developer who just became a Time Lord in the matrix. 🕰️ But what does it all mean? This is a celebration of mastering Time Complexity, a cornerstone of algorithm design!

💥 The Complexity Showdown: O(n) vs. O(1)

The Problem Child: O(n) (Linear Time)
- The Analogy: Imagine you have a list of n friends' phone numbers, but they're unsorted. To find a specific friend, you might have to check every single number in the list.
- The Reality: If you double your friends list (from 10 to 20), the time it takes to search doubles. The execution time grows linearly with the input size (n).

The Conqueror: O(1) (Constant Time)
- The Analogy: Imagine a library book with a perfect index. No matter whether the book has 100 pages or 1,000 pages, finding the information you need is one swift, single action (looking up the index and flipping to the page).
- The Reality: The execution time is constant. It doesn't matter whether your input has 10 items or 10 million. It's the fastest possible outcome!

🌌 Why This Matters (The "Lord of Time" Factor)
This optimization is the difference between an application that grinds to a halt under heavy load and one that handles billions of users seamlessly. It's literally optimizing the future.

My quick take on Space Complexity (for completeness): think of space complexity as how much extra memory your algorithm needs to complete its task. O(n) might mean creating a whole new copy of the input, while O(1) means the algorithm performs its task in place, without requiring significant extra storage.

What's the best optimization you've ever achieved? Share your "I am the Lord of Time" moment below! 👇

#programming #softwaredevelopment #algorithms #timecomplexity #bigO #computerscience #coding
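The phone-book analogy maps neatly onto Python's own data structures. Here is a small sketch (the data and helper names are invented for the example) comparing an O(n) scan of a list of pairs with an O(1) average-case dict lookup:

```python
import timeit

# Build a phone book two ways: an unsorted list of pairs, and a dict (hash table).
names = [f"friend{i}" for i in range(100_000)]
as_list = [(name, f"555-{i:06d}") for i, name in enumerate(names)]
as_dict = dict(as_list)

# O(n): scan every entry until we hit the name we want.
def find_in_list(target):
    for name, number in as_list:
        if name == target:
            return number
    return None

# O(1) on average: a hash table jumps straight to the bucket for the key.
def find_in_dict(target):
    return as_dict.get(target)

# The worst-case target for the list is its very last entry.
print(timeit.timeit(lambda: find_in_list("friend99999"), number=100))
print(timeit.timeit(lambda: find_in_dict("friend99999"), number=100))
```

On typical hardware the dict lookup is several orders of magnitude faster for the worst-case key, and the gap widens as the phone book grows.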
🧩 Understanding Bubble Sort: The Classic Sorting Algorithm Made Simple!

Bubble Sort is one of the most fundamental sorting algorithms in computer science, perfect for understanding comparison-based logic and how iterative processes work.

It repeatedly compares adjacent elements and swaps them if they're in the wrong order. With each pass, the largest element "bubbles up" to its correct position, just like air bubbles rising to the top!

While it's not the most efficient algorithm (O(n²) time complexity), Bubble Sort remains a great learning tool for beginners to grasp the core concepts of loops, conditionals, and data manipulation.

At GSW Infotech, we believe in building a strong foundation in both core logic and modern technologies. Every great developer starts by mastering the basics, and that's where true innovation begins.

#Programming #Coding #Algorithms #BubbleSort #ComputerScience #DataStructures #SoftwareDevelopment #TechEducation #LearnToCode #WebDevelopment #GSWInfotech #DeveloperCommunity #TechLearning #Innovation #CodeLogic #CodingBasics #SortingAlgorithm #ProgrammersLife
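A minimal Python sketch of the algorithm described above (with a common early-exit optimization that the post doesn't mention; it doesn't change the O(n²) worst case):

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i+1 elements have "bubbled up" to the end,
        # so each pass can stop one position earlier.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps on a full pass: the list is already sorted
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```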
💻 Day 28 of My DSA Learning Journey

Today's problems were all about patterns and optimization: one highlighting the beauty of mathematics in programming, the other demonstrating algorithmic efficiency in arrays.

🔹 Problem 1: Pascal's Triangle
📘 LeetCode #118 (Pascal's Triangle)
🧩 Problem Brief: Given an integer numRows, generate the first numRows of Pascal's Triangle, where each number is the sum of the two numbers directly above it.
💡 My Approach: Instead of constructing it row by row with nested loops, I used the combination method, calculating each element directly from its position in the triangle. This approach connects combinatorics with programming and simplifies the logic significantly while maintaining efficiency.
🕒 Time Complexity: O(n²)
💾 Space Complexity: O(1) (excluding output storage)

🔹 Problem 2: Majority Element (more than n/3 times)
📘 LeetCode #229 (Majority Element II)
🧩 Problem Brief: Find all elements in an array that appear more than ⌊n/3⌋ times.
💡 My Approach: I implemented the Extended Boyer-Moore Voting Algorithm, which efficiently tracks up to two potential majority candidates and validates them in a second pass. This method is highly optimized in both time and space, a great example of clever algorithmic design.
🕒 Time Complexity: O(n)
💾 Space Complexity: O(1)

💭 Today's problems reminded me how mathematics and algorithms go hand in hand, from generating beautiful patterns to crafting optimized logic. Every concept builds a stronger foundation for problem-solving.

#100DaysOfCode #LeetCode #DSA #CodingChallenge #PascalTriangle #Combinatorics #BoyerMoore #Algorithms #ProblemSolving #Programming #LearnToCode
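The post doesn't show code, so here is one common Python formulation of the Extended Boyer-Moore idea it describes: at most two values can each appear more than n/3 times, so a first pass elects two candidates and a second pass verifies them.

```python
def majority_elements(nums):
    """Extended Boyer-Moore voting: values appearing more than len(nums)//3 times."""
    # Phase 1: maintain up to two candidates with vote counters.
    cand1, cand2, count1, count2 = None, None, 0, 0
    for x in nums:
        if x == cand1:
            count1 += 1
        elif x == cand2:
            count2 += 1
        elif count1 == 0:
            cand1, count1 = x, 1
        elif count2 == 0:
            cand2, count2 = x, 1
        else:
            count1 -= 1   # x "cancels" one vote from each candidate
            count2 -= 1
    # Phase 2: only candidates that survive a real count are true majorities.
    result = []
    for c in (cand1, cand2):
        if c is not None and nums.count(c) > len(nums) // 3:
            result.append(c)
    return result

print(majority_elements([3, 2, 3]))                 # [3]
print(majority_elements([1, 1, 1, 3, 3, 2, 2, 2]))  # [1, 2]
```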
🚀 Day 59 – 75 Days DSA Challenge 💻

Today, I learned about Topological Sorting and Kahn's Algorithm, two important concepts in Graph Theory 🔗

📘 Concepts Covered:

🔹 Topological Sort:
- A linear ordering of vertices such that for every directed edge (u → v), vertex u appears before v.
- Applicable only to Directed Acyclic Graphs (DAGs).
- Helps in task scheduling, dependency resolution, and compiler design.

🔹 Kahn's Algorithm:
- An efficient BFS-based approach to topological sorting.
- Uses the in-degree of vertices and a queue to process nodes in order.
- Simple yet powerful for handling large DAGs efficiently.

💡 Key Takeaway: Understanding Topological Sorting and Kahn's Algorithm gave me deeper insight into dependency management, graph traversal, and how ordering plays a key role in real-world systems like build pipelines and task schedulers.

#Day59 #75DaysDSAChallenge #DSA #GraphAlgorithms #TopologicalSort #KahnsAlgorithm #DataStructures #Programming #ProblemSolving #CodingJourney #TechLearning
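Kahn's Algorithm as described above can be sketched in Python roughly like this (the graph representation and function name are my own):

```python
from collections import deque

def kahn_topological_sort(num_vertices, edges):
    """Return a topological order of a DAG, or None if the graph has a cycle."""
    adj = [[] for _ in range(num_vertices)]
    in_degree = [0] * num_vertices
    for u, v in edges:          # edge u -> v means u must come before v
        adj[u].append(v)
        in_degree[v] += 1

    # Start with every vertex that has no prerequisites (in-degree 0).
    queue = deque(v for v in range(num_vertices) if in_degree[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            in_degree[v] -= 1   # conceptually remove the edge u -> v
            if in_degree[v] == 0:
                queue.append(v)

    # If some vertices were never processed, the graph contains a cycle.
    return order if len(order) == num_vertices else None

# 0 -> 1 -> 3 and 0 -> 2 -> 3
print(kahn_topological_sort(4, [(0, 1), (0, 2), (1, 3), (2, 3)]))  # [0, 1, 2, 3]
```

The cycle check at the end is exactly why Kahn's Algorithm only applies to DAGs: in a cyclic graph, some vertex never reaches in-degree 0.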
Understanding Recursion in Computer Science

Recursion is a fundamental programming technique where a function calls itself to solve smaller instances of the same problem. Instead of approaching a problem in a step-by-step iterative manner, recursion breaks it down into simpler subproblems until it reaches a base case.

A recursive function typically includes two key parts:
1. Base Case: the stopping condition that prevents infinite calls. It returns a simple, direct answer.
2. Recursive Case: the part where the function calls itself with a smaller or simpler version of the problem.

Example: calculating a factorial

    def factorial(n):
        if n == 0:
            return 1
        return n * factorial(n - 1)

Recursion is especially powerful in tasks like tree traversal, divide-and-conquer algorithms, backtracking, sorting algorithms such as quicksort and mergesort, and mathematical computations. Understanding recursion helps developers think more abstractly and write cleaner, modular solutions to complex problems.

#programming #computerscience #coding #softwareengineering #recursion #learning #developers #algorithms
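The same base-case/recursive-case pattern drives the tree traversal mentioned above. A small illustrative Python sketch (the `Node` class and tree are invented for the example):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def inorder(node):
    """Recursive in-order traversal: left subtree, then node, then right subtree."""
    if node is None:          # base case: an empty subtree contributes nothing
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)

#       2
#      / \
#     1   3
root = Node(2, Node(1), Node(3))
print(inorder(root))  # [1, 2, 3]
```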
Time Complexity vs Space Complexity

When writing efficient code, understanding time complexity and space complexity is essential.

Time complexity measures how the execution time of an algorithm grows as the size of the input increases. It helps us understand how fast or slow a program runs.
Examples:
- Linear search → O(n)
- Binary search → O(log n)

Space complexity measures how much memory an algorithm uses as the input size increases. It includes variables, data structures, and recursion stacks.
Examples:
- Storing an array → O(n)
- A few constant variables → O(1)

In short:
- Time complexity = how long it takes to run
- Space complexity = how much memory it consumes

Efficient algorithms balance both: fast execution with minimal memory use.

#Programming #DataStructures #Algorithms #Coding #SoftwareEngineering #Developer #BigO #TimeComplexity #SpaceComplexity #TechLearning
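One way to see the recursion-stack point in practice is to compute the same sum two ways; a small illustrative Python sketch (function names are my own):

```python
# O(n) extra space: each recursive call adds a frame to the call stack,
# so summing 1..n holds n frames in memory at the deepest point.
def sum_recursive(n):
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

# O(1) extra space: the running total is the only additional storage,
# no matter how large n gets.
def sum_iterative(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print(sum_recursive(100))  # 5050
print(sum_iterative(100))  # 5050
```

Both run in O(n) time and return the same answer; only their memory behavior differs, which is exactly the time-vs-space distinction.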
𝐈𝐧 𝐣𝐮𝐬𝐭 𝟮 𝐌𝐢𝐧𝐮𝐭𝐞𝐬, 𝐮𝐧𝐝𝐞𝐫𝐬𝐭𝐚𝐧𝐝 𝐌𝐞𝐦𝐨𝐢𝐳𝐚𝐭𝐢𝐨𝐧 — 𝐭𝐡𝐞 𝐡𝐢𝐝𝐝𝐞𝐧 𝐩𝐨𝐰𝐞𝐫 𝐛𝐞𝐡𝐢𝐧𝐝 𝐟𝐚𝐬𝐭 𝐚𝐥𝐠𝐨𝐫𝐢𝐭𝐡𝐦𝐬 ⚡

Ever wondered how dynamic programming runs so fast while doing the same calculations? Meet Memoization, your program's secret memory 🧩

Think of it like ordering food 🍝
🔹 𝗙𝗶𝗿𝘀𝘁 𝗢𝗿𝗱𝗲𝗿 → the chef cooks from scratch (expensive computation)
🔹 𝗦𝗲𝗰𝗼𝗻𝗱 𝗢𝗿𝗱𝗲𝗿 → served straight from memory (instant)

That's memoization: your program remembers past results instead of recomputing them every time.

Used in:
🔹 Fibonacci series optimization
🔹 Caching API results
🔹 Machine learning computations

👉 Result: massive speed-ups, lower CPU load, smarter code.

🔔 Subscribe to my channel for daily programming insights, shorts, and real interview problems: https://lnkd.in/gvvbUgYK

#Programming #DynamicProgramming #Memoization #CodingInterview #Algorithms #SoftwareEngineering #Optimization #TechSimplifiedwithVK #DataStructures #CodeBetter #cfbr
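The first-order/second-order idea in Python, using the Fibonacci example from the list above (a minimal sketch; `fib_naive` and `fib_memo` are invented names):

```python
from functools import lru_cache

calls = 0

# Naive recursion: the "chef cooks from scratch" on every order, so the
# same subproblems are recomputed exponentially many times.
def fib_naive(n):
    global calls
    calls += 1
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Memoized version: each fib(k) is cooked once, then served from the cache.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

fib_naive(20)
print(calls)         # 21891 calls for n = 20
print(fib_memo(20))  # 6765, with only 21 distinct subproblems ever computed
```

`functools.lru_cache` is the standard-library shortcut; the same effect can be had by hand with a dict keyed by the function's arguments.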
🔥 DFS in Action: Word Search Problem

Problem link: https://lnkd.in/eFfbAJG8

Approach:
- Traverse each cell of the 2D board.
- When a cell matches the first character of the target word, perform a Depth-First Search (DFS) recursively in all 4 directions (up, down, left, right).
- Keep track of visited cells to avoid revisiting them within the same path.
- Use backtracking to unmark cells after exploring a path.
- If all characters are found in sequence, mark the word as found.

This method elegantly handles matrix-based search problems using recursion and backtracking.

#Coding #Algorithms #DFS #Recursion #Backtracking #ProblemSolving #Tech #Programming
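The steps above might look roughly like this in Python (one common formulation of the Word Search problem; the sample board and function name are illustrative):

```python
def exist(board, word):
    """Return True if word can be traced through adjacent cells, each cell used at most once per path."""
    rows, cols = len(board), len(board[0])

    def dfs(r, c, i):
        if i == len(word):                      # matched every character
            return True
        if not (0 <= r < rows and 0 <= c < cols) or board[r][c] != word[i]:
            return False
        saved, board[r][c] = board[r][c], "#"   # mark the cell as visited
        found = (dfs(r + 1, c, i + 1) or dfs(r - 1, c, i + 1) or
                 dfs(r, c + 1, i + 1) or dfs(r, c - 1, i + 1))
        board[r][c] = saved                     # backtrack: unmark the cell
        return found

    return any(dfs(r, c, 0) for r in range(rows) for c in range(cols))

board = [["A", "B", "C", "E"],
         ["S", "F", "C", "S"],
         ["A", "D", "E", "E"]]
print(exist(board, "ABCCED"))  # True
print(exist(board, "ABCB"))    # False
```

Overwriting the cell with a sentinel and restoring it afterwards is the "track visited cells + backtrack" pair from the approach, done without any extra visited set.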
Great explanation, Wesley. A practical example that always comes to mind in front-end development is the difference between using .find() on an array (which is O(n)) and looking up a property in an Object or Map (which is O(1) on average).