Efficient Way of Analysing Algorithms - Time Complexity
There are many aspects to consider when building software, but from an engineering standpoint one of the most important is its performance and efficiency.
If a program takes a long time to load or hangs during execution, users get frustrated and stop using your product, no matter how amazing it may be. In today's world, your users are your USP and the ones who truly represent your product.
The performance of a solution isn't one big thing but many small things that together make the complete solution perform better.
One of those is choosing the most optimal algorithm for our use case.
Now, when it comes to comparing different algorithms for a particular problem, how do we determine which one is best? Which one is most efficient?
Should we analyze based on the time taken by each algorithm for the same input parameters?
Or based on the memory used by each algorithm?
Or it could be something else, specific to the project or to the input constraints!
The most widely used way to analyze algorithms is by the time taken. Let's understand what time complexity is and compare two search algorithms.
Time complexity is defined as the amount of time taken by an algorithm to run, as a function of the length of the input. We don't measure the total execution time of an algorithm, since the same algorithm can finish in different amounts of time depending on the processor it runs on.
In simple words, time complexity tells us how the execution time of an algorithm increases or decreases as the number of operations it performs (driven by the input size) increases or decreases.
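To make that concrete, we can count the operations an algorithm performs instead of timing it. A minimal Python sketch (the function name is illustrative) counting the comparisons a linear scan makes when the element is absent:

```python
def worst_case_comparisons(n):
    """Count comparisons a linear scan makes when the target is absent."""
    arr = list(range(n))
    target = -1  # deliberately not in arr
    count = 0
    for value in arr:
        count += 1
        if value == target:
            break
    return count

# The count grows in direct proportion to the input size n.
for n in (10, 100, 1000):
    print(n, worst_case_comparisons(n))
```

Doubling n doubles the number of comparisons regardless of how fast the processor is, which is exactly the machine-independent behaviour time complexity captures.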
Let's consider two searching algorithms.
Given an array of size n and an element X that is to be searched for in the array:
arr = [1,2,3,4,5,6,7,8,9,10] X=10
1. Linear Search - The linear search algorithm starts at the beginning of the array and traverses it one element at a time until the element is found or the end is reached. When it finds the element, it returns true; if it fails to find the element, it returns false.
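A minimal Python sketch of linear search (returning the index when found and -1 otherwise, a common variant of the true/false version described above):

```python
def linear_search(arr, x):
    """Traverse from the start; return the index of x, or -1 if not found."""
    for i, value in enumerate(arr):
        if value == x:
            return i
    return -1

arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(linear_search(arr, 10))  # found at the last position: 9
print(linear_search(arr, 15))  # not present: -1
```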
2. Binary Search - The binary search algorithm works on a sorted array by repeatedly dividing the search interval in half, keeping only the half that could contain the element.
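A minimal Python sketch of binary search, halving the interval [left, right] on each step:

```python
def binary_search(arr, x):
    """Search a sorted array by halving the interval; return index of x, or -1."""
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == x:
            return mid
        elif arr[mid] < x:
            left = mid + 1   # x can only be in the right half
        else:
            right = mid - 1  # x can only be in the left half
    return -1

arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(binary_search(arr, 7))   # 6
print(binary_search(arr, 15))  # -1
```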
Now, an algorithm has different types of time complexity:
1. Best Case Time Complexity - It measures the lower bound on the execution time of an algorithm: the minimum time taken to solve a problem. However, best case analysis doesn't provide much useful information, because the best case rarely occurs in practice.
In the linear search problem, the best case occurs when X is present at the first location.
arr = [1,2,3,4,5,6,7,8,9,10] X=1
The number of operations in the best case is constant (not dependent on n). So time complexity in the best case would be Ω(1)
In the binary search problem, the best case occurs when X is present at the middle location.
arr = [1,2,3,4,5,6,7,8,9,10] X=5
The number of operations in the best case is constant (not dependent on n). So time complexity in the best case would be Ω(1)
2. Average Case Time complexity - In the average case analysis, we take all possible inputs and calculate the computation time for all inputs. Add up all the calculated values and divide the sum by the total number of entries. It is in between best case and worst case. The average case analysis is not easy to do in most practical cases and it is rarely done.
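For linear search over n distinct elements, if the element is equally likely to be at any position, this averaging gives (1 + 2 + ... + n) / n = (n + 1) / 2 comparisons. A short sketch (the function name is illustrative) that performs exactly this calculation:

```python
def comparisons_to_find(arr, x):
    """Count the comparisons a linear scan makes before it finds x."""
    count = 0
    for value in arr:
        count += 1
        if value == x:
            break
    return count

n = 10
arr = list(range(1, n + 1))
# Average over every possible target position, as described above.
avg = sum(comparisons_to_find(arr, x) for x in arr) / n
print(avg)  # 5.5, i.e. (n + 1) / 2
```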
3. Worst Case Time Complexity - Here we calculate the upper bound on the execution time of an algorithm: the maximum time taken to solve a problem. Most of the time, we do worst-case analysis, since it guarantees an upper limit on how long the algorithm can take.
For Linear Search, the worst case happens when the element to be searched (X) is not present in the array.
arr = [1,2,3,4,5,6,7,8,9,10] X=15
When X is not present, the search function compares it with all the elements of arr[] one by one. Therefore, the worst-case time complexity of linear search is O(n).
For Binary Search, the worst case occurs when we have to keep reducing the search space until only one element remains.
arr = [1,2,3,4,5,6,7,8,9,10] X=7
I) left=0 right=9 mid=4 arr[mid]=5
since X>arr[mid], i.e. 7>5, left = mid+1 = 4+1 = 5
II) left=5 right=9 mid=7 arr[mid]=8
remaining search space: [6,7,8,9,10]
since X<arr[mid], i.e. 7<8, right = mid-1 = 7-1 = 6
III) left=5 right=6 mid=5 arr[mid]=6
remaining search space: [6,7]
since X>arr[mid], i.e. 7>6, left = mid+1 = 5+1 = 6
IV) left=6 right=6 mid=6 arr[mid]=7
remaining search space: [7]
since X==arr[mid], i.e. 7==7, return mid
Element is present at index 6
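The trace above can be reproduced by printing left, right, and mid on every iteration of a standard binary search; a sketch assuming the same array and X = 7:

```python
def binary_search_verbose(arr, x):
    """Binary search that prints its state at every step of the loop."""
    left, right = 0, len(arr) - 1
    step = 0
    while left <= right:
        step += 1
        mid = (left + right) // 2
        print(f"step {step}: left={left} right={right} mid={mid} arr[mid]={arr[mid]}")
        if arr[mid] == x:
            return mid
        elif arr[mid] < x:
            left = mid + 1
        else:
            right = mid - 1
    return -1

arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print("Element is present at index", binary_search_verbose(arr, 7))  # index 6, in 4 steps
```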
Since, at each iteration, we discard half of the remaining elements, the worst-case time complexity of binary search is O(log n).
Although time complexity is widely used to analyze the efficiency of algorithms, the right measure ultimately depends on the input constraints and the type of problem.
We use Performance Testing to determine the performance of our software. Performance Testing is a type of software testing which ensures that the application performs well under the expected workload.
The attributes measured by Performance Testing include speed, scalability, and stability.
Performance Testing informs the stakeholders about the speed, scalability, and stability of their application and reveals the improvements needed before the product is released to the market!
So the next time you're building even a small program, think about the different ways it could be done, compare the pros and cons of each approach, and make an informed decision.