Algorithms, Simplified!


Well, it is tough to define an Algorithm (read: Algo) in exact words. To me, it is simply an instruction set arranged in a well-defined, structured sequence of programmable logical steps and/or loops that performs computation to solve an identified problem. Over the last 200 years the definition of algorithm has become more detailed as researchers have tried to pin the term down, yet it keeps evolving as algorithms are used and deployed far beyond their classical scope. They often serve as specifications for complex or iterative calculations, for large-scale or real-time data processing, and even for automated reasoning and machine learning tasks within application workflows or the business logic they implement.

In modern computing, algorithms are used widely, ranging from forensics, forecasting and financial modelling to futuristic predictions built on emerging technologies such as Machine Learning and Data Science. In a true sense they extend business logic through sets of iterative computations and enable advanced analytics with speed and accuracy. An algorithm has the following properties:

  • Finiteness & Definiteness – every execution must terminate; the steps are finite (though possibly iterative), and each step is precisely stated and defined without any ambiguity.
  • Computability & Correctness – each step can be carried out effectively, i.e. the instructions are computable and perform the task without error. An effective algorithm is a step-by-step solution to a given problem.
  • Comprehensibility of Pseudocode – pseudocode is independent of any programming language; it is a structured way of expressing an algorithm in plain English and should be easily understood (which helps with human efficiency).
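To illustrate how readable pseudocode maps onto code, here is a minimal Python sketch of the classic "find the maximum" procedure; the function name `find_max` and the sample list are my own illustrations, not from the text above:

```python
def find_max(values):
    """Return the largest element, following the pseudocode step by step."""
    max_value = values[0]        # set max to the first element
    for v in values[1:]:         # for each remaining element
        if v > max_value:        # if it is larger than max
            max_value = v        # replace max
    return max_value

print(find_max([3, 7, 2, 9, 4]))  # prints 9
```

Note how each line of code corresponds one-to-one with a line of English pseudocode, which is exactly the comprehensibility the bullet above describes.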

Algorithmic problems are often presented in everyday English, in terms of real-world objects and references. Algorithm designers may restate these problems in terms of formal, abstract, mathematical objects such as numbers, arrays, lists, graphs and trees in order to reason about them formally.

Most algorithms are complex, many are proprietary, standards and tools are still scarce, and the speeds at which algorithms operate leave limited room for control and governance over the logic running inside them. If any of these factors gets out of hand, the result can be reputational, financial, operational, regulatory, technological and competitive risk. Of course there are checks and balances in place via structured analysis of algorithms and evaluation metrics such as classification accuracy, F1 score, log loss and the confusion matrix, which measure performance in terms of false positives and negatives in line with established data science and analytics practice, so it is not a life-and-death situation for sure.

The algorithm life cycle includes design/problem definition, writing, testing and analysis phases. The skills required to effectively design and analyse algorithms are entangled with the skills required to effectively describe them, and the comprehensiveness of any algorithm rests on four key tenets:

  • What: The problem that the algorithm intends to solve.
  • How: Description of the algorithm itself.
  • Why: A proof that the algorithm solves the problem it is supposed to solve; the quality check and UAT, of course.
  • When (How Fast?): An analysis of the running time of the algorithm determining the performance, effectiveness and efficiency.

Data structures and algorithms depend closely on each other, and mastering them takes consistency and regular practice to gain conceptual depth. A data structure is a way to store and organize data in a named location so as to facilitate access and modification; even a single stored integer or floating-point number can be viewed as a simple data structure. Writing an algorithm requires aligning it with the appropriate data structure or data set, without which the right sequence of steps to solve a particular problem can never be achieved; learning and understanding both concepts is therefore vital.
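As a small illustration of pairing an algorithm with the right data structure, the Python sketch below (the sample data and function name are my own assumptions) keeps a list sorted so that binary search applies; on an unsorted list the same lookup would force a linear scan:

```python
import bisect

# The sorted list is the data structure; binary search is the algorithm.
# An unsorted list forces an O(n) scan; keeping the data sorted lets
# bisect locate an item in O(log n) comparison steps.
data = sorted([42, 7, 19, 3, 88, 21])

def contains(sorted_list, target):
    """Binary search via the standard library's bisect module."""
    i = bisect.bisect_left(sorted_list, target)
    return i < len(sorted_list) and sorted_list[i] == target

print(contains(data, 19))  # True
print(contains(data, 20))  # False
```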

Early Usage till today

During my research on the topic, I came across an interesting book on Algorithms by Jeff Erickson, where he traces the origin of algorithms to the “Hindu-Arabic” numeral system. The oldest surviving descriptions of the algorithm appear in The Mathematical Classic of Sunzi, written in China between the 3rd and 4th centuries, and in Eutocius of Ascalon's commentaries on Archimedes' Measurement of the Circle, but there is evidence that the algorithm was known much earlier: a procedure for multiplying or dividing two magnitudes was described in Euclid's Elements centuries before.

Interestingly, the roots can be traced to the ancient Indians as well, including Brahmagupta's 7th-century treatise Brāhmasphuṭasiddhānta (Doctrine of Brahma), his main work, written c. 628, and a faster divide-and-conquer method originally proposed by the Indian prosodist Pingala, whose recursive formulas, more than 2000 years old, are among the earliest examples of recursion, arising in the study of poetic meter, or prosody. Classical Sanskrit poetry distinguishes between two types of syllables (akṣara): light (laghu) and heavy (guru). In one class of meters, variously called mātrā-vṛtta or mātrā-chandas, each line of poetry consists of a fixed number of “beats” (mātrā), where each light syllable lasts one beat and each heavy syllable lasts two beats. The formal study of mātrā-vṛtta dates back to the Chandaḥśāstra, written by the Acharya Pingala between 600 BCE and 200 BCE, which contains the basic ideas behind the Fibonacci numbers.
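Pingala's idea can be sketched in a few lines of Python: a line of n beats ends either in a light syllable (leaving n-1 beats to fill) or a heavy one (leaving n-2 beats), so the counts obey M(n) = M(n-1) + M(n-2), the Fibonacci recurrence. The function name `matra_patterns` is my own:

```python
def matra_patterns(n):
    """Number of ways to fill n beats with light (1-beat) and heavy (2-beat) syllables.

    A line of n beats ends in a light syllable (n-1 beats remain) or a
    heavy syllable (n-2 beats remain), so M(n) = M(n-1) + M(n-2).
    """
    if n < 2:
        return 1  # one way to fill zero beats, one way to fill one beat
    prev, cur = 1, 1
    for _ in range(n - 1):
        prev, cur = cur, prev + cur
    return cur

print([matra_patterns(n) for n in range(8)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```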

With such a rich history, the procedures described in the Śulbasūtras, the algorithm for computing the cube root given by Aryabhata, and the kuṭṭaka and cakravāla algorithms for solving linear and quadratic indeterminate equations, as discussed by Aryabhata (476–550 CE), Brahmagupta (628 CE), Jayadeva (prior to the 11th century) and Bhaskaracharya (c. 1150 CE), provide many references to ancient Indian mathematics and the deep-rooted algorithms within it.

Today, algorithms power the heart of computation. We can see them at work solving our daily-life problems: social media networks, GPS applications, search engines, e-commerce platforms, recommendation systems, video surveillance and more are all powered by various algorithms coupled with modern data structures. Recently there have also been mentions of algorithmic entities, i.e. autonomous algorithms that operate without human control or interference, and attention is being given to the idea of granting them (partial or full) legal personhood with the accompanying rights and obligations. The algorithmic world has certainly moved on…

Types of Algorithms

As discussed above, there exist various kinds of algorithms that solve different problems via different approaches; in programming, a few are considered the important ones for attacking a particular class of problem.

  • Brute force algorithm - simply tries every potential solution until one is found satisfactory.
  • Greedy algorithm - used for optimisation problems, where one wants not just a solution but the best solution; it makes the locally optimal choice at each step in the hope of discovering the optimal answer to the complete problem.
  • Recursive algorithm - solves the base cases directly and recurs on simpler subproblems, addressing the smallest and easiest form of the issue first and building outward until the original problem is resolved.
  • Backtracking algorithm - based on a depth-first recursive search; it divides the issue into solvable sub-problems, and if the desired answer is not reached, it backtracks until a route that pushes the search forward is established.
  • Divide & conquer algorithm - divides the problem into smaller subproblems of the same type, solves those subproblems recursively, and combines their answers to solve the original problem.
  • Dynamic programming algorithm - remembers past results (“memoization”) and uses them to find new ones; it breaks a difficult problem into smaller subproblems and solves each only once, saving the solutions for future use instead of recalculating them.
  • Randomised algorithm - uses a random number at least once during the computation to make a decision on the way to a solution.
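As a minimal sketch of the dynamic programming idea above (my own example, using Python's standard `functools.lru_cache` to supply the memoization), consider the Fibonacci numbers:

```python
from functools import lru_cache

# Dynamic programming via memoization: each subproblem is computed once
# and its answer cached, instead of being recomputed on every recursive call.
@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(40))  # 102334155
```

Without the cache, the naive recursion repeats the same subproblems roughly a billion times for n = 40; with it, each of the 41 subproblems is solved exactly once.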


Risks & Challenges

Yes, there are different schools of thought on the biases inherited by some algorithms, which risk the accuracy or integrity of outcomes and can lead to inappropriate or wrong conclusions and insights. Complexity, lack of governance or standards, errors in algorithm design, and inappropriate usage or testing all expose algorithms to such risks and biases, and may skew the results and accuracy in part or in full.


Math-washing is a term coined for the obsession with math and algorithms and the human psychological tendency to believe something more readily if math or jargon is attached to it, even though the underlying values may be arbitrary or assumed. Since machine learning algorithms are trained on given datasets to recognize and leverage patterns, associations and correlations in the statistics, they may inherit biases from the data analyst or scientist creating or ‘curating’ those datasets.

For example, word embedding is a technique for identifying associations between words via vectors; from the grouping, i.e. the angle between vectors, the machine can infer the meaning of a word along with its commonly associated words and correlations. These vectors make up the dictionary of words for the algorithm. Word embedding is widely used in many common applications, including translation services, search and text autocomplete suggestions, so any wrong association, phrase or bias that arises naturally from the culture, language and regional beliefs of the human engineer during training of these models may be perpetuated, since machine learning is prone to getting stuck in feedback loops that reinforce its own learning.
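A toy sketch of the idea, with tiny hand-picked 2-dimensional vectors (purely illustrative; real embeddings are learned from large corpora and have hundreds of dimensions): words used in similar contexts end up with a small angle between their vectors, i.e. a cosine similarity close to 1.

```python
import math

# Invented 2-D "embeddings" for illustration only.
vectors = {
    "king":  [0.90, 0.80],
    "queen": [0.85, 0.82],
    "apple": [0.10, 0.90],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# "king" and "queen" point in nearly the same direction; "apple" does not.
print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1.0
print(cosine_similarity(vectors["king"], vectors["apple"]))  # noticeably lower
```

The bias problem arises because these angles are learned from human-written text: whatever associations the training data carries, the geometry of the vector space faithfully reproduces.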

Unfortunately, we humans are not as smart or neutral about these characteristics of language, phrasing and belief as the algorithms would need us to be. The content or datasets used for real-time learning may be social media outrage, everyday conversations, or even fake news, celebrity gossip, political slander and many other things that serve no purpose for the expected outcome; but because these algorithms cannot understand that, echo chambers form and persist as a structural bias that goes unnoticed by humans.

These risks must be dealt with through enhanced quality checks and modernised risk management processes, with an eye on governance of assumptions, approach, design, development and deployment. It is not as simple as it sounds, since today's algorithm-based decision-making systems are becoming ever more prevalent and integral to the digital landscape while remaining complex, unpredictable and proprietary. From the development angle, most of these algorithms are based on advanced technologies such as machine learning and evolve over time depending on the input data, and on the veracity and volume of the datasets required for inferencing, so predicting or explaining algorithm behaviour is difficult and at times not possible at all. My quick read recommendation to make the point would be “How algorithms are controlling your life”, a dialogue with writer Hannah Fry, a mathematician at University College London.

A standard, traditional, point-in-time risk management process will not be effective; continuous monitoring and corrective actions are needed to stay on course. The best way to maintain accountability is to keep accurate and detailed records: the data by which decisions come to be made must be transparent and easily auditable, so that if something goes wrong, audit and quality checks can measure the skew or disorientation of the intended results and retrace the steps leading up to the outcome to locate the source of the problem. This is, of course, an iterative endeavour.

In summary, from the ancient world to the modern, humankind has been fascinated with numbers and pseudocodes. From ciphers to autonomous algorithms, we have witnessed miracles of computation that impact our lives and influence our wellbeing; more so now that the next generation of machines is learning, augmenting and executing our automation agenda, powered by algorithms that help us leap forward towards sustainability and the greater good of our digital selves.

***

Nov 2021. Compilation from various publicly available internet sources, authors views are personal.

#Algorithms #Machinelearning #datastructures #AIML

