Is algorithmic thinking essential for data analysis?
Suppose an analyst wants to perform data analysis and inference in a specific domain of knowledge (their domain of expertise). The first step is to break the domain of interest down into a set of interrelated random variables. After encoding the data in the familiar tabular format (according to the chosen decomposition into those variables), the analyst may want to gain insight into some of the potential relations among the random variables.
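As a minimal sketch of this encoding step, consider a hypothetical retail domain decomposed into three interrelated variables: unit price, weekly demand, and whether a promotion ran that week. The variable names and values below are assumptions made purely for illustration.

```python
import pandas as pd

# Hypothetical observations of the domain, one row per week,
# one column per random variable in the chosen decomposition.
observations = pd.DataFrame(
    {
        "unit_price": [2.49, 1.99, 2.49, 2.19],   # continuous variable
        "weekly_demand": [130, 210, 145, 180],    # count variable
        "promotion": [False, True, False, True],  # binary variable
    }
)

# A first look at potential relations among the variables.
print(observations.corr(numeric_only=True))
```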
Suppose the analytic goal is to better understand some of the relations among the variables, or to predict one variable on the basis of some of the others. The analyst may decide to employ exploratory, predictive, or probabilistic models, or a combination of these, to carry out the task.
One can rarely rely on general closed-form (analytic) representations and manipulations of inferential models. Closed-form solutions often require overly restrictive assumptions, for example normality or linearity constraints on the distributions of, or relations among, the random variables, and these may not be good approximations of a complex, real-world domain.
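As one concrete instance of such a closed-form solution and the assumptions it carries, the sketch below computes an ordinary least squares fit via the normal equations. It is valid only when the linear relation it assumes actually holds; the simulated data are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Design matrix: an intercept column plus one feature.
X = np.column_stack([np.ones(50), rng.normal(size=50)])
true_beta = np.array([1.0, 2.0])
# Data generated under the linearity assumption y = X @ beta + noise.
y = X @ true_beta + rng.normal(scale=0.5, size=50)

# Closed-form solution: beta_hat = (X^T X)^{-1} X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [1.0, 2.0] only because the assumption holds here
```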
On the other hand, a clever algorithm, one that captures the particularities of the specific domain, may prove a better approach. Adapted versions of existing algorithms may fit the domain of interest more closely and deliver useful business insights. The algorithmic approach has an additional advantage: if designed appropriately, algorithms can scale well as the input grows. As new data analysis algorithms and implementations are continuously made available, professionals need to be careful, though: no algorithm should be allowed to violate the laws of probability and statistics, and a mathematical justification should accompany any newly introduced algorithm.
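For contrast with the closed-form fit above, the sketch below uses a simple iterative algorithm, plain gradient descent on squared error. The step size, iteration count, and simulated data are illustrative assumptions; the point is that the update loop needs no matrix inversion and can be adapted (mini-batches, robust losses, domain-specific constraints) and scaled with the size of the input.

```python
import numpy as np

def gradient_descent_fit(X, y, step=0.1, iterations=500):
    """Fit a linear model by iteratively following the gradient of squared error."""
    beta = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(iterations):
        gradient = (2.0 / n) * X.T @ (X @ beta - y)
        beta -= step * gradient
    return beta

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=200)
print(gradient_descent_fit(X, y))  # approaches [1.0, 2.0]
```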
It is said that the 20th century was the age of grand equations, typically arising in the natural sciences. Is the 21st century becoming the algorithmic age?