Higher Learning in the Age of Data

In a world that moves and changes at an ever-faster pace, institutions of higher learning, most notably colleges and universities, are often seen as bastions of steadiness. On the surface that appears to be so, but those institutions are also being remolded, albeit slowly, by a myriad of socio-politico-economic forces, including changing demographics, disruptive technologies, and globalization. Perhaps the clearest manifestation of the transformative effect of those change agents has been the heightening of universities’ self-interest, a direct consequence of growing competitive pressures. The race to offer customized, on-demand, multi-modal (on-campus, online, and blended) education that fits in with students’ work and lifestyles is one manifestation of that trend; ever more frequent and ambitious fundraising campaigns are another. All in all, efforts to adapt to changing social, political, and economic realities can be characterized as structural self-re-engineering, an important step toward modernizing decades- or even centuries-old educational institutions. Receiving considerably less attention, however, is the impact of some of the same technological and competitive forces on the essence of teaching and learning, and on the very idea of what it means to know.

Implicit in the traditional conception of higher education is human centricity: formal education is, in essence, a structured transfer of knowledge from (human) teachers to (human) learners. And while new technologies have been widely adopted as facilitators of that transfer, the essence of teaching and learning continues to be framed in the context of human-to-human exchanges. But the emergence and rapid maturation of advanced self-functioning technologies, broadly referred to as artificial intelligence (AI) and, more specifically, machine learning (ML) applications, underscores the need to rethink what it means to learn. An analyst using ML algorithms to sift through large volumes of raw data learns, but in a manner quite different from the same analyst reading a textbook or listening to a lecture. Moreover – and that is the real difference-maker – it is not just the analyst who learns. By continuously updating the networked patterns and associations it derives from data, the ML application also learns. In a more general sense, a growing array of man-made electronic systems are now capable of learning autonomously, that is, without human input, as illustrated by widely used technologies such as spam filters or electronic personal assistants such as Apple’s Siri or Amazon’s Alexa. All considered, if learning is now shared between humans and human-developed artificial agents, what does it mean to ‘know’ and to ‘learn’? And by extension, will the evolving conception of knowing and learning reshape the centuries-old institution of higher learning, and if so, how?
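To make that notion of machine learning concrete, consider a minimal sketch, in Python, of the kind of pattern-learning that underlies a simple spam filter. This is a toy naive Bayes classifier rather than any production system, and the messages and labels are purely illustrative. The program is never given explicit rules; it derives word-to-label associations from labeled examples and refines them with each new example it sees.

```python
# A minimal sketch of autonomous "learning from data": a toy spam filter
# that derives word-label associations from labeled examples rather than
# from hand-written rules. All messages below are hypothetical.
from collections import defaultdict
import math

class ToySpamFilter:
    def __init__(self):
        self.word_counts = {"spam": defaultdict(int), "ham": defaultdict(int)}
        self.label_counts = {"spam": 0, "ham": 0}

    def learn(self, message, label):
        """Update word-label associations from one labeled example."""
        self.label_counts[label] += 1
        for word in message.lower().split():
            self.word_counts[label][word] += 1

    def classify(self, message):
        """Score the message under each label (naive Bayes, add-one smoothing)."""
        vocab = len(set(self.word_counts["spam"]) | set(self.word_counts["ham"]))
        scores = {}
        for label in ("spam", "ham"):
            total = sum(self.word_counts[label].values())
            score = math.log(self.label_counts[label] / sum(self.label_counts.values()))
            for word in message.lower().split():
                score += math.log((self.word_counts[label].get(word, 0) + 1) / (total + vocab))
            scores[label] = score
        return max(scores, key=scores.get)

spam_filter = ToySpamFilter()
spam_filter.learn("win a free prize now", "spam")
spam_filter.learn("meeting agenda for tomorrow", "ham")
spam_filter.learn("free offer claim your prize", "spam")
spam_filter.learn("lecture notes attached", "ham")
print(spam_filter.classify("claim your free prize"))   # -> spam
print(spam_filter.classify("notes from the meeting"))  # -> ham
```

The point worth noticing is that ‘learning’ here is nothing more than the incremental updating of counts, yet it is enough to let the program classify messages it has never seen.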

It is likely that the manner in which institutions of higher learning adapt to the changes brought about by the onslaught of intelligent automation will play a major role in shaping the future of higher education. But questions remain. Will future universities emphasize training of the mind, which, as discussed in the next section, was the original goal of higher education, or will they focus even more on endowing students with defined skill sets, which is commonly the goal of professional education? Which elements of the traditional pillars of a university – competent faculty, capable students, and learning resources – will need to change, and how? Finding sound answers to such complex and enduring questions is a journey that begins with a shared understanding of the genesis of higher learning as a formal, institutionalized pursuit.

The Institution of Higher Learning

The origins of contemporary Western universities can be traced back to the emergence of democracy in the ancient Greek city-states, most notably Athens, and the concurrent idea of liberal education. The term derives from the Latin ‘liberalis’, or ‘appropriate for free men’: ancient Greek democracies regarded the study of grammar, rhetoric, and logic as an essential enabler of free citizens’ participation in civic life. And while the feudal societies that followed in their wake were in many regards less erudite, clergy and nobility carried forth the core ideals of liberal education, eventually adding mathematics, music, and astronomy to what was considered proper education for the then ruling classes. During the same period, the education of first slaves and later commoners was limited to specific, typically servitude-related skills, now broadly characterized as vocational training. Thus, in the more general sense, the original focus and, more importantly, intent of higher learning was to develop the whole human being to their full potential – something that to this day remains the goal of liberal arts education, though not necessarily the dominant goal of more broadly defined higher education.

That is because, starting in the latter part of the 19th century, universities began to steadily expand their initially general-education-minded focus by offering more and more vocational training, delivered through growing arrays of professional schools, as exemplified by law, engineering, education, medicine, and business. That scope-expanding trend has been so transformational that nowadays the general perception of university education is far more closely aligned with advanced vocational training than with the comparatively vague notion of ‘developing the whole human being’. So, in the course of nearly two and a half millennia since Plato’s Academy and Aristotle’s Lyceum, the needs and values of societies slowly but systematically built, defined, and then re-defined what constitutes higher learning. And while numerous factors contributed to that evolution, the onset of the Industrial Revolution exerted perhaps the most profound influence.

Changing Needs – Changing Focus

A widely embraced conceptualization of industrial progress depicts the underlying process through several distinct stages, briefly summarized in Figure 1 below.

Figure 1: A Summary of Industrial Progress


The development of the commercially viable steam engine in the early 18th century is commonly credited with igniting the 1st Industrial Revolution, which was further fueled (no pun intended) by the subsequent discovery of oil and the resultant large-scale mechanization. Significantly more complex to build, operate, and maintain than the comparatively simple hand tools used earlier, the growing array of industrial machinery required properly trained engineers and managers to design and utilize those progressively more complex tools. The subsequent harnessing of electricity, widely credited with sparking the 2nd Industrial Revolution, further expanded the need for advanced technical training, a trend that continued with the rise of electronics and computing that ushered in the 3rd Industrial Revolution. The relatively recent proliferation of data-driven, interconnected, intelligent networks, widely taken to signal the 4th Industrial Revolution, gave rise to data science as a new and distinct field of academic training – yet another step in the vocationalization of higher education.

Somewhat hidden in that high-level view of the march of industrialization is the emergence and gradual maturation of machine-centric learning. Commonly referred to as the computer or digital revolution, this chapter in the book of industrial transformations is graphically summarized in Figure 2.

Figure 2: From Abacus to AI


At least notionally, the abacus can be considered the original computer, although perhaps the most direct predecessor of the modern computer was Babbage’s Analytical Engine, first described in 1837. It wasn’t until about a century later, however, that the pioneering ideas of Alan Turing and John von Neumann laid the conceptual foundations of modern computing, with commercial systems – large, complex, and expensive mainframe machines – entering the marketplace in the 1950s. In the years that followed, numerous hardware and processing innovations inspired first minicomputers, which offered computing capabilities comparable to the much larger mainframes at about a tenth of the cost, and eventually microcomputers, now known as personal computers, which leveraged advances in integrated circuit design to pack unprecedented computing power into small, portable devices. The coming together of electronic computing and communication technologies then gave rise to the Internet and the era of informational computing, characterized by the use of computing devices to create and share information. The subsequent addition of wireless transmission of data, voice, and video, along with progressively more autonomous self-learning software applications, brought about the current era of mobile intelligence, in which broadly defined computing systems not only respond to human directions but, more and more, direct human actions.

When considered within the confines of higher learning, the computer revolution can be seen as an explosion in the ability to learn from data. Electronic transaction processing and communication systems now generate famously vast quantities of data, while progressively more autonomous data processing mechanisms and agents offer ever more evolved data utilization capabilities. More and more, learning from data is becoming as important as learning from books or from experience.

Learning from Data

How do we know that we know? From the somewhat abstract epistemological perspective, knowledge can come from one’s own experience, from information obtained from others, and from inferences of logic. Those who hold that knowledge comes primarily, or even exclusively, from sensory experience are commonly labeled empiricists. As a philosophical stance, empiricism treats knowledge as probabilistic and subject to continued, evidence-based revision, and empiricist reasoning is a fundamental part of the scientific method, which posits that any knowledge claim must be tested against objective data rather than resting solely on logic or intuition. Although the philosophical roots of empiricism can be traced back to the work of Aristotle and other Hellenic thinkers, as currently framed it is a relatively modern doctrine, largely shaped in the Age of Reason. Manifesting itself in the departure from the mysticism and superstition of the Middle Ages, it brought about a fundamental shift in the way mankind viewed itself and pursued knowledge, resulting in great changes in scientific thought and exploration, all of which ultimately gave universities their current empirical orientation.

More to the point, empiricism is a perfect companion to the Age of Data. The ever-expanding digitization of all manner of business and social interactions yields vast arrays of objective data, which in turn creates previously unimaginable opportunities for empirical learning. However, technological progress (see Figure 2) also continues to affect what it means to ‘learn from data’. Consider Figure 3.

Figure 3: The Evolution of Data Learning


The opening of the human mind that characterized the Age of Reason resulted in numerous societal and scientific advances, including the formal, i.e., mathematically based, study of chance, which in turn gave rise to a new branch of applied mathematics known as statistics. The ensuing popularization of statistical inference then lent support to the hypothetico-deductive application of the scientific method, which framed knowledge creation as a process of statistically testing falsifiable claims (i.e., hypotheses) against data derived from observation or experimentation. The later emergence and rapid maturation of electronic computing eventually produced machine learning, which marked the beginning of a new era, one in which not just mankind but man-made devices are able to learn from data. And while early machine learning applications were limited to identifying simple recurrence-based patterns hidden in relatively homogeneous data, rapidly accelerating computing capabilities (summarized in Figure 2), coupled with advances in computer and neural sciences, led to the development of progressively more capable and autonomous machine learning technologies. Now commonly referred to as AI, those systems exhibit increasingly human-like functioning, best illustrated by now-common deep learning and cognitive computing applications, and by soon-to-become-common self-driving vehicles.
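To make the hypothetico-deductive style of learning from data concrete, here is a minimal illustration in Python using the scipy library and made-up exam scores: a falsifiable claim is stated first, and only then tested statistically against observations.

```python
# A minimal illustration (with made-up data) of hypothetico-deductive
# learning from data: state a falsifiable hypothesis first, then test
# it statistically against observations.
from scipy import stats

# Hypothetical exam scores for two teaching formats (illustrative numbers).
blended_scores = [82, 88, 75, 91, 84, 79, 86, 90]
lecture_scores = [74, 80, 69, 77, 72, 81, 70, 76]

# H0: the two formats produce the same mean score; H1: they differ.
t_stat, p_value = stats.ttest_ind(blended_scores, lecture_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# By convention, p < 0.05 is taken as evidence against H0 -- knowledge
# here is probabilistic and always open to revision, as empiricism holds.
if p_value < 0.05:
    print("Reject H0: the observed difference is unlikely to be chance alone.")
else:
    print("Fail to reject H0: the data do not support a difference.")
```

Machine learning inverts this sequence: rather than testing a claim stated in advance, it induces the patterns directly from the data, as in the spam filter sketch shown earlier.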

And so the learning journey that began with the opening of the human mind in the Age of Reason is now taking us into a new world in which the distinction between ‘man’ and ‘man-made’ is blurring. It is easy to see why at least some futurists see signs of an oncoming singularity, or the merging of human and artificial intelligence. But it is just as easy to imagine that the seemingly slow but boundlessly creative analog human brain (analog meaning capable of representing any process in terms of infinitely many values, which implies an effectively unlimited ability to conceive) will continue to innovate in ways that are hard to imagine today, just as our modern innovations would have been hard to imagine for our predecessors.

Technological Progress and Higher Education

The currently unfolding period – i.e., the early 21st century – often referred to as the Information Age or the Age of Data, is typically depicted as yet another step on the continuum of technological progress (first summarized in Figure 1), but perhaps it is more appropriate to think of the current era as a new chapter in the evolution of mankind. Consider Figure 4.

Figure 4: The Eras of Change


Agrarian societies were (and, in a few isolated parts of the world, still are) organized around the production and maintenance of crops. Such societies existed as far back as 10,000 years ago and can be thought of as the foundational era of human socio-economic development, lasting many centuries, until the onset of the 1st Industrial Revolution in the 18th century. The purpose of higher learning in agrarian societies was, as discussed earlier, to enable the ruling classes – initially the free citizens of Greece and Rome, and later feudal nobility and clergy – to govern. The work of producing goods and performing services required only comparatively basic practical training, given the simple techniques and tools used during that period.

The emergence and rapid proliferation of initially steam- and later oil-powered mechanization effectively brought the agrarian era to an end. The comparatively quick succession of the 1st, 2nd, and 3rd industrial revolutions can be seen as a series of successively more advanced industrial development periods, or epochs, each rooted in a distinct disruptive innovation and jointly comprising the Industrial Era. Underpinning the rise of large-scale mechanization was an exponential increase in the complexity and sophistication of the means of production, which in turn gave rise to highly trained and skilled classes of professionals. As a result of those and other (e.g., political and legal) changes, the focus of higher education slowly began to expand to include a growing array of specialized, professional-education-oriented programs, and the overall goal of higher learning began to shift away from learning to govern and toward learning to make.

Another key transition began in the second half of the 20th century. Commercially jump-started in the 1950s, electronic computing first emerged as a limited-access, special-purpose tool, but the arrival of personal computers in the 1980s transformed computing into an everyday utility for nearly everyone. The subsequent emergence of the Internet as a new communication modality – soon enhanced by the interactivity of the World Wide Web, and further expanded by the rapid maturation of mobile connectivity and the proliferation of interconnected personal and commercial data-capturing devices – laid the foundation for the current Information Era. Characterized by intelligent, meaning automated or even autonomous, interconnected networks, the resultant changes carry commercial and personal consequences as monumental as those of the agrarian-to-industrial transformation. Just as steam-, oil-, and electricity-powered machines changed how work was done, self-functioning, interconnected systems – perhaps best illustrated by self-driving vehicles – are now again changing not only how work is done but also how lives are lived. Not surprisingly, the transformational impact of artificial intelligence extends into higher learning: as more and more of the ‘make’-related work is handled by independently functioning systems, the focus of university learning will begin to shift away from making and toward conceiving, as graphically summarized in Figure 5 below.

Figure 5: Shifting Focus of Higher Learning


Starting with the purpose of educating elites during the Agrarian Era, the focus of higher education shifted during the Industrial Era toward training the new professional class, and it is once again shifting, this time in response to the transformative changes brought about by the burgeoning Information Era. Technological progress is gradually alleviating direct physical work while at the same time generating vast volumes and rich varieties of data, and the combination of those two related but distinct trends is ushering in the age of information-driven creativity. New knowledge is created, and ideas are tested, not just by mining and analyzing data but also through AI-driven capabilities to simulate reality in a way that transcends experience and physical existence, perhaps best exemplified by research aiming to describe the conditions that existed moments after the Big Bang. In contrast to general creativity, which is rooted in subjective reasoning and/or imagination, information-driven creativity can be described as the conception of novel ideas spurred by the use of available data. It entails going beyond what is currently known, but in a manner guided by insights derived from currently available information.

What does that mean in practice? First and foremost, the traditional teaching model, the core of which has changed very little since the days of the early universities, ought to be reconsidered. More specifically, rather than being structured (i.e., curricula comprised of sets of standard courses), largely undifferentiated (i.e., students consuming the same content within the same time period), and oriented more toward assimilation than discovery of knowledge, the new teaching and learning model ought to be more flexible, individualized, and exploration-minded. Intelligent technologies need to play a more fundamental role in teaching and learning, not just as a means of different (i.e., online) content delivery, but also as a means of discovering new knowledge. Making use of augmented, mixed, and virtual reality technologies will allow learners to see beyond the boundaries of currently existing reality, making it more likely that more learners will conceive novel ideas that might not have emerged in a more traditional setting. As noted earlier, the human brain is analog, which implies a potentially unlimited capability to create. It thus follows that focusing higher learning more on the discovery of new, rather than the assimilation of old, knowledge, infusing discovery-promoting technologies into the learning process, and individually tailoring learning pathways will help institutions of higher learning continue to unlock more and more of human creative genius.
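As a purely hypothetical sketch of what an individualized learning pathway might look like in code, consider the following Python fragment; the topic names, prerequisite structure, and mastery threshold are all invented for illustration. The point is the logic: instead of one fixed syllabus, the next topic is selected from each learner’s own mastery profile.

```python
# A purely hypothetical sketch of an adaptive learning-pathway selector:
# rather than one fixed syllabus, the next topic is chosen from each
# learner's own mastery profile. Topic names and thresholds are illustrative.

# Prerequisite graph: topic -> topics that must be mastered first.
PREREQS = {
    "statistics":       ["algebra"],
    "machine_learning": ["statistics", "programming"],
    "deep_learning":    ["machine_learning"],
}
MASTERY_THRESHOLD = 0.8  # assumed cutoff for "mastered"

def next_topics(mastery):
    """Return topics the learner is ready for but has not yet mastered."""
    ready = []
    for topic, prereqs in PREREQS.items():
        if mastery.get(topic, 0.0) >= MASTERY_THRESHOLD:
            continue  # already mastered, nothing to recommend
        if all(mastery.get(p, 0.0) >= MASTERY_THRESHOLD for p in prereqs):
            ready.append(topic)
    return ready

# Two learners with different profiles get different pathways.
alice = {"algebra": 0.9, "programming": 0.85, "statistics": 0.9}
bob   = {"algebra": 0.9, "programming": 0.3}
print(next_topics(alice))  # -> ['machine_learning']
print(next_topics(bob))    # -> ['statistics']
```

A production system would of course estimate mastery from assessment data rather than take it as given, but even this toy version shows how two learners with identical enrollment can follow genuinely different paths.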
