Where's Hardware Heading?
A lot is happening in hardware, but not at the fundamental level. We are still pushing silicon to its limits in pursuit of Moore's Law, cramming more and more transistors into a small space and thus gaining more computing power. We could not push the frequency limit further due to power constraints, so we invented multicore. But are we utilising all the cores at our disposal? Is the next generation of hardware fundamentally better than the previous one?
The Internet of Things is a buzzword now. Everything is connected, and seemingly better for it.
Let's compare ourselves with processors. The human brain has some 100 billion neurons and several orders of magnitude more synapses (some 1,000 trillion). The numerical capability of an average human being was overtaken sometime in the 1980s by processors with a few hundred thousand transistors. The transistor count of a modern processor is now around 10 billion, yet we are nowhere close to the human capacity for abstract thinking and problem solving.
Traditional processors process things serially, even if they are multicore [at best, several threads each execute serially]. Humans can approximate and solve constraints iteratively without the need for a serial, disciplined thought process. A constraint solver on a traditional processor will require a huge number of cycles for a sufficiently complex problem.
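To make the cost concrete, here is a minimal sketch of how a serial processor attacks a constraint problem: exhaustive enumeration, where the work grows exponentially with the number of variables. The variable names and the toy constraints are my own illustration, not anything from a real solver.

```python
from itertools import product

# A toy constraint problem: find x, y, z in {0, 1} satisfying all constraints.
# The constraints below are illustrative only.
constraints = [
    lambda x, y, z: x or y,         # at least one of x, y must be set
    lambda x, y, z: not (x and z),  # x and z are mutually exclusive
    lambda x, y, z: y != z,         # y and z must differ
]

def brute_force_solve(constraints, n_vars=3):
    """Serially enumerate all 2**n_vars assignments -- exponential work."""
    for assignment in product([0, 1], repeat=n_vars):
        if all(c(*assignment) for c in constraints):
            return assignment
    return None

print(brute_force_solve(constraints))  # → (0, 1, 0)
```

With 3 variables this checks at most 8 assignments; with 50 variables it would be over 10^15, which is why a classical serial machine burns so many cycles on problems a different computational substrate might attack all at once.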
Quantum computers change the game. By changing the underlying physics of computation, we gain a powerful constraint solver. All that is needed is to map each problem onto a set of constraints. Maybe that is what programming will be in 20 years, maybe earlier.
The second change will be the introduction of memristors, which in theory should allow for new building blocks for electronics.
But what is hardware, really? In essence, it is nothing but more optimised functionality that could also be achieved in software. On one end we thus have fixed-functionality hardware (ASICs), followed by semi-configurable hardware (FPGAs) and, lastly, fully configurable hardware (processors).
In the digital domain we are severely limited to Boolean logic; in the analog domain we are limited by basic hardware blocks that do not map directly onto mathematical operators.
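That limitation is worth seeing concretely: every digital operation, even arithmetic, bottoms out in Boolean logic. A minimal sketch (function names are my own) of a 1-bit full adder built purely from AND, OR and XOR, chained into a ripple-carry adder:

```python
def full_adder(a, b, carry_in):
    """One bit of addition from pure Boolean primitives."""
    s = a ^ b ^ carry_in                         # sum bit (XOR)
    carry_out = (a & b) | (carry_in & (a ^ b))   # carry bit (AND/OR)
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition over little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 3 ([1,1,0] little-endian) + 5 ([1,0,1]) = 8 ([0,0,0,1])
print(add_bits([1, 1, 0], [1, 0, 1]))  # → [0, 0, 0, 1]
```

Everything a CPU does, from addition up to constraint solving, is stacked on gates like these; that is the "Boolean logic" ceiling the text refers to.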
What is in store for us in the future? A linear improvement cycle or a paradigm shift?