In the famous 1999 film The Matrix, humanity has been subjugated by Artificial Intelligences and lives within an interactive neural simulation.
Yet the protagonist, Neo, much like the freed prisoner in Plato’s Allegory of the Cave, is able to see beyond the illusion. He perceives the computation, the interweaving of data, converted by machines into signals sent to human brains to keep us enslaved in a virtual reality known, fittingly, as “the Matrix”.
Without venturing too far into science fiction, it is in fact mathematical structures known as matrices that enable computers to create three-dimensional images, animate characters in video games, and power Artificial Intelligence systems. Every digital action — even every word generated by a system such as ChatGPT — is the result of an enormous number of calculations carried out through these numerical tables.
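At the heart of those calculations is a very simple operation: multiplying a table of numbers (a matrix) by a list of inputs, so that each output is a weighted sum of the inputs. A minimal sketch in plain Python, with made-up numbers chosen purely for illustration:

```python
def matvec(matrix, vector):
    """Multiply a matrix (a list of rows) by a vector.

    Each output value is a weighted sum of the inputs -- the
    elementary step repeated billions of times inside an AI model.
    """
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

# Hypothetical 2x2 weight matrix and a two-number input.
weights = [[0.2, 0.8],
           [0.5, 0.5]]
inputs = [1.0, 2.0]

outputs = matvec(weights, inputs)
```

A neural network is essentially a long chain of such products, interleaved with simple nonlinear steps, which is why hardware that can multiply matrices quickly and cheaply matters so much for AI.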
The computers we use today are digital machines, meaning they operate by reading and manipulating just two states: 0 and 1, on and off. This approach, which made the information technology revolution possible, is now encountering increasingly evident limits. The great boom in AI is making us realise that, despite remarkable achievements, current machines are approaching the ceiling of their computational capabilities. On the one hand, there is the so-called von Neumann bottleneck, whereby vast amounts of energy are wasted moving data between memory and processor. On the other, there is the crisis of Moore’s Law, which predicted the doubling of the fundamental components of chips every two years. Transistors are now so small that they are approaching atomic dimensions, making it ever more difficult to shrink them further without running into insurmountable physical constraints.
To overcome these obstacles, analogue computing is being rediscovered. Unlike digital computing, it does not rely on 0s and 1s, but on continuous signals, such as electrical current or the mechanical movement of gears. This is not an entirely new idea: one of the earliest examples of an analogue computer is the Antikythera Mechanism, a sophisticated Greek device over two thousand years old used to predict the movements of celestial bodies. The most intriguing aspect of analogue technology, particularly for Artificial Intelligence, is the possibility of directly exploiting the laws of physics to perform the matrix multiplications typical of neural networks. In this way, rather than coordinating thousands of tiny electronic switches, the result emerges naturally from the behaviour of materials, with a drastic reduction in energy consumption.
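The physical trick behind this is easy to sketch. In a so-called crossbar array, each input voltage passes through a resistive element: Ohm’s law (current = conductance × voltage) performs the multiplication, and Kirchhoff’s current law, by merging the currents on each output wire, performs the addition. The matrix-vector product simply falls out of the circuit. A simplified, idealised simulation in Python, with hypothetical conductance and voltage values:

```python
def crossbar_multiply(conductances, voltages):
    """Output currents of an idealised analogue crossbar.

    conductances: matrix G in siemens, one row per output wire
    voltages: input voltages V in volts, one per input wire

    Ohm's law (I = G * V) does the multiplying; Kirchhoff's
    current law (summing the currents on a wire) does the adding.
    """
    return [
        sum(g * v for g, v in zip(row, voltages))
        for row in conductances
    ]

# Hypothetical conductances (siemens) and input voltages (volts).
G = [[0.001, 0.002],
     [0.003, 0.001]]
V = [0.5, 1.0]

currents = crossbar_multiply(G, V)  # output currents in amperes
```

In real analogue chips the "program" is stored as physical conductance values, so the multiplication happens where the data lives, in a single step, with no shuttling of bits between memory and processor; this is precisely what sidesteps the von Neumann bottleneck.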
In recent years, several companies and research centres have brought this technology back to the forefront of innovation. The American company Mythic has developed chips that use flash memory, similar to that found in USB drives, to carry out AI calculations with very low energy consumption.
Researchers at Peking University have created chips based on RRAM technology that promise markedly higher speed and energy efficiency compared with traditional processors, including those produced by Nvidia.
In Germany, the company Anabrid markets a small analogue computer called The Analog Thing, designed to help students and researchers tackle complex problems using an approach different from that of conventional computers.
The impact of these advances could be enormous. An example of what computational power can already achieve is AlphaFold, the system developed by DeepMind that has determined the structure of around 200 million proteins, a task that would have taken far longer using traditional experimental methods.
With the evolution of analogue chips, in the future we may have medical devices, sensors and intelligent instruments capable of operating for years on a small battery, processing data and learning directly on site, without the need to rely constantly on an internet connection.
If, in the film, the machines of the future are hungry for energy and dominate humankind, in reality they may consume little and amplify our capabilities.
The real risk, then, is not that machines will think too much, but that humanity will cease to do so. Plato had already intuited this, urging us not to prefer the comfort of shadows on the wall to the light outside the cave.
Author: Emanuele Mulas M.Sc. MIEI
Publisher: Wittystore
License: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND)
https://creativecommons.org/licenses/by-nc-nd/4.0/


