The brain can perform hundreds of billions of calculations per second while using minimal energy. The DeepSouth supercomputer takes its inspiration from this efficiency, with the goal of accelerating the development of new technologies, especially artificial intelligence.
Artificial intelligence (AI) algorithms can fool us with massive probability calculations, but they do not think or function at all like our brains. To simulate neural networks genuinely inspired by the brain, researchers at the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University, Australia, have unveiled the DeepSouth neuromorphic supercomputer. It is due to come online in April 2024 and will be able to process 228 trillion synaptic operations per second.
The human brain, for its part, performs the equivalent of one exaflop (a billion billion floating-point operations per second) while consuming only 20 watts of energy. Reproducing that level of performance at such low power consumption is exactly what DeepSouth was designed for.
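To put those figures in perspective, here is a rough back-of-the-envelope sketch using only the numbers quoted above (one exaflop at 20 watts for the brain, 228 trillion synaptic operations per second for DeepSouth). Note that floating-point operations and synaptic operations are not directly comparable units, so this is an order-of-magnitude illustration, not a rigorous benchmark.

```python
# Back-of-the-envelope comparison of the figures cited in the article.
# Assumption: 1 exaflop = 1e18 floating-point operations per second.

brain_ops_per_second = 1e18        # ~1 exaflop, as cited for the human brain
brain_power_watts = 20             # estimated power draw of the brain

deepsouth_ops_per_second = 228e12  # 228 trillion synaptic operations/second

# Energy efficiency of the brain: operations per joule (one watt = one joule/second)
brain_ops_per_joule = brain_ops_per_second / brain_power_watts

print(f"Brain efficiency: {brain_ops_per_joule:.1e} operations per joule")
print(f"DeepSouth throughput: {deepsouth_ops_per_second:.2e} synaptic ops/s")
# Caveat: FLOPs and synaptic operations are different kinds of work,
# so this ratio is only a loose sense of scale.
print(f"Throughput ratio: {brain_ops_per_second / deepsouth_ops_per_second:.0f}x")
```

On these numbers, the brain delivers around 5 × 10¹⁶ operations per joule, which is the efficiency target that makes neuromorphic designs like DeepSouth attractive.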
Imitating the brain to consume less
With this computing power, the supercomputer is intended to accelerate data processing for applications such as biomedicine, robotics, space exploration and artificial intelligence. Its operation will stand in contrast to that of today's supercomputers, which are slow and power-hungry when they simulate neural networks on graphics processing units (GPUs) and other multi-core chips.
The name DeepSouth is a nod both to IBM's TrueNorth, a project that aimed to build a machine capable of simulating large neural networks, and to Deep Blue, the first computer to defeat a reigning world chess champion. Another strength of DeepSouth is that it is scalable: it will be possible to customize and reconfigure it, increasing or reducing its capacity as needed. If DeepSouth lives up to scientists' expectations and the technology can later be miniaturized, it could further accelerate the development of AI while significantly reducing its energy consumption.