Supercomputers aren’t super anymore…?
We have come a long way since the first supercomputer, the CDC 6600, was released in 1964. From a few megaFLOPS (millions of floating-point operations per second) then, to today, when petaFLOPS will soon be legacy as exaFLOPS appear on the horizon.
The visionary mathematician John von Neumann gave an amazing gift to the computing world with the von Neumann architecture: a stored-program design in which a processing unit carries out logical operations on data held in memory. That changed the world forever. It was his genius and futurist thinking in integrating pure and applied sciences that led to the birth of digital compute power. However, as the need for compute power increases exponentially, supercomputers themselves are starting to look dwarfed…
It seems we have relied long enough on a computer architecture with separate CPU and memory units, with data moving between the two. Extending this topology to supercomputers by adding thousands of CPUs and GPUs gave us unprecedented compute power, but still restricted us to binary bits — a stream of electrical or optical pulses, each representing one of two positions, i.e. 1 or 0, on or off, up or down.
This limitation means that supercomputers aren't very good at solving certain types of problems that seem easy at first glance.
Let’s look at a few such scenarios:
- A logistics company distributing materials to hundreds of cities wants to optimise its routes to save on fuel costs.
- An investment bank looking to manage risk across its investment portfolios
- A meteorological department looking to analyse weather patterns involving real-time fluctuations
- And many more…
The limitation supercomputers have is that they don’t have the working memory to hold the myriad combinations of real-world problems, and they have to analyse each combination one after another, bit by bit, which can take a long time.
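To make the "myriad combinations" point concrete, here is a minimal Python sketch of the brute force a classical machine is pushed into for the routing scenario above. The four-city distance matrix is hypothetical, purely for illustration; the real takeaway is that the number of candidate routes grows factorially with the number of cities.

```python
import math
from itertools import permutations

# Hypothetical symmetric distance matrix for 4 cities (illustrative only).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def route_cost(route):
    """Total cost of visiting the cities in order and returning to the start."""
    legs = zip(route, route[1:] + route[:1])
    return sum(dist[a][b] for a, b in legs)

# Brute force: check every possible ordering of the cities, one by one.
best = min(permutations(range(4)), key=route_cost)
print("best route:", best, "cost:", route_cost(best))

# Why this stops scaling: fixing the start city, n cities give (n-1)! routes.
print("routes for 10 cities:", math.factorial(9))                # 362880
print("digits in the route count for 100 cities:",
      len(str(math.factorial(99))))                              # 156
```

At 4 cities this is instant; at 100 cities the route count is a 156-digit number, far beyond what any supercomputer can enumerate — which is exactly the wall the article is describing.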
This brings quantum computers onto the horizon.
Even though quantum computing is at a nascent stage, an early version has already brought the world’s most powerful supercomputer to its knees! October 2019 was the landmark moment when Google announced quantum supremacy, as its quantum processor Sycamore surpassed IBM’s Summit in a computation by a wide margin.
The calculations that Sycamore performed in just 200 seconds would, Google estimated, have taken Summit 10,000 years. IBM countered that Summit could do it in 2.5 days — but even then, 200 seconds vs. 2.5 days is still no mean feat!
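Those headline figures are easy to sanity-check with back-of-the-envelope arithmetic (assuming 365.25-day years, since only round numbers were quoted in the announcements):

```python
# Rough speedup ratios implied by the two claims about the Sycamore result.
SECONDS_PER_DAY = 86_400

sycamore_s = 200                                          # Sycamore's runtime
google_estimate_s = 10_000 * 365.25 * SECONDS_PER_DAY     # "10,000 years"
ibm_counter_s = 2.5 * SECONDS_PER_DAY                     # "2.5 days"

print(f"Google's estimate: ~{google_estimate_s / sycamore_s:.1e}x faster")
print(f"IBM's counter-claim: still {ibm_counter_s / sycamore_s:.0f}x faster")
```

So even taking IBM’s own counter-claim at face value, Sycamore was over a thousand times faster; under Google’s estimate, the gap is on the order of a billion.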
So, with these fast-changing compute dynamics, the question is:
Are supercomputers still super…?
For how long…?
Note: You are welcome to reach out for further details, to explore how this can impact and help you. Please do not reproduce, change or remove the information without written permission.