Supercomputers aren’t super anymore…?

We have come a long way since the CDC 6600, the first supercomputer, was released in 1964. It managed only a few megaFLOPS (millions of floating-point operations per second) back then; today, petaFLOPS machines will soon be legacy as exaFLOPS systems appear on the horizon.

The visionary and legendary mathematician John von Neumann gave an amazing gift to the computing world with the von Neumann architecture, designed to carry out logical operations on bits held in definite physical states. That changed the world forever. It was his genius and futurist thinking, integrating pure and applied sciences, that led to the birth of digital compute power. However, as the need for compute power grows exponentially, supercomputers themselves are starting to look dwarfed by the problems we put to them…

We have relied for a long time on a computer architecture with separate CPU and memory units, with data moving between the two. Extending this design to supercomputers by adding thousands of CPUs and GPUs gave us unprecedented compute power, but it still restricts us to binary bits: streams of electrical or optical pulses, each representing one of two positions, i.e. 1 or 0, on or off, up or down.
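To make that restriction concrete, here is a minimal Python sketch (my own illustration, not taken from the article) of how a register of n classical bits holds exactly one of its 2^n possible states at any instant, so exhaustively exploring them means visiting them one after another:

```python
from itertools import product

def classical_states(n_bits):
    """Yield every state an n-bit classical register can be in, one at a time."""
    # At any instant the register holds exactly one of these 2**n combinations,
    # so a brute-force search has to walk through them sequentially.
    yield from product((0, 1), repeat=n_bits)

for n in (4, 8, 16, 32):
    print(f"{n} bits -> {2**n:,} possible states to check one by one")
```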

This limitation means that supercomputers aren't very good at solving certain types of problems that seem easy at first glance.

Let’s look at a few such scenarios:

  • A logistics company distributing materials to hundreds of cities wants to optimise its routes to save on fuel costs.
  • An investment bank looking to manage risk across its investment portfolios.
  • A meteorological department analysing weather patterns that involve real-time fluctuations.
  • And many more…

The limitation supercomputers have is that they don’t have the working memory to hold the myriad combinations that real-world problems generate, so they have to analyse each combination one after another, which can take a very long time.
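As a rough, hedged illustration of the first scenario in the list above: the number of distinct delivery routes through n cities grows factorially, so even an assumed exascale machine checking one route per floating-point operation falls hopelessly behind. The city counts and the 10^18 checks-per-second rate are illustrative assumptions, not figures from any real deployment.

```python
from math import factorial

CHECKS_PER_SECOND = 1e18          # assumption: an exascale machine testing one route per FLOP
SECONDS_PER_YEAR = 365.25 * 24 * 3600

for n_cities in (10, 20, 30, 50):
    # Distinct round trips from a fixed starting city, ignoring direction.
    routes = factorial(n_cities - 1) // 2
    years = routes / CHECKS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{n_cities:>2} cities: {routes:.2e} routes, ~{years:.2e} years of brute force")
```

By 30 cities the brute-force time already runs to over a hundred thousand years; a network with hundreds of cities is far beyond exhaustive search.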

This brings quantum computers onto the horizon.

Even though quantum computing is still in its nascent stage, its early versions have already brought the world’s most powerful supercomputer to its knees! The landmark day came in October 2019, when Google announced quantum supremacy as its quantum processor Sycamore surpassed IBM’s Summit on a computation by a wide margin.

The calculation that Sycamore performed in just 200 seconds would, by Google’s estimate, have taken Summit 10,000 years. IBM countered that Summit could do it in about 2.5 days, but even 200 seconds versus 2.5 days is no mean feat!
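For a rough sense of scale, here is the back-of-the-envelope arithmetic behind those two comparisons, using only the figures quoted above:

```python
SYCAMORE_SECONDS = 200
GOOGLE_ESTIMATE_SECONDS = 10_000 * 365.25 * 24 * 3600   # Google's claim: 10,000 years
IBM_ESTIMATE_SECONDS = 2.5 * 24 * 3600                  # IBM's counter-claim: 2.5 days

print(f"Versus Google's estimate: ~{GOOGLE_ESTIMATE_SECONDS / SYCAMORE_SECONDS:,.0f}x faster")
print(f"Versus IBM's estimate:    ~{IBM_ESTIMATE_SECONDS / SYCAMORE_SECONDS:,.0f}x faster")
```

Even taking IBM’s more conservative figure at face value, that is roughly a thousand-fold speed-up on that particular task.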

So, with compute dynamics changing this fast, the question is:

Are supercomputers still super…?

For how long…?

Note: You are welcome to reach out for further details, to explore how this can impact and help you. Please do not reproduce, change or remove the information without written permission.
