
Huge supercomputers still exist. Here is what they are used for today



  modern supercomputers in the data center server room
Timofeev Vladimir / Shutterstock

Supercomputing was a massive race in the '90s, as the US, China, and others all competed to have the fastest computer. While that race has died down a bit, these monster machines are still used to solve many of the world's problems.

As Moore's law (the old observation that computing power doubles roughly every two years) keeps pushing our hardware forward, the complexity of the problems being thrown at it grows too. While supercomputers were once fairly small, today they can take up entire floors, filled with interconnected computers.

What makes a computer "Super"?

The term "supercomputer" suggests one gigantic computer many times more powerful than your humble laptop, but that could not be further from the truth. Supercomputers consist of thousands of smaller computers, all connected together to perform a task. Each CPU core in a data center probably runs slower than the one in your desktop; it is the combination of all of them that makes the system so effective. There is a lot of networking and special hardware involved in computing at this scale, and it is not quite as simple as plugging each rack into the network, but you can think of them that way and you would not be far from the mark.

Not every task can be parallelized that easily, so you will not be using a supercomputer to run your games at a million frames per second. Parallel computing is generally good for accelerating heavy number-crunching workloads.
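To make the idea of parallelizable work concrete, here is a minimal sketch (not from the article) of an "embarrassingly parallel" job: summing squares over a large range, split into independent chunks that worker processes handle with no communication between them. This independence is exactly what supercomputers exploit across thousands of nodes.

```python
# Sketch of an embarrassingly parallel workload: each chunk of the
# range can be summed independently, then the partial results combined.
from multiprocessing import Pool

def sum_squares(bounds):
    """Sum n*n for n in [lo, hi) -- one worker's independent chunk."""
    lo, hi = bounds
    return sum(n * n for n in range(lo, hi))

def parallel_sum_squares(limit, workers=4):
    """Split [0, limit) into chunks, sum each in parallel, combine."""
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # last chunk absorbs any remainder
    with Pool(workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(1_000_000))
```

A game loop, by contrast, is a chain of steps where each frame depends on the last, which is why it gains nothing from this kind of split.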

Supercomputers are measured in FLOPS, or floating-point operations per second, which is essentially a measure of how fast they can do math. The current fastest is IBM's Summit, which can reach over 200 petaFLOPS, a million times faster than the "giga" scale most consumer hardware is measured in.

So what are they used for? Mostly Science

  3D Rendering of a Weather Map
Andrey VP / Shutterstock

Supercomputers are the backbone of computational science. They are used in the medical field to run protein folding simulations for cancer research, in physics to run simulations for large engineering projects and theoretical calculations, and even in finance to track the stock market and gain an edge over other investors.

Perhaps the application that most benefits the average person is weather modeling. Simply predicting whether you will need a coat and an umbrella next Wednesday is a surprisingly difficult task, one that even today's huge supercomputers cannot do with great accuracy. It is theorized that fully accurate weather modeling would require a computer measuring its speed in zettaFLOPS, two prefixes above petaFLOPS and around 5,000 times faster than IBM's Summit. We will probably not reach that point until 2030, and the main thing holding us back is not the hardware, it is the cost.
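The "around 5,000 times faster" figure follows directly from the prefixes (zetta is a million times peta, since peta goes to exa and then to zetta). A quick check of the article's arithmetic:

```python
# Zetta is two SI prefixes above peta (peta -> exa -> zetta),
# i.e. a million times larger.
PETA, ZETTA = 1e15, 1e21

summit_flops = 200 * PETA    # Summit's roughly 200 petaFLOPS
zetta_machine = 1 * ZETTA    # a hypothetical 1 zettaFLOPS machine

print(zetta_machine / summit_flops)   # 5000.0, matching the ~5,000x figure
```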

The upfront cost of buying or building all that hardware is high enough, but the real kicker is the electricity bill. Many supercomputers burn through millions of dollars each year just to keep running. So while there is theoretically no limit to how many buildings full of computers you can connect together, in practice we only build supercomputers big enough to solve current problems.

So will I have a supercomputer at home in the future?

In a certain way, you already do. Most desktops today outperform older supercomputers, and the average smartphone has more horsepower than the infamous Cray-1. So it is easy to compare with the past and theorize about the future. But that is largely because the average CPU got much faster over the years, something that no longer happens at the same pace.

Lately, Moore's law has slowed as we approach the limits of how small we can make transistors, so CPUs are not getting much faster. Instead they are getting smaller and more efficient, steering progress toward more cores per chip on the desktop and more powerful, efficient mobile devices.

But it is difficult to predict what problems the average user will need that much hardware for. After all, you do not need a supercomputer to browse the Internet, and most people are not running protein folding simulations in their basements. Today's high-end consumer hardware already exceeds most normal use cases and is generally reserved for specific work that benefits from it, such as 3D rendering and code compilation.

So no, you probably will not have one. The biggest advances will likely come in the mobile space, as phones and tablets approach desktop-level performance, which is still a pretty good step forward.


