If Today's Tech Gets You Down, Remember Supercomputers Are Still Being Used For Scientific Progress

The US Department of Energy this week laid out how it intends to put its supercomputing might to work simulating the fundamental building blocks of the universe.

The electrons, protons and neutrons that make up atoms, and hence all ordinary matter, are fairly well understood. However, the layer beneath – the quarks and gluons inside protons and neutrons, and the wider family of leptons and bosons – remains mysterious, and the subject of ongoing scientific inquiry.

A $13 million grant from the Dept of Energy's Scientific Discovery through Advanced Computing (SciDAC) program aims to expand our understanding of the extraordinarily tiny things that exist within the particles within atoms.

As far as scientists can tell, quarks, and the gluons that hold them together, can't be broken down any further. They are quite literally the fundamental building blocks of all matter. Remember, of course, that scientists once thought the same of atoms, so who knows where this might go.

The initiative will enlist several DoE facilities – including Jefferson, Argonne, Brookhaven, Oak Ridge, Lawrence Berkeley, and Los Alamos National Labs, which will collaborate with MIT and William & Mary – to advance the supercomputing methods used to simulate quark and gluon behavior within protons.

The program seeks to answer some big questions about the nature of matter in the universe, such as: “What is the origin of mass in matter? What is the origin of spin in matter?” Robert Edwards, deputy group leader of the Center for Theoretical and Computational Physics at Jefferson Lab, told The Register.

Today, physicists use supercomputers to generate a "snapshot" of the environment inside a proton, and use mathematics to add quarks and gluons to the mix to see how they interact. These simulations are repeated thousands of times over and then averaged to predict how these elemental particles behave in the real world.
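
To make that concrete, here is a minimal, purely illustrative sketch of the averaging step in Python, with a made-up "observable" standing in for the real measurements: sample many independent snapshots, measure something on each, and average, with the statistical uncertainty shrinking as the number of snapshots grows. None of the sizes or functions here come from the project itself.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def measure_observable(config):
    # Stand-in for evaluating, say, a correlation function on one
    # field "snapshot"; here just a noisy function of the config.
    return np.cos(config).mean() + rng.normal(scale=0.05)

# Each "configuration" stands in for one snapshot of the fields inside a proton.
n_configs = 5000
measurements = np.array([
    measure_observable(rng.normal(size=(8, 8, 8, 8)))
    for _ in range(n_configs)
])

estimate = measurements.mean()
stat_error = measurements.std(ddof=1) / np.sqrt(n_configs)
print(f"observable = {estimate:.4f} +/- {stat_error:.4f}")
```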

This project, led by the Thomas Jefferson National Accelerator Facility, encompasses four phases which aim to streamline and accelerate these simulations.

The first two phases will involve optimizing the software used to model quantum chromodynamics – the theory governing how quarks and gluons bind into protons and neutrons – to break the calculations up into smaller chunks and take better advantage of the even greater degrees of parallelism available on next-gen supercomputers.
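
In practice, "smaller chunks" roughly means domain decomposition: the simulated lattice is carved into sub-blocks, one per node or GPU, and each node works on its own block while exchanging only the block's faces with its neighbours. The sketch below uses invented lattice and node-grid sizes, not anything from the project, to show the idea.

```python
import numpy as np

# Illustrative global 4D lattice (time plus three space directions).
global_shape = (64, 32, 32, 32)

# Suppose the machine offers a 4D grid of nodes/GPUs; each lattice
# direction is divided evenly across the matching grid dimension.
node_grid = (4, 2, 2, 2)  # 32 nodes in total

local_shape = tuple(g // n for g, n in zip(global_shape, node_grid))
print("local sub-lattice per node:", local_shape)  # (16, 16, 16, 16)

field = np.zeros(global_shape)

def local_block(field, coords):
    """Return the sub-array owned by the node at grid position `coords`."""
    slices = tuple(
        slice(c * size, (c + 1) * size)
        for c, size in zip(coords, local_shape)
    )
    return field[slices]

print("node (1, 0, 1, 0) owns a block of shape",
      local_block(field, (1, 0, 1, 0)).shape)
```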

One of the challenges Edwards and his team are working through now is how to take advantage of the growing floating-point capabilities of GPUs without running into connectivity bottlenecks when scaling them up.

“A good chunk of our efforts have been trying to find communication-avoiding algorithms and to lower the amount of communication that has to come off the nodes,” he said.
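
The pressure he is describing can be seen with back-of-the-envelope arithmetic: the work a node does scales with the volume of its sub-lattice, while the data it must trade with neighbours scales with the surface, so slicing the problem ever more finely makes communication loom larger. A quick illustrative calculation, with invented sizes:

```python
# Surface-to-volume ratio of a cubic 4D sub-lattice of linear size L:
# compute scales with L**4, while the halo exchanged with neighbours
# scales with the 8 faces of size L**3, so communication relative to
# compute grows as 8/L when blocks shrink.
for L in (32, 16, 8, 4):
    volume = L ** 4
    surface = 8 * L ** 3  # two faces in each of the four directions
    print(f"L={L:>2}: sites={volume:>8}, halo sites={surface:>6}, "
          f"halo/compute ratio={surface / volume:.2f}")
```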

The team is also looking at applying machine-learning principles to parameterize the probability distributions at the heart of these simulations. According to Edwards, this has the potential to dramatically speed up simulation times and to eliminate many of the bottlenecks around node-to-node communication.

“If we could scale it, this is like the Holy Grail for us,” he said.
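
Edwards didn't spell out a specific technique, but one published direction in lattice QCD is to train a model, such as a normalizing flow, to propose field configurations directly, then accept or reject them so the exact statistics are preserved. The toy sketch below captures that idea in one dimension, with a hand-tuned two-Gaussian proposal standing in for a trained model; every name and number in it is illustrative rather than taken from the project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "target" distribution standing in for the path-integral weight:
# an unnormalised double-well density exp(-(x**2 - 1)**2 / 0.5).
def log_target(x):
    return -((x ** 2 - 1.0) ** 2) / 0.5

# A trivially "parameterised" proposal standing in for a learned model.
# In the real thing these parameters would be trained; here they are
# simply chosen by hand.
means, width = np.array([-1.0, 1.0]), 0.35

def sample_proposal(n):
    comps = rng.integers(0, 2, size=n)
    return rng.normal(means[comps], width)

def log_proposal(x):
    # Log-density of an equal-weight two-component Gaussian mixture.
    d = (x[:, None] - means[None, :]) / width
    log_comp = -0.5 * d ** 2 - np.log(width * np.sqrt(2 * np.pi))
    return np.logaddexp(log_comp[:, 0], log_comp[:, 1]) - np.log(2)

# Independence Metropolis step: draw directly from the model, then
# accept/reject so the exact target statistics are preserved.
n = 20000
prop = sample_proposal(n)
log_w = log_target(prop) - log_proposal(prop)

current, log_w_current = prop[0], log_w[0]
samples, accepted = [], 0
for x, lw in zip(prop[1:], log_w[1:]):
    if np.log(rng.uniform()) < lw - log_w_current:
        current, log_w_current = x, lw
        accepted += 1
    samples.append(current)

print(f"acceptance rate: {accepted / (n - 1):.2f}")
print(f"<x^2> estimate: {np.mean(np.square(samples)):.3f}")
```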

In addition to using existing models, the third phase of the project will involve developing new methods for modeling the interaction of quarks and gluons within a computer-generated universe. The final phase will take what is learned from these efforts and use it to begin scaling the systems up for deployment on next-gen supercomputers.

According to Edwards, the findings from this research also have practical applications for adjacent research, such as Jefferson Lab's Continuous Electron Beam Accelerator Facility or Brookhaven Lab's Relativistic Heavy Ion Collider – two of the instruments used to study quarks and gluons.

"Many of the problems that we are trying to address now, such as code infrastructures and methodology, will impact the [electron-ion collider]," he explained.

The DoE's interest in optimizing its models to take advantage of larger and more powerful supercomputers comes as the agency receives a $1.5 billion check from the Biden administration to upgrade its computational capabilities. ®
