Bull In A Cosmos Shop: Edinburgh Uni Boffins Strike Deal With Atos For BullSequana Supercomputer

Boffins at the University of Edinburgh will soon be able to get their hands on a new high-powered supercomputer to help them "unlock the secrets of the universe" following a deal with European tech outfit Atos.

The university has handed Atos a contract to deliver its BullSequana XH2000 supercomputing system, described as the "largest system dedicated to GPU computing deployed at a customer site in the UK".

Once up and running, it will allow stargazers across the STFC (Science and Technology Facilities Council) theory community to conduct research in particle physics, astronomy, cosmology and nuclear physics using Nvidia graphics processing units (GPUs) and AMD EPYC processors.

In a statement announcing the deal, Atos said the tie-up with the University of Edinburgh represents "a major boost to DiRAC’s computing capacity, significantly increasing the power of the Extreme Scaling service."

DiRAC – which stands for Distributed Research Utilising Advanced Computing – is a distributed facility with high performance computing resources hosted by the Universities of Edinburgh, Cambridge, Durham and Leicester.

The statement went on: “These systems support fundamental research in particle physics, astrophysics, nuclear physics and cosmology. This agreement forms part of a £20 million investment by the UK Research and Innovation (UKRI) World Class Laboratories scheme, through the Science and Technology Facilities Council (STFC), to fund an upgrade of the DiRAC facility.”

The XH2000 cabinet can accommodate up to 32 direct liquid-cooled blades (20 at the front and 12 at the rear). Full specs here [PDF].

While no doubt an important step forward in understanding the hidden workings of the universe, the deal also holds good news for students: Atos plans to splash some cash sponsoring one student's PhD and supporting a hackathon. Happy days.

Last week, The Register reported that Nvidia had unveiled what it called the "world’s most powerful AI supercomputer yet", a giant machine named Perlmutter for NERSC, aka the US National Energy Research Scientific Computing Center.

“Perlmutter’s ability to fuse AI and high-performance computing will lead to breakthroughs in a broad range of fields from materials science and quantum physics to climate projections, biological research and more,” Nvidia CEO Jensen Huang gushed at the time.

The $146m Perlmutter – a price tag considerably larger than the "part of a £20m investment" figure quoted in the Atos / University of Edinburgh handshake – is to be operated at the Lawrence Berkeley National Laboratory. It's named after Saul Perlmutter, a physicist working at the lab and the University of California, Berkeley, who won the Nobel Prize in 2011 for uncovering evidence that the expansion of the universe is accelerating. ®
