Meta is building the world's largest AI-specific supercomputer

Facebook’s owner wants extraordinary computing power to develop AI models to recognise speech, translate languages and power 3D worlds.

Facebook's parent company, Meta, is building the world's most powerful AI-specific supercomputer to develop better speech-recognition tools, automatically translate between languages and help build its 3D virtual metaverse.

Although far from complete, the AI Research Supercluster (RSC) is up and running and has already surpassed Meta's previous fastest supercomputer. That machine was designed in 2017 and runs on 22,000 powerful graphics processing units (GPUs), which, despite being designed for playing games, are highly effective tools for training artificial intelligence models.



RSC currently has just 6,080 GPUs, but they are more powerful than those in the older machine, and it is already three times faster at training large AI models than its predecessor. Its current performance is on a par with the Perlmutter supercomputer at the National Energy Research Scientific Computing Center in California, which currently sits at number five in the TOP500 global supercomputer rankings.

When RSC is complete, it will have 16,000 GPUs and will be about three times more powerful than it is now. Meta says that, at that point, it will be the world's fastest AI-optimized supercomputer, performing at about 5 exaflops.
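
As a rough sanity check on those figures, here is a short back-of-the-envelope calculation. It is only a sketch: the per-GPU throughput is an assumed value (the published mixed-precision peak of an NVIDIA A100, the kind of GPU used in clusters like this), not a number taken from Meta.

```python
# Back-of-the-envelope check on Meta's RSC figures.
# Assumption: each GPU delivers roughly 312 teraflops of mixed-precision
# (FP16/BF16) AI compute, the published peak for an NVIDIA A100 without
# sparsity. This is an illustrative figure, not one Meta has stated.

TFLOPS_PER_GPU = 312   # assumed mixed-precision peak per GPU
CURRENT_GPUS = 6_080   # RSC today
FINAL_GPUS = 16_000    # RSC when complete

def cluster_exaflops(num_gpus: int, tflops_per_gpu: float = TFLOPS_PER_GPU) -> float:
    """Aggregate peak throughput in exaflops (1 exaflop = 1e6 teraflops)."""
    return num_gpus * tflops_per_gpu / 1e6

print(f"Current: {cluster_exaflops(CURRENT_GPUS):.2f} exaflops peak")
print(f"Final:   {cluster_exaflops(FINAL_GPUS):.2f} exaflops peak")
print(f"Scale-up factor: {FINAL_GPUS / CURRENT_GPUS:.1f}x")
```

Under that assumption, 16,000 GPUs work out to just under 5 exaflops of peak mixed-precision throughput, and the jump from 6,080 GPUs is a factor of about 2.6, consistent with the machine becoming "about three times more powerful".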

Supercomputers can be designed to excel at certain tasks, and Meta's new machine is specialized for training and running large AI models. When it is complete, there will be more powerful computers in the world, but only a few, and none that shares its exact architecture or intended use.

The cutting edge of AI research in recent years has largely relied on ever more powerful machines to train models. One of the largest neural networks, the Megatron-Turing Natural Language Generation model, has 530 billion parameters, the rough equivalent of connections between brain cells. Meta says its machine will eventually run models with trillions of parameters.
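
To get a feel for why models at that scale need a machine this size, here is a minimal sketch of their memory footprint. The bytes-per-parameter figure is a standard rule of thumb for 16-bit weights, not a number from Meta.

```python
# Rough memory-footprint estimate for large neural networks.
# Rule of thumb (an assumption, not a Meta figure): 2 bytes per parameter
# to store weights in 16-bit precision; training needs several times more
# memory again for gradients and optimizer state.

def weights_terabytes(num_params: float, bytes_per_param: int = 2) -> float:
    """Storage for the model weights alone, in terabytes."""
    return num_params * bytes_per_param / 1e12

for name, params in [
    ("Megatron-Turing NLG (530 billion)", 530e9),
    ("Hypothetical 1-trillion-parameter model", 1e12),
]:
    print(f"{name}: ~{weights_terabytes(params):.2f} TB of 16-bit weights")
```

Even before training overheads, a trillion-parameter model's weights alone run to terabytes, far beyond the memory of any single GPU, which is why training has to be spread across thousands of them.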

James Knight of the UK's University of Sussex says the proposed computer is "huge" in scale, but may not address some of the challenges in AI research. "A system so large would certainly let them build larger models," he says. "However, I do not think that simply increasing the size of language models will solve the well-documented problems of existing models repeating sexist and racist language or failing basic tests of logical reasoning."
