Facebook’s owner wants extraordinary computing power to develop AI models to recognise speech, translate languages and power 3D worlds.
Facebook's parent company, Meta, is building the world's most powerful AI-specific supercomputer to develop better speech-recognition tools, automatically translate between different languages and help build its 3D virtual metaverse.
Although far from complete, the AI Research SuperCluster (RSC) is up and running and has already surpassed Meta's previous fastest supercomputer. That machine was designed in 2017 and runs on 22,000 powerful graphics processing units (GPUs), which, despite being designed for playing games, are highly effective tools for training artificial intelligence models.
RSC currently has only 6,080 GPUs, but they are more powerful than those in the older machine, and it is already three times faster at training large AI models than its predecessor. Its current performance is on a par with the Perlmutter supercomputer at the National Energy Research Scientific Computing Center in California, which currently sits at number five in the TOP500 global supercomputer rankings.
When RSC is complete, it will have 16,000 GPUs and will be about three times more powerful than it is now. Meta says that, at that point, it will be the world's fastest AI-optimised supercomputer, performing at about 5 exaflops.
Supercomputers can be designed to excel at certain tasks, and Meta's new machine is specialised for training and running large AI models. Even when it is complete, there will be more powerful computers in the world, but only a few, and none that share its exact architecture or intended use.