The OpenAI supercomputer is powered by 285,000 CPU cores and 10,000 GPUs, with speedy 400 gigabit per second network connectivity for each GPU server. And while Microsoft didn’t reveal any specific speed capability, the company says the machine would rank among the top five on the TOP500 list of publicly disclosed supercomputers.
At this point, it’s unclear how, exactly, OpenAI will take advantage of such a powerful system. But we can at least expect the results to be interesting. The non-profit is best known for developing an algorithm that could write convincing fake news, as well as demonstrating that even bots learn to cheat at hide-and-seek.
Maybe OpenAI will take a note from Microsoft and develop something like its Turing models for natural language generation, a large-scale AI implementation that’s powering features like real-time caption generation in Teams. It’s backed by 17 billion parameters for understanding language — a particularly impressive figure when competing models clocked in at around 1 billion parameters just last year. Microsoft also announced that it’s making the Turing models open source, so developers will soon be able to use them for their own language-processing needs.