Microsoft has partnered with OpenAI to build an Azure-hosted supercomputer for testing large-scale models.
The supercomputer will deliver eye-watering amounts of power from its 285,000 CPU cores and 10,000 GPUs (yes, it can probably even run Crysis).
OpenAI was co-founded by Elon Musk as a non-profit to promote the ethical development of artificial intelligence technologies. Musk, however, later departed OpenAI following disagreements over the organisation’s direction.
Back in February, Musk responded to an MIT Technology Review profile of OpenAI, saying that it “should be more open,” and that all organisations “developing advanced AI should be regulated, including Tesla.”
Microsoft invested $1 billion in OpenAI last year, and it seems we’re only beginning to see the fruits of that relationship. While most AI systems today focus on doing a single task well, the next wave of research is focused on performing multiple tasks at once.
“The exciting thing about these models is the breadth of things they’re going to enable,” said Microsoft Chief Technical Officer Kevin Scott.
“This is about being able to do a hundred exciting things in natural language processing at once and a hundred exciting things in computer vision, and when you start to see combinations of these perceptual domains, you’re going to have new applications that are hard to even imagine right now.”
So-called Artificial General Intelligence (AGI) is the ultimate goal of AI research: the point at which a machine can understand or learn any task, much like the human brain.
“The creation of AGI will be the most important technological development in human history, with the potential to shape the trajectory of humanity,” said Sam Altman, CEO of OpenAI. “Our mission is to ensure that AGI technology benefits all of humanity, and we’re working with Microsoft to build the supercomputing foundation on which we’ll build AGI.”
“We believe it’s crucial that AGI is deployed safely and securely and that its economic benefits are widely distributed. We are excited about how deeply Microsoft shares this vision.”
AGI will, of course, require tremendous amounts of processing power.
Microsoft and OpenAI claim their new supercomputer would rank in the top five but do not give any specific performance figures. To rank in the top five, a supercomputer would currently require more than 23,000 teraflops of performance. The current leader, IBM’s Summit, exceeds 148,000 teraflops.
“As we’ve learned more and more about what we need and the different limits of all the components that make up a supercomputer, we were really able to say, ‘If we could design our dream system, what would it look like?’” said Altman. “And then Microsoft was able to build it.”
Unfortunately, for now at least, the supercomputer is built exclusively for OpenAI.