It takes a whole lot of power and data to both train and run the AI models that have big tech so enamored. To support its AI ambitions for Bing search, Microsoft has leaped into the welcoming arms of cloud computing giant Oracle, all the better to feed the ever-hungry, many-headed hydra that is AI.
Microsoft already uses its own Azure cloud infrastructure to support its AI, but it apparently needs reinforcements to power Bing's AI features. Oracle said the deal will offload some of those AI workloads onto its GPU supercluster infrastructure, according to a release on the partnership made public Tuesday. The company said Microsoft would use the Oracle Interconnect for Microsoft Azure, which links Azure to Oracle's cloud, to make it work.
Microsoft’s Search and AI marketing head Divya Kumar said the partnership will “expand access to customers and improve the speed of many of our search results.”
These Oracle superclusters are made up of highly prized Nvidia AI training GPUs like the A100 and H100. Microsoft is so desperate for more computing power that it reportedly has its own teams developing a new kind of AI training chip so it doesn't have to rely on Nvidia's scarce supply.
AI tasks are so energy-hungry that Microsoft has even investigated using nuclear microreactors to fuel its ambitions. That isn't stopping the company from funneling AI into every user-facing product. Microsoft already has its so-called AI Copilot built into Windows 11, but the tech giant is planning to stick the chatbot assistant into Windows 10 as well.
According to unnamed sources who talked to Windows Central Wednesday, Microsoft will soon update Windows 10 to put a Copilot button on the taskbar. Microsoft is reportedly trying to push its AI assistant onto the more than 1 billion Windows 10 users who haven't made the switch to Windows 11, which could help the company sell more developers on building plugins for Copilot.
Microsoft may backport even more Windows 11 features to its 8-year-old OS, though the report didn't specify which features could make the jump for those who have yet to move on.
Gizmodo reached out to Microsoft for comment, but we did not immediately hear back.
Microsoft may need to completely overhaul its computing infrastructure if it expects to support even more desktop-based AI. And it's not just compute and energy that Microsoft needs for AI. The company used over a billion gallons more water than in years past to help keep its data centers cool.
The AI race won't cool down, either. Earlier this week, Microsoft's big partner in crime, OpenAI, launched GPT-4 Turbo, a new version of its flagship model. The update bumped the AI's knowledge cutoff all the way up to April 2023, compared to the previous 2021 cutoff. ChatGPT already offers internet access for those who pay for the service, and now OpenAI is letting paying customers build their own custom GPTs, like one big "Build-an-AI" workshop.
Microsoft has put billions of dollars into its partnership with OpenAI, which has given the tech giant exclusive access to the AI startup's software. The deal also supplies OpenAI with the computing infrastructure it needs to build these ever-larger models. If anything, Microsoft's need for computing power will only balloon with time.