Business
Microsoft unveils 2 custom-designed chips to drive AI innovations
San Francisco, Nov 16
Heating up the AI race, Microsoft has unveiled two in-house, custom-designed chips and integrated systems that can be used to train large language models.
The Microsoft Azure Maia AI Accelerator is optimised for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor, is tailored to run general purpose compute workloads on the Microsoft Cloud.
The chips will start to roll out early next year to Microsoft’s data centres, initially powering the company’s services such as Microsoft Copilot or Azure OpenAI Service, the company said at its ‘Microsoft Ignite’ event late on Wednesday.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centres to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group.
Microsoft sees the addition of homegrown chips as a way to ensure every element is tailored for Microsoft cloud and AI workloads.
The end goal is an Azure hardware system that offers maximum flexibility and can also be optimised for power, performance, sustainability or cost, said Rani Borkar, corporate vice president for Azure Hardware Systems and Infrastructure (AHSI).
“Software is our core strength, but frankly, we are a systems company. At Microsoft we are co-designing and optimising hardware and software together so that one plus one is greater than two,” Borkar said.
“We have visibility into the entire stack, and silicon is just one of the ingredients,” she added.
At Microsoft Ignite, the company also announced the general availability of one of those key ingredients: Azure Boost, a system that makes storage and networking faster by taking those processes off the host servers onto purpose-built hardware and software.
To complement its custom silicon efforts, Microsoft also announced it is expanding industry partnerships to provide more infrastructure options for customers.
By adding first party silicon to a growing ecosystem of chips and hardware from industry partners, Microsoft will be able to offer more choice in price and performance for its customers, Borkar said.
Additionally, OpenAI has provided feedback on Azure Maia, and Microsoft’s deep insights into how OpenAI’s workloads run on infrastructure tailored for its large language models are helping inform future Microsoft designs.
“Since first partnering with Microsoft, we’ve collaborated to co-design Azure’s AI infrastructure at every layer for our models and unprecedented training needs,” said Sam Altman, CEO of OpenAI.
“Azure’s end-to-end AI architecture, now optimised down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers,” Altman added.
