Microsoft unveils 2 custom-designed chips to drive AI innovations
Heating up the AI race, Microsoft has unveiled two in-house, custom-designed chips and integrated systems that can be used to train large language models.
The Microsoft Azure Maia AI Accelerator is optimised for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor, is tailored to run general purpose compute workloads on the Microsoft Cloud.
The chips will start to roll out early next year to Microsoft's data centres, initially powering the company's services such as Microsoft Copilot and Azure OpenAI Service, the company said at its 'Microsoft Ignite' event late on Wednesday.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centres to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft's Cloud + AI Group.
Microsoft sees the addition of homegrown chips as a way to ensure every element is tailored for Microsoft cloud and AI workloads.
The end goal is an Azure hardware system that offers maximum flexibility and can also be optimised for power, performance, sustainability or cost, said Rani Borkar, corporate vice president for Azure Hardware Systems and Infrastructure (AHSI).
“Software is our core strength, but frankly, we are a systems company. At Microsoft we are co-designing and optimising hardware and software together so that one plus one is greater than two,” Borkar said.
“We have visibility into the entire stack, and silicon is just one of the ingredients,” she added.
At Microsoft Ignite, the company also announced the general availability of one of those key ingredients: Azure Boost, a system that makes storage and networking faster by taking those processes off the host servers onto purpose-built hardware and software.
To complement its custom silicon efforts, Microsoft also announced it is expanding industry partnerships to provide more infrastructure options for customers.
By adding first-party silicon to a growing ecosystem of chips and hardware from industry partners, Microsoft will be able to offer customers more choice in price and performance, Borkar said.
Additionally, OpenAI has provided feedback on Azure Maia, and Microsoft's deep insights into how OpenAI's workloads run on infrastructure tailored for its large language models are helping inform future Microsoft designs.
“Since first partnering with Microsoft, we've collaborated to co-design Azure's AI infrastructure at every layer for our models and unprecedented training needs,” said Sam Altman, CEO of OpenAI.
“Azure's end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers,” Altman added.