The Windows giant said it has no plans to commercialize the chips. Instead, the new chips will be used internally to power its own software products and its Azure cloud computing service.
Solutions to rising costs
Microsoft and other tech giants like Alphabet (Google) are grappling with the high cost of providing AI services, which can be more than 10 times that of traditional services like search engines.
Microsoft executives say they plan to address the rising cost of AI by using a common platform model to integrate AI deeply across the company's entire software ecosystem, and the Maia chip is designed to do just that.
The Maia chip is designed to run large language models (LLMs), the foundation of the Azure OpenAI service, a collaboration between Microsoft and ChatGPT maker OpenAI.
“We think this gives us a way to be able to deliver better solutions to our customers at a faster pace, at a lower cost, and with higher quality,” said Scott Guthrie, executive vice president of Microsoft's cloud and AI division.
Microsoft also said that next year it will offer Azure customers cloud services running on the latest flagship chips from Nvidia and Advanced Micro Devices (AMD). The company is currently testing GPT-4 on AMD chips.
Increased competition in the cloud sector
Microsoft built the second chip, codenamed Cobalt, to cut internal costs and to compete with Amazon's AWS cloud service, which runs on Amazon's own self-designed chip, "Graviton".
Cobalt is an Arm-based central processing unit (CPU) currently being tested to power the Teams enterprise messaging software.
AWS representatives said their Graviton chip currently has about 50,000 customers. The company will also hold a developer conference later this month.
"AWS will continue to innovate to deliver future generations of custom-designed chips that deliver even better price performance, for any workload customers require," said an AWS representative in a statement after Microsoft announced the AI chip duo.
Rani Borkar, corporate vice president of Azure hardware and infrastructure, said both new chips are manufactured on TSMC's 5nm process.
She added that Maia is wired together with standard Ethernet network cabling, rather than the more expensive custom Nvidia networking technology that Microsoft has used in the supercomputers it built for OpenAI.
(According to Reuters)