Microsoft is set to launch its custom Cobalt 100 chips in public preview at its Build conference next week, marking a significant step in its competition with AWS's Graviton chips.
These 64-bit, Arm-based chips pack 128 cores and promise 40% better performance than other Arm chips on the market.
Scott Guthrie, executive vice president of Microsoft's Cloud and AI group, highlighted that companies such as Adobe and Snowflake have already adopted the new chips, underscoring their potential impact on the AI landscape.
Alongside the Cobalt 100 chips, Microsoft will also make AMD's MI300X accelerators available to Azure clients, offering a cost-effective alternative to Nvidia's GPUs for running large language models.
In a move to further democratize AI, Microsoft plans to announce price reductions for accessing and running large language models at Build.
The company will also introduce a new "real-time intelligence system" with native Kafka integration, plus support for AWS Kinesis and Google Cloud's Pub/Sub data-streaming services, so that data can be streamed in real time into Fabric, Microsoft's data analytics platform.
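To give a sense of what "native Kafka integration" means in practice, here is a minimal sketch of producing events over the Kafka protocol, the kind of stream Fabric would ingest. The bootstrap server, topic name, and credentials are placeholders, not real Fabric connection details, and the exact connection setup for a Fabric eventstream may differ.

```python
# Minimal sketch: push JSON events to a Kafka-compatible endpoint.
# Endpoint, topic, and credentials below are placeholders (assumptions).
import json
import time
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="my-eventstream.example.com:9093",  # placeholder endpoint
    security_protocol="SASL_SSL",            # cloud endpoints typically require TLS + SASL
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",  # placeholder credential
    sasl_plain_password="<connection-string>",  # placeholder credential
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Emit a small stream of telemetry-style events.
for i in range(10):
    event = {"sensor_id": "sensor-42", "reading": 20.0 + i, "ts": time.time()}
    producer.send("sensor-readings", value=event)  # placeholder topic name

producer.flush()
```

Because the integration speaks the standard Kafka protocol, existing producers like this one would only need new connection settings to point at Fabric rather than a self-managed cluster.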
Microsoft is also partnering with Snowflake to make data interoperable between Fabric and Snowflake's platform, so data stored in one can be accessed from the other.
Additionally, a new Copilot feature will allow developers to manage Azure resources using natural language, streamlining their workflow and enhancing productivity.
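For illustration only, the snippet below shows the kind of Azure management operation a natural-language prompt such as "create a resource group in West US and list my resource groups" might drive. This uses the standard Azure SDK for Python, not Copilot's actual implementation; the subscription ID and resource names are placeholders.

```python
# Hypothetical illustration of the Azure operations behind a natural-language request.
# pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()  # picks up CLI, environment, or managed identity auth
client = ResourceManagementClient(
    credential, "00000000-0000-0000-0000-000000000000"  # placeholder subscription ID
)

# "Create a resource group named demo-rg in West US"
client.resource_groups.create_or_update("demo-rg", {"location": "westus"})

# "List all my resource groups"
for rg in client.resource_groups.list():
    print(rg.name, rg.location)
```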
Taken together, these announcements underscore Microsoft's push to make AI capabilities more accessible to developers and businesses: custom Cobalt 100 silicon, AMD MI300X accelerators on Azure, lower prices for running large language models, real-time data streaming into Fabric, and natural-language resource management in Copilot all feed into its vision of a more AI-powered future.