New infrastructure options for Azure customers

[email protected]
Posts: 95
Joined: Sun Dec 15, 2024 5:28 am


Post by [email protected] »

At Microsoft Ignite, Microsoft announced two custom-designed chips for its cloud infrastructure:

Microsoft Azure Maia, an AI accelerator chip designed to run cloud-based training and inference for AI workloads such as OpenAI models, Bing, GitHub Copilot, and ChatGPT. Azure Maia delivers cutting-edge AI performance with industry-leading power efficiency, enabling Azure customers to scale their AI applications cost-effectively and sustainably.
Microsoft Azure Cobalt, a cloud-native chip based on Arm architecture, optimized for performance, power efficiency, and cost effectiveness across general-purpose workloads. Azure Cobalt is the first Arm-based server chip designed by Microsoft, offering greater flexibility and customization to meet the diverse needs of Azure customers.
Complementing its custom silicon, Microsoft is expanding its partnerships with hardware vendors to give customers more infrastructure options. At Microsoft Ignite, we announced new GPU-accelerated virtual machines (VMs) for Azure, enabling customers to run their AI workloads faster and more efficiently in the cloud. Here are some of the new VMs coming soon:

ND MI300 virtual machines accelerated by AMD MI300X, designed to accelerate processing of AI workloads for high-end AI model training and generative inference. ND MI300 VMs will feature AMD's latest GPU, the AMD Instinct MI300X, delivering high performance and high memory capacity for the largest and most complex AI models.
NC H100 v5 VMs, built around the NVIDIA H100 Tensor Core GPU, delivering increased performance, reliability, and efficiency for midrange AI training and generative AI inference. NC H100 v5 VMs will also feature NVIDIA Multi-Instance GPU (MIG) technology, which lets customers partition a GPU into multiple isolated instances to run several workloads simultaneously and optimize resource usage.
ND H200 v5 virtual machines, AI-optimized VMs featuring the upcoming NVIDIA H200 Tensor Core GPU.
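As a rough sketch of how a customer might locate and provision one of these GPU VMs with the Azure CLI once the sizes roll out: the resource-group and VM names below are placeholders, and the size string `Standard_NC40ads_H100_v5` is an assumed example that should be confirmed against the size listing for your region and subscription before use.

```shell
# List VM sizes available in a region and filter for NC-series GPU sizes.
# Availability varies by region and rollout stage.
az vm list-sizes --location eastus --output table | grep -i "NC"

# Provision an NC H100 v5 VM (size name is illustrative -- verify it
# appears in the listing above for your subscription first).
az vm create \
  --resource-group my-ai-rg \
  --name h100-train-vm \
  --image Ubuntu2204 \
  --size Standard_NC40ads_H100_v5 \
  --admin-username azureuser \
  --generate-ssh-keys
```

Quota for GPU families is granted per subscription and region, so a quota increase request may be needed before `az vm create` succeeds.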