Artificial Intelligence (AI) has emerged as a revolutionary technology that is transforming many industries and aspects of our daily lives, from medicine to financial services and entertainment. The rapid evolution of real-time gaming, virtual reality, generative AI and metaverse applications is changing the ways in which network, compute, memory, storage and interconnect I/O interact. As AI continues to advance at an unprecedented pace, networks need to adapt to the colossal growth in traffic transiting hundreds or thousands of processors, with trillions of transactions and terabits of throughput.

With extensive experience in large-scale, high-performance networking, Arista provides the best IP/Ethernet-based solution for AI/ML workloads built on a range of AI accelerator and storage systems. Exponential growth in AI applications requires standardized transports to build power-efficient interconnects and overcome the scaling limitations and administrative complexities of existing approaches. Building an IP/Ethernet architecture with high-performance Arista switches maximizes application performance while optimizing network operations.

Modern AI applications need high-bandwidth, lossless, low-latency, scalable, multi-tenant networks that interconnect hundreds or thousands of accelerators at speeds from 100Gbps to 400Gbps, evolving to 800Gbps and beyond.

Figure: Arista AI Etherlink Portfolio

Figure: Arista Networks and Ultra Ethernet Consortium (UEC)

Key Arista advantages in AI Networking include:

  • Arista Etherlink™: Arista’s Etherlink portfolio consists of ultra-high-performance, standards-based Ethernet systems with smart features for AI networking. These include specialized load balancing and congestion control features, including Arista’s RDMA-aware QoS and load balancing capabilities, which ensure reliable packet delivery to all NICs supporting RoCE. Arista Etherlink also introduces AI Analyzer with workload and NIC integration to improve cluster deployment time and operational stability, and to provide deep visibility supported by AVA machine learning. Etherlink features are supported across a broad range of 800G systems and line cards based on Arista EOS and are forward compatible with UEC.
     
  • Network Platforms: Arista offers a range of fixed, modular and distributed platforms which can be deployed individually, for smaller clusters, or combined to build large topologies suitable for over one hundred thousand accelerators. The 7060X6, 7060X series and 7388X5 series offer a rich breadth of connectivity options in a range of 800G and 400G systems with up to 51.2Tbps of capacity in a choice of 1, 2 and 4U form factors.

    The modular 7800R4 series high-performance 800G AI Spine delivers up to 460 Tbps of throughput with 576 ports of 800G or 1152 ports of 400G in a single device, enabling large, highly efficient, one-switch clusters or supporting tens of thousands of end systems as the large-radix spine and/or leaf of a 2-tier network (the sizing sketch after this list works through that arithmetic).

    Arista’s 7700R4 series Distributed Etherlink Switch (DES) provides a novel third approach to AI back-end networks, scaling out the 7800R4 architecture to support up to 32,000 accelerators in a logical single-hop topology while retaining deterministic, fair and 100% efficient lossless forwarding - the perfect fabric for AI.

    The breadth of options provides a complete toolset for building AI networks with any type of accelerator, NIC, topology or scale, using open, standards-based Ethernet and IP.
     
  • Network Operating Software: Arista Extensible Operating System (EOS) is the core of Arista cloud networking solutions for next-generation data centers and high-performance AI networks. Arista EOS provides all the necessary tools to achieve a highly reliable, lossless, high-bandwidth and low-latency network. EOS also supports extensive instrumentation, security, segmentation and policy capabilities which are critical for operating the highest-value workloads.
     
  • Open Ecosystem: Arista has fostered strong partnerships and collaborations within the industry, contributing to an open ecosystem for AI networking. Ethernet has a broad ecosystem, with multiple system vendors and choices in silicon, interconnect and optics, driving open, standards-based solutions that interoperate flexibly across vendors. In short, customers have the flexibility and freedom of choice when it comes to building AI solutions with Arista’s accelerator- and NIC-agnostic portfolio, and are never locked in.
     
  • Interoperable: AI networks often interface with a range of storage and general-purpose compute systems. An Ethernet-based AI network enables efficient and flexible network designs that eliminate compatibility issues and the need for inter-domain gateways, which introduce pipeline bottlenecks. Ethernet is versatile and ubiquitous, providing customers with the option to deploy (and re-deploy) Arista’s consistent set of solutions across general-purpose compute, data center, storage and AI networks.
     
  • Visibility and Telemetry: Arista’s platforms with EOS provide extensive telemetry capabilities for AI networking. They enable real-time monitoring, analytics, and visualization of network data, offering insights into network performance, traffic patterns, and resource utilization. The streaming telemetry features facilitate proactive troubleshooting, analytics, capacity planning, and optimization, helping organizations maximize the efficiency and effectiveness of their AI infrastructure; a brief telemetry example follows this list.
     
  • Total Cost of Ownership: Arista’s solutions are designed to provide high-performance networks at the best value. Arista’s open, interoperable solutions and emphasis on operational efficiency, automation, and streamlined management workflows contribute to lower operational costs and overall total cost of ownership (TCO) for AI networking deployments.
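
To put the 2-tier design referenced in the Network Platforms item in context, the sketch below works through the leaf/spine arithmetic for a non-blocking two-tier fabric built from a 576-port 800G spine (the 7800R4 figure quoted above) and 64-port 800G leaves (a 51.2Tbps fixed switch). The 50/50 split of leaf ports, the one-link-to-every-spine wiring and the function itself are illustrative assumptions, not an Arista sizing rule.

    # Illustrative leaf/spine arithmetic for a non-blocking 2-tier fabric.
    # Port counts come from this document; the sizing assumptions are examples
    # only, not an Arista design rule.

    def two_tier_accelerator_ports(spine_ports: int, leaf_ports: int) -> tuple[int, int]:
        """Return (max accelerator-facing ports, spines required) for a 2-tier
        leaf/spine fabric where each leaf splits its radix 50/50 between
        accelerators and spines and connects once to every spine."""
        downlinks_per_leaf = leaf_ports // 2          # leaf ports facing accelerators
        uplinks_per_leaf = leaf_ports - downlinks_per_leaf
        spines_needed = uplinks_per_leaf              # one uplink to each spine
        max_leaves = spine_ports                      # one spine port per leaf
        return max_leaves * downlinks_per_leaf, spines_needed

    # 576 x 800G spine ports with 64 x 800G leaves:
    ports, spines = two_tier_accelerator_ports(spine_ports=576, leaf_ports=64)
    print(f"{ports} accelerator ports across {spines} spines")   # 18432 ports, 32 spines

The 1152-port 400G spine configuration paired with 128 x 400G leaves works out the same way (1152 x 64 = 73,728 ports), again landing in the tens of thousands described above.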
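
As a concrete illustration of the streaming telemetry noted in the Visibility and Telemetry item, the minimal sketch below reads interface counters from a switch over gNMI using OpenConfig paths. It assumes the open-source pygnmi client, a device with gNMI enabled on port 6030 (the port EOS commonly uses for its OpenConfig gNMI service) and placeholder address and credentials; a production deployment would typically use TLS and a streaming subscription rather than a one-shot read.

    # Minimal sketch: one-shot read of interface counters over gNMI.
    # Assumes the open-source pygnmi client and a switch with gNMI enabled on
    # port 6030; the address and credentials below are placeholders.
    from pygnmi.client import gNMIclient

    TARGET = ("192.0.2.10", 6030)    # placeholder management address

    with gNMIclient(target=TARGET, username="admin", password="admin",
                    insecure=True) as gc:
        # OpenConfig path for per-interface packet and byte counters.
        result = gc.get(path=["/interfaces/interface/state/counters"],
                        encoding="json")
        for notification in result.get("notification", []):
            for update in notification.get("update", []):
                print(update["path"], update["val"])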

Arista’s Etherlink for Standards Compatibility

As the Ultra Ethernet Consortium (UEC) completes its extensions to improve Ethernet for AI workloads, Arista is building forward-compatible products to support UEC standards. The Arista Etherlink™ portfolio leverages standards-based Ethernet systems with a package of smart features for AI networks. These include dynamic load balancing, congestion control and reliable packet delivery to all NICs supporting RoCE. Arista Etherlink will be supported across a broad range of 400G and 800G systems based on EOS. As the UEC specification is finalized, Arista AI platforms will be upgradeable to be compliant.
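
To make concrete why dynamic load balancing matters for RoCE traffic, the toy example below shows the familiar failure mode of static, hash-based ECMP: a small number of long-lived, high-bandwidth flows can hash onto the same uplink while other links sit idle. The flow and link counts and the random placement model are purely illustrative and do not describe Arista’s load-balancing implementation.

    # Toy illustration of static ECMP hash collisions with a few large flows.
    # All numbers and the placement model are illustrative only.
    import random
    from collections import Counter

    random.seed(7)

    NUM_UPLINKS = 8      # size of the ECMP group
    NUM_FLOWS = 8        # a handful of elephant RoCE flows

    # Static ECMP pins each flow to one uplink based on a hash of its headers;
    # modelled here as a uniform random choice per flow.
    placement = Counter(random.randrange(NUM_UPLINKS) for _ in range(NUM_FLOWS))

    for uplink in range(NUM_UPLINKS):
        print(f"uplink {uplink}: {placement.get(uplink, 0)} flow(s)")

    # Even with as many flows as links, a static hash usually leaves some links
    # idle and stacks multiple flows on others; dynamic balancing instead places
    # traffic based on measured link load.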