Fortinet and Arista introduce an “AI-Ready” architecture combining low-latency networking and Zero Trust security

Fortinet and Arista Networks have announced a joint solution for AI-focused data centers aimed at tackling one of today's biggest bottlenecks: how to scale GPU clusters without turning security into a hindrance, or the network into a single point of failure. The proposal, dubbed Fortinet Secure AI Data Center, was developed in collaboration with Arista and, according to both companies, has already been deployed at Monolithic Power Systems (MPS) as a reference implementation.

The core idea is simple to explain but complex to execute: combining “the best of two worlds”—high-speed switching and load balancing on one side, and high-performance security on the other—within a validated design that serves as a blueprint to deploy and operate AI infrastructure with less friction. In a context where many organizations are finding that the leap to training and inference workloads requires not only GPUs but also ultra-stable networks, bottleneck-free storage, and finer security controls, the move sends a clear message: the race for the “AI data center” is no longer just about compute, it’s about full architecture.


A multi-vendor “blueprint”: less dependence, more predictability

Fortinet frames the announcement as an evolution of its Secure AI Data Center Framework, now strengthened with multi-vendor integration alongside Arista. In practical terms, the message is that companies can deploy a modular architecture, rather than a stack of proprietary parts that are difficult to mix, while keeping performance guarantees and a more coherent Zero Trust strategy.

In summary, Fortinet highlights four main advantages of the solution:

  • Best-of-breed modular design, built for flexibility and long-term return.
  • Hyper-scale level performance, geared towards training and inference.
  • Zero-touch provisioning with deployments “up to 80% faster” (according to the announcement itself).
  • Future-proof integration to accommodate new accelerators without redesign from scratch.

In other words, the goal is to reduce improvisation: when the data center becomes a token factory, operational stability and repeatability are as important as the hardware.

The most striking technical point: moving TLS away from the CPU to give it back to AI

One of the most aggressive aspects of the announcement lies in Fortinet’s approach to encrypted traffic. The company claims that its ASIC allows offloading HTTPS/TLS and achieving “up to 33 times” the performance with latency “below one microsecond,” freeing the server CPU for tasks closer to actual inference work (e.g., model execution, planning, data pipelines).

In everyday terms: if the CPU spends cycles on cryptography and inspection, it competes for cache and memory with processes feeding inference. In a high-density cluster, this resource contention can lead to jitter, queues, and latency spikes that degrade the experience (or directly increase the cost per inference). Fortinet asserts that its approach reduces network contention and fine-tunes tail latencies under load.
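To see why tail latency (rather than the median) is the metric that suffers here, consider a minimal sketch with hypothetical numbers (illustrative only, not figures from the announcement): a small fraction of requests hitting a CPU stall barely moves p50 but dominates p99.

```python
import random

def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

random.seed(42)

# Baseline: stable ~2 ms inference latency with small noise.
baseline = [2.0 + random.gauss(0, 0.05) for _ in range(10_000)]

# Contended: same median, but 2% of requests hit a 10 ms stall
# (e.g., the CPU is busy with crypto/inspection instead of the data pipeline).
contended = [lat + (10.0 if random.random() < 0.02 else 0.0) for lat in baseline]

for name, s in [("baseline", baseline), ("contended", contended)]:
    print(f"{name}: p50={percentile(s, 50):.2f} ms  p99={percentile(s, 99):.2f} ms")
```

The median is nearly identical in both runs, while p99 jumps by roughly the full stall duration; this is the "tail latency under load" effect the paragraph describes.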

Arista provides the highway; Fortinet handles access controls and inspection

In the described setup, Arista contributes its low-latency, high-performance network, with features like load balancing for clusters, while Fortinet offers ASIC-accelerated firewalls, Zero Trust segmentation, encrypted traffic inspection, and automated response, focusing on AI workloads.
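The announcement does not detail policy syntax, but the Zero Trust segmentation model it references can be sketched generically: default-deny between segments, with a narrow explicit allow-list. All segment names and ports below are illustrative, not Fortinet or Arista configuration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    src_segment: str   # where the flow originates
    dst_segment: str   # where it is headed
    port: int          # destination port

# Explicit allow-list; anything not matched is denied (default-deny).
ALLOW = {
    Rule("inference-api", "gpu-cluster", 8443),   # serve requests to models
    Rule("gpu-cluster", "object-storage", 9000),  # pull checkpoints and data
}

def is_allowed(src: str, dst: str, port: int) -> bool:
    """Zero Trust default-deny: a flow passes only if an explicit rule matches."""
    return Rule(src, dst, port) in ALLOW

print(is_allowed("inference-api", "gpu-cluster", 8443))  # True: explicitly allowed
print(is_allowed("gpu-cluster", "inference-api", 8443))  # False: no reverse rule
```

The point of the sketch is the asymmetry: lateral movement between segments fails unless someone has written a rule for that exact flow, which is what distinguishes segmentation from a flat, implicitly trusted network.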

The narrative aligns with a growing trend: the AI data center tends to be heterogeneous, highly sensitive to latency, and to expose a more complex attack surface (data, models, pipelines, internal APIs, third-party access, separate training and inference environments, and so on). That is why the announcement emphasizes the convergence of networking and security: in AI, "speeding up" without controls can be costly.

A real-world example: Monolithic Power Systems as showcase

The announcement notes that the architecture has been deployed at Monolithic Power Systems (MPS) and includes statements from both parties. Arista highlights that combining "best-in-class" cybersecurity with high-speed networking is key for AI applications, while MPS emphasizes operating high-density GPU clusters with greater confidence.

The key takeaway for outside observers is that, although the text is corporate, anchoring the proposal to a concrete deployment helps position it as more than just a “slide architecture.” The question, as always, is how many organizations will adopt it as-is, and how many will see it as a starting point.

What’s at stake: scaled AI without “lock-in” and fewer surprises

Fortinet suggests that many AI projects stall due to cost, complexity, lack of skilled personnel, and risks (from data leaks to model manipulation). While the exact figures vary by source and project type, the industry's diagnosis has been consistent: AI projects fail not only because of the model, but because of everything surrounding it (data, security, network, observability, operations).

In that context, the Fortinet–Arista partnership targets a space where a lot of money is being decided: ready-to-deploy blueprints that promise to accelerate time-to-value without turning the data center into a perpetual laboratory.

via: fortinet
