Of the many trends shaping cloud and communications infrastructure in 2024, none looms as large as AI. In the networking markets specifically, AI will influence how infrastructure is built to support AI-enabled applications.
AI workloads have characteristics that set them apart from previous cloud infrastructure. In general, training large language models (LLMs) and other AI applications requires extremely low latency and very high bandwidth.
Generative AI (GenAI), which creates text, images, sounds, and other output from natural language queries, is driving new computing trends toward highly distributed and accelerated platforms. These new environments require a complex and powerful underlying infrastructure, one that addresses the full stack of functionality, from chips to specialized networking cards to distributed, high-performance computing systems.

This has raised the profile of networking as a key element of the “AI stack.” Networking leaders such as Cisco have seized on the theme in marketing materials and investor conference calls. It was even one of the featured topics in HPE’s recently announced $14 billion deal to acquire Juniper Networks; HPE executives said the deal underscores the growing importance of networking in the AI cloud world.
The AI networking companies that have drawn the most investor interest so far have been Nvidia, which offers a full stack of networking elements including its BlueField networking platform, and Arista Networks, which has attracted extraordinary investor interest for its role as a key supplier to AI providers such as Microsoft. There are also numerous interesting private companies in this market, which we’ll detail in a bit.
Link to the original article:
https://www.forbes.com/sites/rscottraynovich/2024/01/23/what-ai-means-for-networking-infrastructure-in-2024/?sh=1c5f30371c9e