The Critical Role of Networking Infrastructure in AI Innovation

BNP Paribas Exane research explores the role of networking infrastructure in advancing AI innovation.

As artificial intelligence (AI) continues to transform industries, the critical role of networking infrastructure in supporting AI development is often overlooked. While much attention has been given to advances in compute power and storage, the importance of networking is becoming increasingly apparent, especially with the rise of Large Language Models (LLMs) and AI inferencing.

BNP Paribas Exane research sheds light on how networking is an underappreciated but crucial area of AI investment. As AI models grow in complexity, so does the demand for networking infrastructure capable of supporting them.

The evolving landscape of AI networking infrastructure

The scaling of AI models and inferencing workloads has led to a shift in procurement strategies, particularly for hyperscalers and large enterprises that now prioritise networking infrastructure. This shift is evident in the growing demand for AI-enabled back-end networks and traditional front-end networks. Supporting AI clusters that span thousands of compute nodes requires robust networking systems, and investors are beginning to take note.
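To make that scale concrete, the back-of-the-envelope sketch below sizes the switching layer of a hypothetical back-end fabric. It assumes a two-tier, non-blocking leaf-spine design, 64-port switches and one back-end NIC port per accelerator; these parameters are illustrative assumptions, not figures from the research.

```python
# Back-of-the-envelope sizing of a non-blocking two-tier leaf-spine
# back-end fabric. All figures are illustrative assumptions, not
# numbers from the BNP Paribas Exane research.
import math

def leaf_spine_switch_count(accelerators: int, switch_ports: int = 64) -> dict:
    """Estimate switch counts for a two-tier non-blocking fabric.

    Assumes one back-end NIC port per accelerator and that each leaf
    switch splits its ports evenly between accelerators (down) and
    spine switches (up).
    """
    down_ports = switch_ports // 2             # leaf ports facing accelerators
    max_endpoints = switch_ports * down_ports  # two-tier non-blocking limit
    if accelerators > max_endpoints:
        raise ValueError(
            f"{accelerators} endpoints exceed the two-tier limit of "
            f"{max_endpoints}; a third switching tier would be needed."
        )
    leaves = math.ceil(accelerators / down_ports)
    # Each leaf's uplinks (down_ports of them) are spread across the spines.
    spines = math.ceil(leaves * down_ports / switch_ports)
    return {"leaf_switches": leaves, "spine_switches": spines,
            "total_switches": leaves + spines}

# Example: a 2,048-accelerator pod on 64-port switches.
print(leaf_spine_switch_count(2_048))
# {'leaf_switches': 64, 'spine_switches': 32, 'total_switches': 96}
```

Even at this modest pod size, the toy example needs roughly one switch for every twenty accelerators, which is why cluster growth feeds directly into networking demand.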

BNP Paribas Exane estimates that the Total Addressable Market (TAM) for AI networking could amount to around 25% of total AI spending on accelerators. Furthermore, data-center switch sales could nearly double, and sales of back-end switches could even quadruple, over the next few years.

As AI technology evolves, the importance of networking infrastructure continues to grow, presenting unique opportunities for businesses and investors alike.

Karl Ackerman, Managing Director, Semiconductors & IT Hardware, BNP Paribas Exane

The battle for AI networking supremacy: Ethernet vs. InfiniBand

At the heart of AI networking are two prominent technologies: Ethernet and InfiniBand. InfiniBand has historically dominated AI networking because of its ability to meet the stringent demands of back-end AI networks. Ethernet, however, is catching up fast: thanks to its inherent flexibility, Ethernet users are closing the performance gap and could even surpass InfiniBand in certain areas.

BNP Paribas Exane finds that the Ethernet ecosystem is maturing rapidly. The industry is preparing smart NICs (Network Interface Cards) with flexible ordering capabilities that will support packet spraying, a key innovation targeted for release by the second half of 2025.
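As a rough illustration of what packet spraying with flexible ordering involves, the toy sketch below sprays a single flow's packets across several equal-cost paths and has the receiving side buffer and reorder them by sequence number before delivery. The path count, latencies and reorder logic are assumptions for illustration, not a description of any specific vendor's NIC.

```python
# A minimal sketch of packet spraying with receive-side reordering,
# the behaviour that flexible-ordering smart NICs are meant to support.
# Path latencies and the reorder buffer are illustrative assumptions.
import heapq
import random

def spray(packets, num_paths=4):
    """Spread a flow's packets round-robin across equal-cost paths and
    return (arrival_time, seq, payload) tuples, possibly out of order."""
    path_latency = [random.uniform(1.0, 3.0) for _ in range(num_paths)]
    arrivals = []
    for seq, payload in enumerate(packets):
        path = seq % num_paths                  # per-packet load balancing
        jitter = random.uniform(0.0, 0.5)
        arrivals.append((path_latency[path] + jitter, seq, payload))
    arrivals.sort()                             # delivery order on the wire
    return arrivals

def reorder(arrivals):
    """Receive-side logic: buffer out-of-order packets and release them
    to the application strictly in sequence."""
    buffer, next_seq, delivered = [], 0, []
    for _, seq, payload in arrivals:
        heapq.heappush(buffer, (seq, payload))
        while buffer and buffer[0][0] == next_seq:
            delivered.append(heapq.heappop(buffer)[1])
            next_seq += 1
    return delivered

flow = [f"chunk-{i}" for i in range(8)]
assert reorder(spray(flow)) == flow   # in-order delivery despite spraying
```

Spraying individual packets rather than whole flows spreads load more evenly across the fabric, at the cost of out-of-order arrival; handling that reordering efficiently is precisely the gap the flexible-ordering NICs are intended to close.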

The future of AI networking infrastructure

As the technology matures, BNP Paribas Exane anticipates that Ethernet will capture an increasing share of the AI networking market. By 2027, BNP Paribas Exane projects that Ethernet will command 46% of AI workload networking, a significant leap that underscores the growing importance of networking in the AI ecosystem.

As AI continues to push the boundaries of technological innovation, the demand for sophisticated networking solutions will likely intensify.

Yang Pu, Senior Equity Research Associate, BNP Paribas Exane

BNP Paribas Exane remains committed to providing insights and guidance on how businesses can leverage these emerging trends to stay ahead in a rapidly evolving digital landscape.
