The rapidly evolving landscape of artificial intelligence (AI) is on the cusp of a significant breakthrough, with enterprises predicted to invest billions in AI technologies. DDN (DataDirect Networks), a prominent player in storage solutions for high-performance computing (HPC) and AI, is at the forefront of this revolution. The company has a long history of providing storage capable of supporting massive compute clusters, a capability that becomes increasingly crucial as AI demands escalate.
Rising Demand in AI and HPC
As the adoption of AI technologies expands, so does the need for robust storage solutions. Paul Bloch, co-founder and president of DDN, predicts that the demand for storage to support HPC environments will not only increase but potentially triple or quadruple by 2027. Currently, approximately 200,000 GPUs are deployed in Elon Musk’s xAI cluster in Memphis; this number is expected to surge to one million. Bloch emphasizes that the growing performance capabilities of GPUs are driving escalating data storage requirements, necessitating real-time data ingestion for effective processing.
During a recent discussion at the IT Press Tour in Silicon Valley, DDN’s CTO, Sven Oehme, pointed out that while AI has not yet reached its full potential, there is a clear understanding among enterprises of the transformative possibilities it presents. “The boom will really come when enterprises fully launch into inference and exploit AI in all sectors of activity,” Oehme stated, highlighting the optimistic trajectory of AI integration into various industries.
DDN’s valuation has grown dramatically and is currently estimated at $5 billion, with projected revenues reaching $1 billion by 2025. The company secured a $300 million investment from Blackstone this year, with Blackstone taking a 6% equity stake and a board seat in return. Bloch noted that this partnership will give DDN access to customers outside its traditional market. He emphasized that companies unknown today could invest between $1 billion and $2 billion in AI, influencing entire sectors, particularly in industrial applications.
Innovative Storage Solutions for Enterprises
To meet the burgeoning demands of AI, DDN has made significant updates to its core product line, particularly with the introduction of the EXAScaler AI400X3. The new system features 24 NVMe SSDs, with support for additional SSD shelves via NVMe over TCP, and offers 55% faster reads and more than 70% faster writes, with throughput of 140 GBps for reads and 75 GBps to 100 GBps for writes. The enhanced performance is attributed to four Nvidia BlueField cards that leverage Nvidia’s Spectrum-X networking.
“We’ve already got a reference architecture for their new Blackwell GPU,” noted Oehme, reinforcing DDN’s strategic positioning in the rapidly advancing AI landscape. The EXAScaler AI400X3 also brings significant improvements in security and resource pooling, benefiting managed service providers (MSPs) that require dynamic provisioning for varied client needs. The new system’s data compression lets enterprises use storage more efficiently, extracting more usable capacity from the same raw capacity.
One of the standout features of the EXAScaler AI400X3 is its enhanced resilience, which allows hardware or software components to be replaced without shutting down the entire array. Oehme explained, “The risk of downtime is reduced to a minimum. That’s fundamental because enterprises can’t entertain any disruption.” This differs from HPC customers, who often prefer to troubleshoot issues independently before reaching out for support.
Accelerated Object Storage with Infinia 2.1
Recognizing the shift toward object storage among enterprise AI users, DDN also launched an updated version of its object storage product, Infinia 2.1. DDN claims the new release is up to 100 times faster than previous iterations. Infinia 2.1 competes directly with other object storage solutions, such as Cloudian’s HyperStore and Scality’s Ring.
Compared with AWS’s S3 Express, Infinia 2.1 is designed to deliver access times 10 times faster and request processing 25 times faster. The product supports a range of AI software stacks, including TensorFlow, PyTorch, and Apache Spark. The update also adds integrations with observability platforms such as Datadog and Chronosphere, as well as Hadoop-based data lakes, enhancing functionality for enterprises looking to streamline their AI workflows.
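As an illustration of what such framework support can look like in practice, here is a minimal sketch that streams training samples from an S3-compatible object store into a PyTorch DataLoader via boto3. The endpoint URL, bucket name, object keys, and layout are hypothetical placeholders rather than documented Infinia parameters; any S3-compatible endpoint could be swapped in.

```python
# Minimal sketch: streaming training samples from an S3-compatible object store
# into PyTorch. Endpoint, bucket, and object keys are hypothetical placeholders.
import io

import boto3
import torch
from torch.utils.data import DataLoader, Dataset


class S3TensorDataset(Dataset):
    """Each object in the bucket is assumed to hold one torch.save()-serialized sample."""

    def __init__(self, endpoint_url, bucket, keys):
        self.endpoint_url = endpoint_url
        self.bucket = bucket
        self.keys = keys
        self._client = None  # created lazily so every DataLoader worker builds its own

    def _s3(self):
        if self._client is None:
            self._client = boto3.client("s3", endpoint_url=self.endpoint_url)
        return self._client

    def __len__(self):
        return len(self.keys)

    def __getitem__(self, idx):
        # One GET per sample; real pipelines would batch or cache, but the
        # point is simply that the store speaks plain S3.
        body = self._s3().get_object(Bucket=self.bucket, Key=self.keys[idx])["Body"].read()
        return torch.load(io.BytesIO(body))


if __name__ == "__main__":
    dataset = S3TensorDataset(
        endpoint_url="https://infinia.example.internal:9000",  # hypothetical endpoint
        bucket="training-data",                                 # hypothetical bucket
        keys=[f"samples/{i:06d}.pt" for i in range(1_000)],     # hypothetical keys
    )
    loader = DataLoader(dataset, batch_size=32, num_workers=4)
    for batch in loader:
        pass  # training step would go here
```

Spark and TensorFlow pipelines would typically be wired up the same way, by pointing their S3 connectors at the alternative endpoint instead of AWS.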
In addition to Infinia 2.1, DDN introduced two other hardware products: xFusionAI and Inferno. The xFusionAI device integrates with the EXAScaler AI400X3 and combines the Lustre file system with Infinia 2.1, allowing users to access the same data through S3 object storage, which offers better metadata indexing capabilities. Inferno acts as a network proxy that accelerates data traffic between object storage and Lustre file storage, enhancing overall efficiency.
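To make the idea of dual access concrete, the short sketch below reads the same payload once through a POSIX path on a Lustre mount and once through the S3 API. The mount point, endpoint, bucket, and the mapping between object keys and file paths are assumptions made for the illustration, not documented DDN behaviour.

```python
# Illustration only: reaching the same dataset through two interfaces.
# The Lustre mount point, S3 endpoint, bucket, and key-to-path mapping are
# assumptions made for this example, not documented DDN behaviour.
import boto3

LUSTRE_MOUNT = "/mnt/exascaler"                        # hypothetical POSIX mount
S3_ENDPOINT = "https://infinia.example.internal:9000"  # hypothetical S3 endpoint
BUCKET = "datasets"
KEY = "climate/run-042/checkpoint.bin"


def read_via_posix() -> bytes:
    # File-oriented access: the Lustre client presents the data as ordinary
    # files, which is what existing HPC codes expect.
    with open(f"{LUSTRE_MOUNT}/{BUCKET}/{KEY}", "rb") as f:
        return f.read()


def read_via_s3() -> bytes:
    # Object-oriented access: the same payload fetched over the S3 API, where
    # richer metadata indexing and tagging become available.
    s3 = boto3.client("s3", endpoint_url=S3_ENDPOINT)
    return s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()


if __name__ == "__main__":
    assert read_via_posix() == read_via_s3(), "both paths should return the same bytes"
```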
The Growth of DDN: Major Milestones and Client Base
DDN currently supports an impressive 700,000 GPUs across 7,000 customers, with approximately 4,000 active in AI-focused projects. The company commenced operations in 1998, providing storage solutions for early supercomputer centers and has since built a solid foundation in parallel computing. DDN’s pioneering efforts caught the attention of Nvidia, culminating in a partnership that has flourished over the last decade.
Bloch encapsulated the scale of DDN’s capabilities by stating, “Imagine 10,000 GPUs. If you wanted to get 1 GBps on reads per GPU, you’d need throughput of 10 TBps for the whole array. There are few systems in the world capable of providing 10 TBps on reads and on writes. That’s what we have mastered.” This statement underscores DDN’s commitment to high-performance solutions essential to the future of AI technology.
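Bloch’s back-of-envelope math can be checked against the AI400X3 figures quoted earlier; the appliance count below is an inference from those published throughput numbers, not a DDN sizing recommendation.

```python
# Back-of-envelope check of Bloch's numbers, using the AI400X3 read figure above.
GPUS = 10_000
READ_PER_GPU_GBPS = 1        # 1 GBps of read bandwidth per GPU, as in the quote
AI400X3_READ_GBPS = 140      # per-appliance read throughput cited earlier

aggregate_gbps = GPUS * READ_PER_GPU_GBPS             # 10,000 GBps = 10 TBps
appliances = -(-aggregate_gbps // AI400X3_READ_GBPS)  # ceiling division -> 72

print(f"Aggregate read requirement: {aggregate_gbps / 1_000:.0f} TBps")
print(f"AI400X3 appliances needed on reads alone, with no headroom: {appliances}")
```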