Dell reported earnings after the market close Thursday, beating both earnings and revenue estimates, but its results suggest AI uptake among its enterprise and tier-2 cloud service provider customers is slower than expected.
Dell’s stock fell 17.78% in after-hours trading after posting a 5.18% loss during the regular trading session, but is still up 86.79% year to date.
“Data is the differentiator, 83% of all data is on-prem, and 50% of data is generated at the edge,” said Jeff Clarke, Dell’s COO, on the earnings call. “Second, AI is moving [closer] to the data because it’s more efficient, effective and secure, and AI inferencing on-prem can be 75% more cost effective than the cloud.”
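A claim like that 75% figure depends entirely on the assumptions behind it, chiefly utilization and amortization. Here is a minimal back-of-envelope sketch of the kind of math involved; every input below is a hypothetical assumption for illustration, not a number from the earnings call:

```python
# Back-of-envelope comparison of cloud vs. on-prem GPU inference cost.
# All figures are illustrative assumptions, not Dell's numbers.

CLOUD_RATE_PER_GPU_HOUR = 4.00   # assumed on-demand cloud GPU price
SERVER_PRICE = 250_000           # assumed price of an 8-GPU server
AMORTIZATION_YEARS = 3           # typical depreciation window
POWER_AND_OPS_PER_YEAR = 30_000  # assumed power, cooling, and admin cost
GPUS_PER_SERVER = 8
UTILIZATION = 0.70               # fraction of hours doing useful work

gpu_hours_per_year = GPUS_PER_SERVER * 24 * 365 * UTILIZATION
onprem_cost_per_year = SERVER_PRICE / AMORTIZATION_YEARS + POWER_AND_OPS_PER_YEAR
onprem_rate = onprem_cost_per_year / gpu_hours_per_year

print(f"cloud:   ${CLOUD_RATE_PER_GPU_HOUR:.2f}/GPU-hour")
print(f"on-prem: ${onprem_rate:.2f}/GPU-hour")
print(f"savings: {1 - onprem_rate / CLOUD_RATE_PER_GPU_HOUR:.0%}")
```

With these particular assumptions the on-prem discount comes out closer to 40%; sustained high utilization is what makes a figure like Clarke’s 75% plausible.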
Dell’s current AI strategy rests on the key presumption that enterprises will need to deploy infrastructure on-premises instead of in the cloud to take advantage of close proximity to data. If this seems familiar, it should. The company used almost exactly the same play during the Great Cloud Wars.
Back then, it was believed enterprises would want the agility of cloud services, but the control of owning their own infrastructure.
In the end, those purported benefits proved insufficient to resist the inexorable pull of hyperscale clouds for most companies.
The question that lost Dell $10B in market cap
Toni Sacconaghi, an analyst with Bernstein, picked apart Dell’s narrative on AI servers: “So really, the only thing that changed was you added $1.7 billion in AI servers, and operating profit was flat. So does that suggest that operating margins for AI servers were effectively zero?” Ouch.
Yvonne McGill, Dell’s CFO, quickly weighed in: “Those AI-optimized servers, we’ve talked about being margin rate dilutive, but margin dollar accretive.”
That was CFO-speak for “you’re totally right, Toni; we’re making very little profit on those AI servers right now, but not to worry.”
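To make McGill’s distinction concrete: a low-margin product line can drag down the company-wide margin percentage while still adding profit dollars. A quick sketch with hypothetical figures (not Dell’s actual numbers):

```python
# "Margin rate dilutive, but margin dollar accretive" in arithmetic.
# Hypothetical figures for illustration only.

base_revenue, base_margin_rate = 20.0e9, 0.10  # assumed core business
ai_revenue, ai_margin_rate = 1.7e9, 0.02       # assumed low-margin AI servers

base_profit = base_revenue * base_margin_rate
ai_profit = ai_revenue * ai_margin_rate
blended_rate = (base_profit + ai_profit) / (base_revenue + ai_revenue)

print(f"margin rate:    {base_margin_rate:.1%} -> {blended_rate:.1%} (dilutive)")
print(f"margin dollars: +${ai_profit / 1e6:.0f}M (accretive)")
```

Sacconaghi’s point is that in the quarter just reported, the margin dollars barely moved either, implying the incremental margin rate on those servers was near zero.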
This is the tried-and-true tactic Dell has used successfully for decades: sell a loss-leading product on the assumption that it will drag in higher-margin gear immediately or in the near future.
Operationally, it is much easier for customers to deal with a single vendor for purchase and ongoing support, and the drag effect is quite real.
Specifically, Dell’s margins on networking and storage gear are significantly higher, and those solutions are likely to be bundled with these AI servers. As Jeff Clarke noted: “These [AI] models that are being trained require lots of data. That data has got to be stored and fed into the GPU at a high bandwidth, which ties in networking.”
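Clarke’s bandwidth point is easy to quantify: the time to stream a training corpus scales inversely with link speed. A toy sketch, with the dataset size and link speeds chosen purely as assumptions:

```python
# How long one full pass over a large training corpus takes
# at different network speeds. All sizes and speeds are assumed.

DATASET_BYTES = 1e15  # assume a 1 PB training corpus

for gbps in (10, 100, 400):
    bytes_per_sec = gbps * 1e9 / 8               # link speed in bytes/second
    hours = DATASET_BYTES / bytes_per_sec / 3600
    print(f"{gbps:>3} Gbps link: {hours:,.0f} hours per full pass")
```

At 10 Gbps a single pass takes over nine days; at 400 Gbps it is an afternoon, which is why storage and networking get bundled with GPU servers.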
Why enterprise AI adoption is still slow
Jeff Clarke’s further remarks give us some clues about the problems stalling enterprise AI adoption.
First and foremost, customers are still actively trying to figure out where and how to apply AI to their business problems, so there is a significant services and consultative-selling component to Dell’s AI solutions.
“Consistently across enterprise, there are 6 use cases that make their way to the top of most every discussion,” said Clarke. “It’s around content creation, support assistance, natural language search, design and data creation, code generation and document automation. And helping customers understand their data, how to prepare their data for those use cases are what we’re doing today.”
That last statement is especially revealing because it suggests just how early AI projects still are across the board.
It also points at something Clarke isn’t saying directly, which is that AI is still incredibly complicated for the average customer. The data processing, training, and deployment pipeline still works like a fragile Rube Goldberg machine and requires a lot of time and expertise to gain the promised value. Even just knowing where to start is a problem.
Let’s not forget that enterprises faced similar challenges during the Great Cloud Wars, where complexity was a barrier to on-prem cloud deployments. An entire cohort of startups emerged to solve those complexity problems and replicate the functionality of public clouds on-premises. Most burnt to ashes when the public clouds showed up with their own on-prem solutions, AWS Outposts and Azure Stack.
Then, as now, there was the problem of talent. It took an entire decade for cloud skills to diffuse throughout the technical workforce, and the slow process of cloud migration is still underway even now.
Today’s AI stack is even more complicated, requiring even deeper domain expertise, another problem hyperscale clouds are well positioned to solve through tools and automation deeply integrated with their infrastructures.
Back in the Cloud Wars, vendors also touted the lower costs of on-prem infrastructure, which could even hold true in some cases at scale.
Ultimately, economics prevailed for most enterprises: the argument for cheaper infrastructure paled next to the benefits of eliminating operational cost and complexity and bridging the skills gap.
Even for enterprises that are ready to take on the challenges now, there are supply constraints to overcome. In effect, companies are competing for the same Nvidia GPUs that hyperscale and tier-2 cloud providers are purchasing at scale.
In that regard, Dell is a truly massive buyer with an excellent track record of balancing the supply of difficult-to-source components across many customers. Even so, Dell customers can expect long lead times for GPU servers right now.
Dell is playing a long game — but the cloud providers might win first
While enterprise AI adoption is still in the early stages, Dell is playing for keeps.
The company is betting that the need for on-premises AI infrastructure, especially for latency-sensitive inference workloads, will prove compelling enough for enterprises to invest despite the complexity and skills challenges.
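The latency case is mostly propagation delay plus network overhead. A rough sketch, treating the distances and per-hop overhead as assumed values:

```python
# Rough round-trip network latency: on-prem vs. distant cloud regions.
# Distances and overhead are illustrative assumptions.

LIGHT_SPEED_FIBER_KM_S = 200_000  # light travels at ~2/3 c in optical fiber

def rtt_ms(distance_km: float, overhead_ms: float = 2.0) -> float:
    # Propagation delay both ways, plus assumed routing/queuing overhead.
    return 2 * distance_km / LIGHT_SPEED_FIBER_KM_S * 1000 + overhead_ms

print(f"on-prem  (~1 km):    {rtt_ms(1):.1f} ms")
print(f"regional (~500 km):  {rtt_ms(500):.1f} ms")
print(f"distant  (~4000 km): {rtt_ms(4000):.1f} ms")
```

For chatty, real-time inference loops, tens of milliseconds per round trip add up, which is the physics behind the on-prem and edge argument.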
The strategy hinges on helping enterprises overcome the barriers to AI adoption, even if it means sacrificing near-term margins on GPU servers.
In doing so, Dell is leveraging its decades of experience in solving complex infrastructure challenges for customers, and its massive scale to keep component supply flowing.
It remains to be seen whether the data problem and allure of edge computing for AI will be enough to overcome the inexorable pull of the cloud this time around.
The next few quarters will tell us whether Dell’s strategy is really working, but the game might already be rigged: cloud providers are already fielding numerous enterprise AI offerings that run virtually, with little need for specific equipment on the customer side.