00:00 Speaker A
Big tech companies are set to report this week. We've got Amazon, Google, Meta, and Microsoft out on Wednesday, followed by Apple on Thursday. Gil Luria, D.A. Davidson's head of technology research, joins me now for this earnings report, brought to you by EY. Gil, I hope you've been in training, as we all have, to get ready for Wednesday and all these numbers coming out. Your preview note looking ahead frames this around a number of questions, and I want to go through those because I like the way you've framed it.
00:41 Speaker A
Your question number one has to do with whether AI compute demand has continued to accelerate over the last three months. It certainly feels like it has, because folks who are using these LLMs have at times been constrained in how much they can use them; the providers seem to be running out of compute in some cases.
01:14 Gil Luria
That's right. That's the one question I think we already know the answer to; we're just going to hear it from these companies on Wednesday. By the way, it's the biggest earnings day ever. To the best of my knowledge, these four largest companies have never reported on the same day, so we're going to learn a lot in a very short period of time. But yes, demand for AI compute has skyrocketed. It was already growing fast, but since agentic tools were introduced to the broad population and the broad enterprise market, it has taken off, because agentic tools take up so much more compute than any other use of AI. Demand has gotten to the point where Anthropic doesn't have enough compute to serve its models. It can't even give us access to its most advanced models because there's no way it would have the compute, which is why it's cutting these huge deals with Amazon and Google to get compute. Which is to say, there's a lot more demand for AI compute than we have capacity for right now.
02:40 Speaker A
Well, and it seems like one of the issues now, Gil, is that there is more resistance to adding that AI compute, right? On the ground, you've got a number of municipalities saying we don't want data centers here, and you've got regulators stepping in as well. So I assume they're going to have to address whether they're going to be able to spend all this big capex they've been earmarking.
03:17 Gil Luria
Right. That's the big question that I'm not sure we know the answer to, because there's been plenty of reporting over the last three months about delays in data centers, to your point: a lot of not-in-my-backyard opposition, a lot of regulation, a lot of constraints on the ability to access electricity. And so the four big builders of data centers are all, at one time, going to tell us: are there delays because of this? Are they going to be able to deliver on their plans for opening data centers this year? Will they be able to spend all the capex they've allocated? That has implications not just for them and their ability to ramp revenue, but for everybody else downstream: the neoclouds, the semiconductor companies, all the companies selling equipment into data centers. If these four companies can't build data centers at the rate they want, then everybody else won't be able to live up to their expectations.
04:47 Speaker A
Well, and another intriguing question you had is: okay, they've got all this money they're going to be spending at the same time that prices are going up for all the things they need to buy for the data centers, the ones that are going forward, memory prices included. So will they end up spending as much but getting a lesser result in terms of the number of data centers built, or the capacity that's built?
05:25 Gil Luria
Right. The irony is that they're encountering so many bottlenecks now, CPUs, memory, optical components, behind-the-meter energy, and they're getting through these bottlenecks by just paying their way through them. We heard from Intel last week that it had previously discarded CPUs because they weren't good enough; it had to take them out of the trash and sell them to customers. That's how desperate customers are for its CPUs. And obviously, Intel was charging even more for its full-priced CPUs. We know the memory companies have had skyrocketing results and skyrocketing prices, and it's the hyperscalers that are paying for it. So to your point, ironically, they may still spend what they were planning, $200 billion a year each, but just not get as much as they were hoping for, because they have to pay so much more to get through these bottlenecks.
06:47 Speaker A
Gil, it also feels like the other outstanding question, the one we're still not going to get answered, is the ROI question: what is the return on all these investments? We've gotten little bits and pieces from some of these companies, but it doesn't feel like we're going to get a very satisfying answer this earnings season either.
07:12 Gil Luria
I think we're getting that answer elsewhere. I think we're getting it from OpenAI and Anthropic, who, at least as of a couple of weeks ago, were at a combined run rate of more than $50 billion. And that's before they've optimized on monetization. What I mean by that is Anthropic is still not price discriminating; it's still not able to serve all the demand because it doesn't have enough compute. And OpenAI hasn't ramped up advertising in ChatGPT yet. So those two are already at a $50 billion a year run rate before they optimize, that number is still growing, and their models are still getting better. That's the main place where we see there really is true demand for what AI can do. And again, those two companies then turn around and spend that money with Microsoft, Amazon, and Google, who then turn around and spend that money with Nvidia, AMD, Broadcom, and Intel. So we are getting some returns now, and we can see a path for those returns to get a lot bigger than they are right now.