Old 01-27-2025, 01:12 PM   #216
Hondaracer
This is commentary from an analyst who works for my investment advisor:


Some of our tech names and industrials are being hit on the release of DeepSeek's new AI model, which has made some technical leaps in efficiency. This is a quick summary of what we know so far and some of the potential implications. We hope to gain more clarity as MSFT, GOOG, AWS, and META all report over the next couple of weeks.

Here is what we know so far:
• DeepSeek's models are considered to be on par with top-performing models, achieving performance comparable to leading closed-source models like GPT-4o and Claude-3.5-Sonnet.
• DeepSeek offers its API at a much cheaper price point for users. Specifically, it is cited as being 96% cheaper than some alternatives.
• Training the DeepSeek model cost $5.6mm. As a comparator, Meta's Llama cost well over $60mm.
• So DeepSeek is roughly 90-95% cheaper to operate than current leading models.
• DeepSeek is open source. Any of the hyperscalers can pick it up and run it (which addresses concerns about sending data to China).
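A quick back-of-envelope check of the cost figures cited above (the dollar amounts are the ones quoted in the commentary; Meta's "well over $60mm" is treated as a lower bound, so the savings shown are a floor):

```python
# Sanity-check the cost figures quoted in the commentary.
deepseek_training_cost = 5.6e6   # $5.6mm, as cited for DeepSeek
llama_training_cost = 60e6       # "well over $60mm" for Meta's Llama (lower bound)

savings = 1 - deepseek_training_cost / llama_training_cost
print(f"Training cost reduction vs. Llama: at least {savings:.0%}")
```

That works out to at least ~91% cheaper on training alone, which is consistent with the commentary's "roughly 90-95% cheaper to operate" framing.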
Here is what we are thinking:
• The hyperscalers (GOOG, MSFT, AWS) seem like potential winners here. They can implement a cheaper model, incorporate the techniques into their existing models, and provide AI at a much lower cost. Pricing may have to come down as well, but AI-specific revenue is a small part of revenue compared to the rest of their businesses.
• If DeepSeek's math turns out to be true, the Hyperscalers have just been gifted a significant amount of capacity. 10 Blackwells can now do the work of 100 or more. There could be a risk that they have too much capacity in the short term and may rethink their capital spending.
• That is putting pressure on the semiconductor companies and the companies tied to datacenter buildouts (NVDA, AVGO, ETN, APH).
• If capex does slow, semiconductor companies will slow their own capex spending which could negatively impact the semi equipment companies like AMAT.
• But we don't know if capex is at risk until we hear from GOOG, META, AWS, and MSFT's earnings calls. Meta recently came out with a $60-65b estimate for 2025 capex, which was much higher than analyst expectations. Will they stick to it on their call?
• At the end of the day, this innovation is positive. AI is becoming cheaper, and we would expect demand for AI to respond accordingly. For instance, Apple will be able to integrate AI into Siri in a cost-effective manner. However, there could be some short-term disruption/noise as hyperscalers potentially find themselves underutilized given the large efficiency gains the new models bring.
What we'll be looking for on earnings calls:
• Capex spending intentions.
• New models in response to DeepSeek.
• Intentions to deploy DeepSeek on AWS, GOOG, Azure.
• Any change in intentions on which types of chips to use.
• Pricing / revenue implications for API access.
__________________
Dank memes cant melt steel beams