The market is over-discounting software disruption, underestimating datacenter disruption, and still missing the new moat in a world where everyone has AI.
One of the best under-the-radar software investors around is Long Path. He has a very similar thesis, but explains it slightly differently. He's focused on non-US SMID companies to exploit the valuation arbitrage.
I'm curious if you think that this evolution will favor the larger companies who can invest around the interpretation layer? All else equal, doesn't it favor the bigger companies with more resources, versus the niche specialty companies (i.e. those that CSU would target)?
Coding used to be the bottleneck, and that is now no longer the case. We are moving from information arbitrage to cognitive arbitrage, and therefore having access to information within your firm is critical. I wrote about this in "IT Spaghetti" https://pancrasbeekenkamp1.substack.com/p/pers-view-it-spaghetti and "The Agentic Dilemma" https://pancrasbeekenkamp1.substack.com/p/pers-view-the-agentic-dilemma. The advantage of smaller (nimbler?) companies is that they can solve a niche problem quicker and cheaper. But for the buyer there is a cost of finding them, a risk of using them (nobody was ever fired for buying IBM), the question of long-term service, and the biggest issue: interoperability. Anthropic solves part of this issue with MCP, but these are still core reasons why I think this will not happen until the growth rate of innovation has slowed down considerably and we are in a steady state. And I think we are a long way away from that: companies now first need to solve their IT spaghetti, not add to it.
Pancras, really enjoyed reading your post. One pushback I have is on the depreciation of chips. I'm with Burry here. I used to work as an HPC sysadmin. When you run the CPUs/GPUs at 100% for three straight years, the wear and tear is real. At my job we always renewed the servers every 4 years: a) because of new tech, b) the wear and tear. All that heat generated by the chip has a real effect despite the cooling. Cheers.
I'm advocating that not only wear and tear, but especially tech obsolescence driven by the replacement cycle, should shorten the half-life and increase depreciation. Something few companies currently do.
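To illustrate the accounting point, here is a minimal straight-line depreciation sketch. All numbers are hypothetical (a made-up $10M GPU fleet); the point is just that shortening the assumed useful life from six years to a three-year replacement cycle roughly doubles the annual expense:

```python
# Hypothetical illustration: straight-line depreciation of a GPU fleet.
# The $10M fleet cost and the 6-year vs. 3-year lives are assumptions,
# not figures from any company's filings.

def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line depreciation: equal expense in each year of useful life."""
    return cost / useful_life_years

fleet_cost = 10_000_000  # hypothetical purchase price

slow = annual_depreciation(fleet_cost, 6)  # optimistic 6-year schedule
fast = annual_depreciation(fleet_cost, 3)  # 3-year replacement cycle

print(f"6-year schedule: ${slow:,.0f}/yr")                  # $1,666,667/yr
print(f"3-year schedule: ${fast:,.0f}/yr")                  # $3,333,333/yr
print(f"Annual expense understated by: ${fast - slow:,.0f}")
```

Under these assumptions, keeping a six-year schedule while actually replacing hardware every three years understates annual costs by roughly $1.7M on a $10M fleet, which is the gap the comment is pointing at.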
Thanks Pancras,
This is the best article on AI I have read (I have read a lot). Just excellent.
Thank you for sharing, lucid writing and thought provoking.