AI changes the world. But how?
What will it look like?
It’s a tough one, but to predict the AI future, we first have to go back.
We have to go back to the 1950s and a prior tech revolution that changed the world. Because it’s the same pattern for AI.
How Malcolm McLean changed the World.
On April 26, 1956, Malcolm McLean’s trucking company loaded 58 shipping containers onto the Ideal-X in New Jersey. Five days later they were unloaded in Houston, making it the first significant container shipping event in history.
Shaking off the sceptics, McLean’s companies had pressed on: building prototype containers, designing cranes, retrofitting an old tanker to hold containers, and finally running real-world experiments. Eventually, he had enough equipment to make that full-scale test run between two ports.
It was one of those days we appreciate only in hindsight. The Houston container delivery never made the nightly news, but 60 years later we realise it was the most important event of the day. An event which would change the world.
McLean had been trying to figure out how to use containers to integrate the shipment of cargo between trucks and ships to remove the cost and time of loading and unloading. The idea had been around, and others had worked on it, but McLean realised it required more than just a container. The container alone wouldn’t do it. It required a whole ecosystem.
Malcolm McLean’s fundamental insight, commonplace today but quite radical in the 1950s, was that the shipping industry’s business was moving cargo, not sailing ships. That insight led him to a concept of containerization quite different from anything that had come before. McLean understood that reducing the cost of shipping goods required not just a metal box but an entire new way of handling freight. Every part of the system — ports, ships, cranes, storage facilities, trucks, trains, and the operations of the shippers themselves — would have to change. In that understanding, he was years ahead of almost everyone else in the transportation industry.
Source: ‘The Box’ by Marc Levinson
The move to shipping containers standardised global trade and lowered the cost of moving goods. But it didn’t just standardise the equipment and the ports. Containers standardised the legal contracts too. Now a factory could move a ‘container’ from Shanghai to New York under one contract, and insure a ‘container’ as well. Everything could be quoted in TEUs, twenty-foot equivalent units.
The first order effect of McLean’s innovation was that trade was easier and cheaper.
The second order effects were harder to see, but bigger. And would unfold over decades.
Cheaper, easier shipping unlocked the potential to manufacture in far away places, like Asia, where the cost of labor was lower. It was a key enabler of globalisation. Container volumes boomed.
A new business model emerged, enabled by containerisation: outsourced manufacturing. Firms like Nike and Apple realised maybe you didn’t have to own your own factory in Ohio. Maybe a company just needs to focus on product design, marketing and sales. Manufacturing can be unbundled and done in Asia. Maybe it’s actually a disadvantage to own your own factory. In Asia we see the simultaneous rise of outsourced manufacturers like Flextronics and Foxconn.
And the whole US labor force has to change. Manufacturing isn’t a job you do anymore. You don’t grow up to work in a factory. Engineering maybe, but working in a factory is over. Manufacturing drops from 26% of US employment to 10%.
On the other side of the Pacific it’s the rise of Asia. South Korea, Japan, Taiwan and Singapore go from sewing T-shirts to fabricating H100s. Real estate in HK, Singapore and Tokyo booms. Fortunes are made.
The container revolution also separated nations into the haves and have-nots of modern ports. A country with efficient mega container ports, expensive as they were to build, could be integrated into the emerging global supply chains. Countries without them saw fewer ships call, so manufacturing there was less attractive. If you weren’t part of the ecosystem, you got left behind. That was the genius of Dubai building Jebel Ali port in 1979. The falling cost of shipping over water also advantaged coastal regions over inland cities.
So many unexpected second order effects.
Yes, lots of overlapping trends were at play, but McLean’s vision of containers, and the ecosystem around shipping changed the world and enabled globalisation.
There were fortunes to be made, but you had to envision a whole new way of doing things globally. You had to see how who made what, and where, would radically change. US business structures would change, the US labor force would change, the geography of Asia would change. Where value would accrue would change.
And if you had predicted all these things in the 1960s it would have sounded crazy.
The story of the shipping container is fascinating, but I introduce it because it’s how we should look at AI.
AI as the New China and JSON as the container
I think of AI as a new China for office work, and JSON (JavaScript Object Notation) objects as the modern-day container. AI, or LLMs, provide us with an unlimited supply of cheap office labor which never complains or strikes. And JSON objects and APIs are how those AIs ship ‘thought’ around the world.
And like the 20-foot steel box, there is nothing amazing about a JSON object. It’s just a box for information. Nothing says the standard can’t evolve in the future, but it’s what computers use now, and a whole system is already built around it.
It’s easy to recognise a JSON object. In computer code the ‘box’ is a pair of curly braces ({}). Between the braces go whatever key:value pairings you want to create.
Below is an example of a possible JSON object for a write-up on Las Vegas Sands. The formatting convention is that the opening and closing braces sit on their own lines. That’s your modern-day information container.
{
"name": "Las Vegas Sands Corp.",
"ticker": "LVS",
"sector": "Casinos/Gaming",
"currentSharePrice": 37,
"theme": "LVS is in a great position to benefit from the long-term growth of wealth in Asia, with dominant market shares in Macau and Singapore."
}
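To show the box really is just text that any program can open, here’s a quick sketch in JavaScript using the built-in JSON.parse (the values are the ones from the example above):

```javascript
// A sketch of opening the JSON 'container' above in JavaScript.
// JSON.parse turns the text into an object whose values you read by key.
const text = `{
  "name": "Las Vegas Sands Corp.",
  "ticker": "LVS",
  "sector": "Casinos/Gaming",
  "currentSharePrice": 37,
  "theme": "LVS is in a great position to benefit from the long-term growth of wealth in Asia."
}`;

const box = JSON.parse(text);       // open the box
console.log(box.ticker);            // → "LVS"
console.log(box.currentSharePrice); // → 37
```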
Behind the OpenAI website, this key:value format is how you are interacting with LLMs. These JSONs are packaged into API calls (API stands for Application Programming Interface) which are sent to the LLMs. Here is an example.
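Below is a sketch of what such a call looks like in JavaScript (Node 18+, where fetch is built in). The endpoint and model name are OpenAI’s public ones; the API key is a placeholder you would supply yourself.

```javascript
// Build the JSON 'container' we will ship to the LLM.
const data = {
  model: "gpt-3.5-turbo",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Why do YWR readers seem so smart?" }
  ]
};

// POST the container to OpenAI and wait for the reply,
// which comes back as another JSON object.
async function askChatGPT(apiKey) {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${apiKey}`
    },
    body: JSON.stringify(data)
  });
  const reply = await response.json();
  // The answer lives under choices[0].message.content
  return reply.choices[0].message.content;
}

// Only fire a real request if an API key is configured.
if (process.env.OPENAI_API_KEY) {
  askChatGPT(process.env.OPENAI_API_KEY).then(answer => console.log(answer));
}
```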
First we define a variable ‘data’ (const data = {JSON object}) which becomes our JSON package. The package includes which LLM model to use ("model": "gpt-3.5-turbo"), how the LLM should behave in its ‘role’ ("You are a helpful assistant."), and our chat question (“Why do YWR readers seem so smart?”).
This ‘data’ variable is sent to ‘api.openai.com/v1/chat/completions' in an API ‘POST’ request. We are ‘posting’ information to ChatGPT and then waiting for a response.
Then we print the response ChatGPT sent back, which is also a JSON object, containing a key:value pairing of content:”answer to our question”.
And the ‘answer’ could be “YWR readers seem so smart because they are individual thinkers who engage deeply with the content, applying critical thinking and curiosity.”
Sorry to drag you through the code, but that’s your container shipping system for information and I want you to see how it works.
Information is packaged into JSON objects ({}’s) and shipped around the world using API calls (‘POST’ and ‘GET’).
Now how this all comes together.
The ‘$APP’ Example and The Future of AI workflow
A surprise benefit of writing YWR is all the switched on people I get to meet. Like Graham, an entrepreneur in Johannesburg.
Graham was interested in the ‘anchoring bias’ I wrote about in ‘Why Estimate Revisions Work’ and wanted to see if he could use AI to search for ‘anchoring bias’ in a stock called AppLovin ($APP).
This was fascinating for two reasons. First, it blew me away that Graham was using AI to search for a behavioural bias (anchoring) within the analyst research about a stock. That is an insanely cool new way to use AI! It’s the step beyond ‘sentiment analysis’.
But I also loved his process:
(1) I created a custom prompt to get all the latest info on $APP and gave this to ChatGPT Deep Research
(2) this produced an in-depth (11,000 word) up-to-date factbase on $APP and its earnings trends and anchoring factors for further investigation
(3) I loaded this research piece into ChatGPT o1 Pro as the core "factbase" re $APP. I then dived into a "de-anchoring exercise" where I used a set of 10 custom prompts (developed in line with your piece) to systematically de-anchor from bad analyst habits.
(4) I got o1 Pro to (a) summarise the findings of the exercise in a research report and then to (b) critically analyse its own findings
TL/DR the answer I got:
Recommendation: BUY with a target time horizon of 12–18 months, capitalizing on current undervaluation and the structural pivot to high-margin ad tech leadership. Analysts who adjust forecasts to incorporate both near-term caution and mid-term optimism (sustained margin power, new verticals) will find compelling upside relative to consensus.
I went through Graham's files for Step 2 and 4 and they were immense, especially step 2.
But you know what’s fascinating?
When I spoke to Graham he said he had never read any of the files. He didn’t care about the 11,000 word report in Step 2. Yes, he cared how it was created, but not what it said. Step 2 was just fuel for the AI in Step 3. Graham cared about the design of the AI workflow and how these documents were processed at each stage to get to the final product.
Like a CEO, Graham had identified what work was profitable (searching for behavioural bias in $APP), and how the work should be done. Graham would evaluate the final result, but didn’t need to micromanage everything in between.
This is such a big mind shift we are doing a whole livestream on it.
Graham did this manually using cut and paste in ChatGPT, but imagine it done using JSON objects. Imagine AIs passing JSON ‘containers’ from one to another, with each AI adding special value based on its training.
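A minimal sketch of that idea, with hypothetical stage names based on Graham’s steps. Each ‘AI’ is stubbed here as a plain function that takes the JSON container, adds its cargo, and hands it on, so the flow is visible:

```javascript
// A sketch (all stage names are hypothetical) of AIs passing a JSON
// 'container' down a pipeline, each one adding value based on its training.

const buildFactbase = (box) => ({
  ...box,
  factbase: `Deep-research factbase on ${box.ticker}`
});

const deAnchor = (box) => ({
  ...box,
  findings: `De-anchoring exercise run against the factbase for ${box.ticker}`
});

const critique = (box) => ({
  ...box,
  critique: "Critical review of the findings"
});

// The 'shipping route': the container picks up cargo at every port of call.
const pipeline = [buildFactbase, deAnchor, critique];
const result = pipeline.reduce((box, stage) => stage(box), { ticker: "APP" });

console.log(Object.keys(result)); // → ["ticker", "factbase", "findings", "critique"]
```

In a real agentic system each function would be an API call to a different model, but the shape is the same: one standard box, many specialised handlers.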
Now let’s pull it all together.
AI as a Workflow Ecosystem, not a Tool.
Here is where most miss the big picture.
We are asking how to use AI to do our current job. We are asking how AI fits into our Microsoft Windows interface because Windows is our concept of how information flows. And it’s what our IT department will allow.
This is thinking of AI as a ‘tool’, as in ‘How can Copilot help me write an email?’ or ‘How can AI help me summarise this file?’
These questions are fine for some roles, like for a CEO, but for many workplace jobs it’s upside down. Maybe in the future that email doesn’t need to be written. And maybe you don’t need to summarise that file.
Flip everything around and ask, ‘How do I augment/fit into an AI work flow?’ What value add do I bring to an AI process?
And how do we pay for ‘work’ in the future? Is it still based on time and hours? My guess is no.
Remember the container changed the flow of everything. It created a new global network. AI is the same.
The power of AI is not one AI as a tool on your desktop. The information workflow of the future is not emails, but AIs passing JSONs from one to another. And companies built for agentic AI workflows will surpass traditional human-oriented companies.
Build your firm with a backbone around how AIs will work and how humans augment that process. Build for that ecosystem.
Another example of a paradigm change.
When the internet was invented there was a new type of store, the online store, and while it wasn’t apparent at first, the economics of an online store were vastly superior to those of a brick and mortar store. It also took a while to learn which stores should remain physical (human services) and which should be online (buying goods). In the years that followed, physical retail chains still existed, but in many cases physical stores became a cost disadvantage vs. online stores. And, unimaginable in 1999, the most valuable store in the world would have no physical shops at all.
Having a large labor force in 2030 is like having a factory in Ohio in the ’70s, or a nationwide store footprint in 2010. A prior advantage becomes a liability. Yes, there will still be lots of companies with information workflows built around humans and Microsoft Windows, and we will have to figure out where humans still make sense. But companies that don’t adjust will struggle compared to companies built for agentic AI workflows, where humans are designed to augment the things AI cannot do.
We’re all figuring this out and there is still much to learn.
In the meantime I want to introduce you to Graham and his work on how to make yourself superhuman using AI.
Thursday, April 10th at 2pm UK time we are going to livestream:
Thinking like a CEO: How to Thrive not Survive in an AI world.
I don’t want to steal Graham’s thunder, but he’s done really cool work on how to use LLMs to improve both your work and your personal life.
Have a great weekend.
It’s Spring!
Erik