Saying Goodbye: 10 Years, a 19% Annual Return, and 17 Investing Lessons

9 years, 7 months, and 6 days. This is how much time has passed since I started managing my family’s investment portfolio of US stocks on 26 October 2010. 19.5% versus 12.7%. These are the respective annual returns of my family’s portfolio (without dividends) and the S&P 500 (with dividends) over that period.

As of 31 May 2020

I will soon have to say goodbye to the portfolio. Jeremy Chia (my blogging partner) and I have co-founded a global equities investment fund. As a result, the lion’s share of my family’s investment portfolio will soon be liquidated so that the cash can be invested in the fund.

The global equities investment fund will invest with the same investment philosophy that underpins my family’s portfolio, so the journey continues. But my heart’s still heavy at having to let the family portfolio go. It has been a huge part of my life for the past 9 years, 7 months, and 6 days, and I’m proud of what I’ve achieved (I hope my parents are too!).

In the nearly-10 years managing the portfolio, I’ve learnt plenty of investing lessons. I want to share them here, to benefit those of you who are reading, and to mark the end of my personal journey and the beginning of a new adventure. I did not specifically pick any number of lessons to share. I’m documenting everything that’s in my head after a long period of reflection. 

Do note that my lessons may not be timeless, because things change in the markets. But for now, they are the key lessons I’ve picked up. 

Lesson 1: Focus on business fundamentals, not macroeconomic or geopolitical developments – there are always things to worry about

My family’s portfolio has many stocks that have gone up multiple times in value. A sample is given below:

Some of them are among the very first few stocks I bought; some were bought in more recent years. But what’s interesting is that these stocks produced their gains while the world experienced one crisis after another.

You see, there have always been things to worry about in the geopolitical and macroeconomic landscape since I started investing. Here’s a short and incomplete list (you may realise how inconsequential most of these events seem today, even though they appeared huge when they occurred):

  • 2010 – European debt crisis; BP oil spill; May 2010 Flash Crash
  • 2011 – Japan earthquake; Middle East uprising
  • 2012 – Potential Greek exit from the Eurozone; Hurricane Sandy
  • 2013 – Cyprus bank bailouts; US government shutdown; Thailand uprising
  • 2014 – Oil price collapse
  • 2015 – Crash in the euro against the Swiss franc; Greek debt crisis
  • 2016 – Brexit; Italy banking crisis
  • 2017 – Bank of England hikes interest rates for the first time in 10 years
  • 2018 – US-China trade war
  • 2019 – Australian bushfires; impeachment of the US President; appearance of COVID-19 in China
  • 2020 (thus far) – COVID-19 becomes global pandemic

The stocks mentioned in the table above produced strong business growth over the years I’ve owned them. This business growth has been a big factor in the returns they have delivered for my family’s portfolio. When I was studying them, my focus was on their business fundamentals – and this focus has served me well.

In a 1998 lecture to MBA students, Warren Buffett was asked about his views on the then “tenuous economic situation and interest rates.” He responded:

“I don’t think about the macro stuff. What you really want to do in investments is figure out what is important and knowable. If it is unimportant and unknowable, you forget about it. What you talk about is important but, in my view, it is not knowable.

Understanding Coca-Cola is knowable or Wrigley’s or Eastman Kodak. You can understand those businesses that are knowable. Whether it turns out to be important depends where your valuation leads you and the firm’s price and all that. But we have never not bought or bought a business because of any macro feeling of any kind because it doesn’t make any difference.

Let’s say in 1972 when we bought See’s Candy, I think Nixon [referring to former US President, Richard Nixon] put on the price controls a little bit later, but so what! We would have missed a chance to buy something for [US]$25 million that is producing [US]$60 million pre-tax now. We don’t want to pass up the chance to do something intelligent because of some prediction about something we are no good on anyway.”

Lesson 2: Adding to winners works

I’ve never shied away from adding to the winners in my portfolio, and this has worked out well. Here’s a sample, using some of the same stocks shown in the table in Lesson 1.

Adding to winners is hard to achieve, psychologically. As humans, we tend to anchor to the price we first paid for a stock. After a stock has risen significantly, it’s hard to still see it as a bargain. But I’ll argue that it is stocks that have risen significantly over a long period of time that are the good bargains. It’s counterintuitive, but hear me out.

The logic here rests on the idea that stocks do well over time if their underlying businesses do well. So, the stocks in my portfolio that have risen significantly over a number of years are likely – though not always – the ones with businesses that are firing on all cylinders. And stocks with businesses that are firing on all cylinders are exactly the ones I want to invest in. 

Lesson 3: The next Amazon is Amazon

When I first bought shares of Amazon in April 2014 at US$313, its share price was already more than 200 times higher than its IPO share price of US$1.50 in May 1997. That was an amazing annual return of around 37%.

But from the time I first invested in Amazon in April 2014 to today, its share price has increased by an even more impressive annual rate of 40%. Of course, it is unrealistic to expect Amazon to grow by a further 200 times in value from its April 2014 level over a reasonable multi-year time frame. But a stock that has done very well for a long period of time can continue delivering a great return. Winners often keep on winning.    
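The ~37% figure comes from the standard compound annual growth rate (CAGR) formula, (end ÷ start)^(1/years) − 1. Here’s a minimal sketch of that calculation; the roughly 17-year holding period is my own approximation of May 1997 to April 2014, not a figure from the article:

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: the constant yearly rate that
    turns start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Amazon: US$1.50 at its May 1997 IPO to US$313 in April 2014,
# roughly 16.9 years in between -> about 37% per year.
print(f"{cagr(1.50, 313.00, 16.9):.1%}")
```

The same function reproduces the other annualised figures in this article, for example Mastercard’s roughly 19% yearly free-cash-flow growth in Lesson 6.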

Lesson 4: Focus on business quality and don’t obsess over valuation

It is possible to overpay for a company’s shares. This is why we need to think about the valuation of a business. But I think it is far more important to focus on the quality of a business – such as its growth prospects and the capability of the management team – than on its valuation.

If I use Amazon as an example, its shares carried a high price-to-free cash flow (P/FCF) ratio of 72 when I first invested in the company in April 2014. But Amazon’s free cash flow per share has since increased by around 1,000% in total (or 48% annually), from US$4.37 back then to US$48.10 now, helping drive the overall 681% gain in its share price.

Great companies can grow into their high valuations. Amazon’s P/FCF ratio, using my April 2014 purchase price and the company’s current free cash flow per share, is just 6.5 (now that’s a value stock!). But there’s no fixed formula that can tell you what valuation is too high for a stock. It boils down to subjective judgement that is sometimes as squishy as an intuitive feeling. This is one of the unfortunate realities of investing. Not everything can be quantified.
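To make the “growing into its valuation” point concrete, here is the arithmetic behind the figures above as a short sketch; the ~6.1-year span is my own approximation of April 2014 to May 2020:

```python
price_paid = 313.00              # Amazon share price at purchase, April 2014
fcf_then, fcf_now = 4.37, 48.10  # free cash flow per share, then vs now
years_held = 6.1                 # approximate span, April 2014 to May 2020

# P/FCF at the time of purchase, and P/FCF on the original cost today
print(f"P/FCF then: {price_paid / fcf_then:.0f}")        # ~72
print(f"P/FCF on cost now: {price_paid / fcf_now:.1f}")  # ~6.5

# Free cash flow growth: roughly 1,000% in total, ~48% annualised
total_growth = fcf_now / fcf_then - 1
annual_growth = (fcf_now / fcf_then) ** (1 / years_held) - 1
print(f"total: {total_growth:.0%}, annual: {annual_growth:.0%}")
```

Nothing about the business changed in this arithmetic; only the denominator (free cash flow per share) grew, which is exactly what “growing into a valuation” means.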

Lesson 5: The big can become bigger – don’t obsess over a company’s market capitalisation

I’ve yet to mention Mastercard, but I first invested in shares of the payments company on 3 December 2014 at US$89 apiece. Back then, it already had a huge market capitalisation of around US$100 billion, according to data from YCharts. Today, Mastercard’s share price is US$301, up more than 200% from my initial investment.

A company’s market capitalisation alone does not tell us much. It is the company’s (1) valuation, (2) size of business, and (3) addressable market that can give us clues on whether it could be a good investment opportunity. In December 2014, Mastercard’s price-to-earnings (P/E) ratio was a reasonable 35 or so given the company’s quality, while its revenue was around US$9.2 billion. Meanwhile, the company’s market opportunity still looked significant, since cashless transactions represented just 15% of total transactions in the world back then.

Lesson 6: Don’t ignore “obvious” companies just because they’re well known

Sticking with Mastercard, it was an obvious company that was already well-known when I first invested in its shares. In the first nine months of 2014, Mastercard had more than 2 billion credit cards in circulation and had processed more than 31.4 billion transactions. Everyone could see Mastercard and know that it was a great business. It was growing rapidly and consistently, and its profit and free cash flow margins were off the charts (nearly 40% for both).

The company’s high quality was recognised by the market – its P/E ratio was high in late 2014 as I mentioned earlier. But Mastercard still delivered a fantastic annual return of around 25% from my December 2014 investment.

I recently discovered a poetic quote by philosopher Arthur Schopenhauer: “The task is… not so much to see what no one has yet seen, but to think what nobody has yet thought, about that which everyone sees.” This is so applicable to investing.

Profitable investment opportunities can still be found by thinking differently about the data that everyone else has. It was obvious to the market back in December 2014 that Mastercard was a great business and its shares were valued highly because of this. But by thinking differently – with a longer-term point of view – I saw that Mastercard could grow at high rates for a very long period of time, making its shares a worthy long-term investment. From December 2014 to today, Mastercard’s free cash flow per share has increased by 158% in total, or 19% per year. Not too shabby.   

Lesson 7: Be willing to lose sometimes

We need to take risks when investing. When I first invested in Shopify in September 2016, it had a price-to-sales (P/S) ratio of around 12, which is really high for a company with a long history of making losses and producing meagre cash flow. But Shopify also had a visionary leader who dared to think and act long-term. Tobi Lütke, Shopify’s CEO and co-founder, penned the following in his letter to investors in the company’s 2015 IPO prospectus (emphases are mine):

“Over the years we’ve also helped foster a large ecosystem that has grown up around Shopify. App developers, design agencies, and theme designers have built businesses of their own by creating value for merchants on the Shopify platform. Instead of stifling this enthusiastic pool of talent and carving out the profits for ourselves, we’ve made a point of supporting our partners and aligning their interests with our own. In order to build long-term value, we decided to forgo short-term revenue opportunities and nurture the people who were putting their trust in Shopify. As a result, today there are thousands of partners that have built businesses around Shopify by creating custom apps, custom themes, or any number of other services for Shopify merchants.

This is a prime example of how we approach value and something that potential investors must understand: we do not chase revenue as the primary driver of our business. Shopify has been about empowering merchants since it was founded, and we have always prioritized long term value over short-term revenue opportunities. We don’t see this changing…

… I want Shopify to be a company that sees the next century. To get us there we not only have to correctly predict future commerce trends and technology, but be the ones that push the entire industry forward. Shopify was initially built in a world where merchants were simply looking for a homepage for their business. By accurately predicting how the commerce world would be changing, and building what our merchants would need next, we taught them to expect so much more from their software.

These underlying aspirations and values drive our mission: make commerce better for everyone. I hope you’ll join us.”       

Shopify was a risky proposition. But it paid off handsomely. In investing, I think we have to be willing to take risks and accept that we can lose at times. But failing at risk-taking from time to time does not mean our portfolios have to be ruined. We can take intelligent risks by sizing our positions appropriately. Tom Engle is part of The Motley Fool’s investing team in the US. He’s one of the best investors the world has never heard of. When it comes to investing in risky stocks that have the potential for huge returns, Tom has a phrase I love: “If it works out, a little is all you need; if it doesn’t, a little is all you want.” 

I also want to share a story I once heard from The Motley Fool’s co-founder Tom Gardner. Once, a top-tier venture capital firm in the US wanted to improve the hit-rate of the investments it was making. So the VC firm’s leaders came up with a process for the analysts that could reduce investing errors. The firm succeeded in improving its hit-rate (the percentage of investments that make money). But interestingly, its overall rate of return became lower. That’s because the VC firm, in its quest to lower mistakes, also passed on investing in highly risky potential moonshots that could generate tremendous returns.

The success of one Shopify can make up for the mistakes of many other risky bets that flame out. To hit a home run, we must be willing to miss at times.  

Lesson 8: The money is made on the holding, not the buying and selling

My family’s investment portfolio has over 50 stocks. It’s a collection that was built steadily over time, starting with the purchase of just six stocks on 26 October 2010. In the 9 years, 7 months and 6 days since, I’ve only ever sold two stocks voluntarily: (1) Atwood Oceanics, an owner of oil rigs; and (2) National Oilwell Varco, a supplier of parts and equipment that keep oil rigs running. Both stocks were bought on 26 October 2010.

David Gardner is also one of the co-founders of The Motley Fool (Tom Gardner is his brother). There’s something profound David once said about portfolio management that resonates with me:

“Make your portfolio reflect your best vision for our future.” 

The sales of Atwood Oceanics and National Oilwell Varco happened because of David’s words. Part of my vision for the future is a world where our energy needs are met entirely by renewable sources that do not harm the precious environment we live in. For this reason, I made the rare decision to voluntarily part ways with Atwood Oceanics and National Oilwell Varco in September 2016 and June 2017, respectively.

My aversion to selling is by design: I believe it strengthens my discipline in holding onto the winners in my family’s portfolio. Many investors tend to cut their winners and hold onto their losers. Even in my earliest days as an investor, I recognised how important holding onto winners is in driving my family portfolio’s return. Being very slow to sell has helped me hone that discipline, and it has been a very important contributor to the long-run performance of my family’s portfolio.

The great Charlie Munger has a saying that one of the keys to investing success is “sitting on your ass.” I agree. Patience is a virtue. And talking about patience… 

Lesson 9: Be patient – some great things take time

Some of my big winners needed only a short while before they took off. But there are some that needed significantly more time. Activision Blizzard is one such example. As I mentioned earlier, I invested in its shares in October 2010. Then, Activision Blizzard’s share price went nowhere for more than two years before it started rocketing higher.

Peter Lynch once said: “In my investing career, the best gains usually have come in the third or fourth year, not in the third or fourth week or the third or fourth month.” The stock market does not move according to our own clock. So patience is often needed.

Lesson 10: Management is the ultimate source of a company’s economic moat

In my early days as an investor, I looked for quantifiable economic moats. These are traits in a company such as (1) having a network effect, (2) being a low-cost producer, (3) delivering a product or service that carries a high switching cost for customers, (4) possessing intangible assets such as intellectual property, and (5) having efficient scale in production. 

But the more I thought about it, the more I realised that a company’s management team is the true source of its economic moat, or lack thereof.

Today, Netflix has the largest global streaming audience with a pool of 183 million subscribers around the world. Having this huge base of subscribers means that Netflix has an efficient scale in producing content, because the costs can be spread over many subscribers. Its streaming competitors do not have this luxury. But this scale did not appear from thin air. It arose because of Netflix’s CEO and co-founder, Reed Hastings, and his leadership team.

The company was an early pioneer in the streaming business when it launched its streaming service in 2007. In fact, Netflix probably wanted to introduce streaming even from its earliest days. Hastings said the following in a 2007 interview with Fortune magazine: 

“We named the company Netflix for a reason; we didn’t name it DVDs-by-mail. The opportunity for Netflix online arrives when we can deliver content to the TV without any intermediary device.”

When Netflix first started streaming, the content came from third-party producers. In 2013, the company launched its first slate of original programming. Since then, Netflix has ramped up its original content budget significantly. The spending has been done smartly, as Netflix has found plenty of success with its original programming. For instance, in 2013, the company became the first streaming provider to be nominated for a primetime Emmy. And in 2018 and 2019, the company snagged 23 and 27 Emmy wins, respectively.  

A company’s current moat is the result of management’s past actions; a company’s future moat is the result of management’s current actions. Management is what creates the economic moat.

Lesson 11: Volatility in stocks is a feature, not a bug

Looking at the table in Lesson 1, you may think that my investment in Netflix was smooth-sailing. It’s actually the opposite. 

I first invested in Netflix shares on 15 September 2011 at US$26 after the stock price had fallen by nearly 40% from US$41 in July 2011. But the stock price kept declining afterward, and I bought more shares at US$16 on 20 March 2012. More pain was to come. In August 2012, Netflix’s share price bottomed at less than US$8, resulting in declines of more than 70% from my first purchase, and 50% from my second.  

My Netflix investment was a trial by fire for a then-young investor – I had started investing barely a year before I bought my first Netflix shares. But I did not panic and I was not emotionally affected. I already knew that stocks – even the best-performing ones – are volatile over the short run. But my experience with Netflix drove the point even deeper into my brain.

Lesson 12: Be humble – there’s so much we don’t know

My investment philosophy is built on the premise that a stock will do well over time if its business does well too. But how does this happen?

In the 1950s, lawmakers in the US commissioned an investigation to determine if the stock market back then was too richly priced. The Dow (a major US stock market benchmark) had exceeded its peak seen in 1929 before the Great Depression tore up the US market and economy. Ben Graham, the legendary father of value investing, was asked to participate as an expert on the stock market. Here’s an exchange during the investigation that’s relevant to my discussion:

Question to Graham: When you find a special situation and you decide, just for illustration, that you can buy for 10 and it is worth 30, and you take a position, and then you cannot realize it until a lot of other people decide it is worth 30, how is that process brought about – by advertising, or what happens?

Graham’s response: “That is one of the mysteries of our business, and it is a mystery to me as well as to everybody else. We know from experience that eventually the market catches up with value. It realizes it in one way or another.”

More than 60 years ago, one of the most esteemed figures in the investment business had no idea how stock prices seemed to eventually reflect their underlying economic values. Today, I’m still unable to find any answer. If you’ve seen any clues, please let me know! This goes to show that there’s so much I don’t know about the stock market. It’s also a fantastic reminder for me to always remain humble and be constantly learning. Ego is the enemy.  

Lesson 13: Knowledge compounds – so read outside of finance

Warren Buffett once told a bunch of students to “read 500 pages… every day.” He added, “That’s how knowledge works. It builds up, like compound interest. All of you can do it, but I guarantee not many of you will do it.” 

I definitely have not done it. I read every day, but I’m nowhere close to the 500 pages that Buffett mentioned. Nonetheless, I have experienced first-hand how knowledge compounds. Over time, I’ve been able to connect the dots faster when I analyse a company. And for companies whose shares I’ve owned for years, I don’t need to spend much time keeping up with their developments because of the knowledge I’ve accumulated.

Reading outside of finance has also been really useful for me. I have a firm belief that investing is only 5% finance and 95% everything else. Reading about psychology, society, history, science etc. can make us even better investors than someone who’s buried neck-deep in only finance books. Having a broad knowledge base helps us think about issues from multiple angles. This brings me to Arthur Schopenhauer’s quote I mentioned earlier in Lesson 6:  “The task is… not so much to see what no one has yet seen, but to think what nobody has yet thought, about that which everyone sees.”

Lesson 14: The squishy things matter

Investing is part art and part science. But is it more art than science? I think so. The squishy, unquantifiable things matter. That’s because investing is about businesses, and building businesses involves squishy things.

Jeff Bezos said it best in his 2005 Amazon shareholders’ letter (emphases are mine):

“As our shareholders know, we have made a decision to continuously and significantly lower prices for customers year after year as our efficiency and scale make it possible. This is an example of a very important decision that cannot be made in a math-based way.

In fact, when we lower prices, we go against the math that we can do, which always says that the smart move is to raise prices. We have significant data related to price elasticity. With fair accuracy, we can predict that a price reduction of a certain percentage will result in an increase in units sold of a certain percentage. With rare exceptions, the volume increase in the short term is never enough to pay for the price decrease.

However, our quantitative understanding of elasticity is short-term. We can estimate what a price reduction will do this week and this quarter. But we cannot numerically estimate the effect that consistently lowering prices will have on our business over five years or ten years or more.

Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com. We’ve made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and—we believe—important and valuable in the long term.”

On a related note, I was also attracted to Shopify when I came across Tobi Lütke’s letter to investors that I referenced in Lesson 7. I saw in Lütke the same ability to stomach short-term pain, and the drive toward producing long-term value, that I noticed in Bezos. This is also a great example of how knowledge compounds. 

Lesson 15: I can never do it alone

Aaron Bush is one of the best investors I know of at The Motley Fool, and he recently created one of the best investing-related tweet-storms I have seen. In one of his tweets, he said: “Collaboration can go too far. Surrounding yourself with a great team or community is critical, but the moment decision-making authority veers democratic your returns will begin to mean-revert.” 

I agree with everything Aaron said. Investment decision-making should never involve large teams. But at the same time, having a community or team around us is incredibly important for our development; their presence enables us to view a problem from many angles, and it helps with information gathering and curation.

I joined one of The Motley Fool’s investment newsletter services in 2010 as a customer. The service had wonderful online forums and this dramatically accelerated my learning curve. In 2013, I had the fortune to join an informal investment club in Singapore named Kairos Research. It was founded by Stanley Lim, Cheong Mun Hong, and Willie Keng. They are also the founders of the excellent Asia-focused investment education website, Value Invest Asia. I’ve been a part of Kairos since and have benefited greatly. I’ve made life-long friends and met countless thoughtful, kind, humble, and whip-smart people who have a deep passion for investing and knowledge. The Motley Fool’s online forums and the people in Kairos have helped me become a better human being and investor over the years.   

I’ve also noticed – in these group interactions – that the more I’m willing to give, the more I receive. Giving unconditionally and sincerely without expecting anything in return, paradoxically, results in us having more. Giving is a superpower. 

Lesson 16: Be honest with myself about what I don’t know

When we taste success in the markets, it’s easy for ego to enter the picture. We may look into the mirror and proclaim: “I’m a special investor! I’ve been great at picking growth stocks – this knowledge must surely translate to trading options, shorting commodities, and underwriting exotic derivatives. They, just like growth stocks, are all a part of finance, aren’t they?”

This is where trouble comes. The entrance of ego is the seed of future failure. In the biography of Warren Buffett, The Snowball: Warren Buffett and the Business of Life, author Alice Schroeder shared this passage about Charlie Munger:

“[Munger] dreaded falling prey to what a Harvard Law School classmate of his had called “the Shoe Button Complex.”

“His father commuted daily with the same group of men,” Munger said. “One of them had managed to corner the market in shoe buttons – a really small market, but he had it all. He pontificated on every subject, all subjects imaginable. Cornering the market on shoe buttons made him an expert on everything. Warren and I have always sensed it would be a big mistake to behave that way.”

The Shoe Button Complex can also be applied in a narrower sense to investing. Just because I know something about the market does not mean I know everything. For example, a few years after I invested in Atwood Oceanics and National Oilwell Varco, I realised I was in over my head: I have no ability to predict commodity prices, yet the business health of the two companies depends on the price of oil. Since coming to that realisation, I have stayed away from other commodity-related companies. In another instance, I know I can’t predict the movement of interest rates, so I’ve never made any investment decision that depended on interest rates as the main driver.

Lesson 17: Be rationally optimistic

In Lesson 1, I showed that the world had lurched from one crisis to another over the past decade. And of course, we’re battling COVID-19 now. But I’m still optimistic about tomorrow. This is because one key thing I’ve learnt about humanity is that our progress has never happened smoothly. It took us only 66 years to go from the first demonstration of manned flight by the Wright brothers at Kitty Hawk to putting a man on the moon. But in between was World War II, a brutal battle across the globe from 1939 to 1945 that killed an estimated 66 million people, according to National Geographic.

This is how progress is made, through the broken pieces of the mess that Mother Nature and our own mistakes create. Morgan Housel has the best description of this form of rational optimism that I’ve come across: 

“A real optimist wakes up every morning knowing lots of stuff is broken, and more stuff is about to break.

Big stuff. Important stuff. Stuff that will make his life miserable. He’s 100% sure of it.

He starts his day knowing a chain of disappointments awaits him at work. Doomed projects. Products that will lose money. Coworkers quitting. He knows that he lives in an economy due for a recession, unemployment surely to rise. He invests his money in a stock market that will crash. Maybe soon. Maybe by a lot. This is his base case.

He reads the news with angst. It’s a fragile world. Every generation has been hit with a defining shock. Wars, recessions, political crises. He knows his generation is no different.

This is a real optimist. He’s an optimist because he knows all this stuff does not preclude eventual growth and improvement. The bad stuff is a necessary and normal path that things getting better over time rides on. Progress happens when people learn something new. And they learn the most, as a group, when stuff breaks. It’s essential.

So he expects the world around him to break all the time. But he knows – as a matter of faith – that if he can survive the day-to-day fractures, he’ll capture the up-and-to-the-right arc that learning and hard work produces over time.”

To me, investing in stocks is, at its core, the same as having faith in the long-term potential of humanity. There are 7.8 billion individuals in the world today, and the vast majority of us wake up every morning wanting to improve the world and our own lot in life – this is ultimately what fuels the global economy and financial markets. Miscreants and Mother Nature will wreak havoc from time to time. But I have faith in the collective positivity of humanity. When there’s a mess, we can clean it up. This has been the story of our long history – and the key driver of the returns my family’s portfolio has enjoyed over the past 9 years, 7 months, and 6 days.

My dear portfolio, goodbye.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I, the author, will be making sell-trades on the stocks mentioned in this article over the coming weeks.

The Latest Thoughts From American Technology Companies On AI (2026 Q1)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2026 Q1 earnings season.

The way I see it, artificial intelligence (or AI) really leapt into the zeitgeist in late 2022 and early 2023 with the public introductions of DALL-E 2 and ChatGPT. Since then, developments in AI have progressed at a breathtaking pace.

We’re thick in the action of the latest earnings season for the US stock market – for the first quarter of 2026 – and I thought it would be useful to collate some of the interesting commentary I’ve come across in earnings conference calls, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. This is an ongoing series. For the older commentary:

With that, here is the latest commentary, in no particular order:

Alphabet (NASDAQ: GOOG)

Gemini Enterprise has 40% sequential growth in paid monthly active users in 2026 Q1; Gemini 3.1 Pro is pushing the frontier in reasoning, multimodal understanding, and cost; there are now a wide variety of models in the Gemini 3.1 family to meet different developer needs; Gemini 3.1 Flash Live is powering conversational features in search and the Gemini app, and speech-to-text is now available in 70 languages; Gemini 3.1 Pro has delivered a big upgrade to Alphabet’s Deep Research product; the Lyria 3 model has generated over 150 million songs since its launch in the Gemini app; Nano Banana 2 has generated 1 billion images in half the time of Nano Banana 1; management recently launched Gemma 4, Alphabet’s best open model to date, and it has been downloaded more than 50 million times in a few weeks; Nano Banana 2 was recently integrated into the Gemini app to enable personalised image creation; Gemini is now integrated with Google Maps, so users can converse with Google Maps via chat

Gemini Enterprise is seeing tremendous momentum with 40% growth quarter-over-quarter in paid monthly active users…

…Gemini 3.1 Pro continues to push the frontier in reasoning, multimodal understanding and cost. We have quickly expanded the Gemini 3.1 series of models to offer more choices for developers, including our cost-efficient Flash models. 3.1 Flash Live, our latest audio model, has improved precision and reasoning, making voice interactions more natural and intuitive. It’s now powering conversational features in search and the Gemini app. Speech-to-text is now available in 70 languages. And with 3.1 Pro, our Deep Research agent got a big upgrade, including MCP support and native visualizations.

Our generative media models are incredibly popular. Lyria 3 has generated over 150 million songs since launching on the Gemini app. Nano Banana 2 reached 1 billion images in nearly half the time of Nano Banana 1. And Veo 3.1 Lite is our most cost-efficient video model to date.

On top of this, we launched Gemma 4, our most intelligent open model. It’s been downloaded over 50 million times in just a few weeks. In fact, our open models have now been downloaded over 500 million times…

…This month, we integrated Nano Banana 2 to make personalized image creation possible in the Gemini app. Maps recently got its most significant upgrade in over a decade with Gemini. Users can now have a conversation with Maps and get more personalized suggestions and intuitive directions.

Alphabet’s management thinks Google Cloud has the widest variety of compute options with Alphabet’s custom TPUs and Axion CPUs, and NVIDIA GPUs; Google Cloud will be among the first cloud providers to offer NVIDIA’s Vera Rubin NVL72 systems; Alphabet recently introduced the 8th generation of TPUs that has a training variety and an inference variety; TPU 8t, the training variety, offers 3x the processing power and 2x the performance of the previous generation; TPU 8i, the inference variety, has 80% better performance per dollar in inference compared to the previous generation; Alphabet’s TPUs are powering the company’s AI research in both training and tooling; management will begin to deliver TPUs to select customers in their own data centers to expand the TPU opportunity; management expects to recognise most of the revenue of external TPU shipments in 2027; management does think about the ROIC of external TPU shipments compared to internal deployment

Our custom TPUs, Axion CPUs and the latest NVIDIA GPUs continue to form the industry’s widest variety of compute options. NVIDIA GPUs are a core part of our AI accelerator portfolio and will be among the first to offer NVIDIA Vera Rubin NVL72 in addition to the Blackwell and Hopper-based instances already available.

At Cloud Next, we introduced our 8th-generation TPUs, individually specialized for training and serving and able to take on the most demanding agentic workloads. TPU 8t provides high-performance model training with 3x the processing power of Ironwood and 2x the performance. TPU 8i delivers cost-effective, low-latency inference with 80% better performance per dollar than the prior generation. This exceptional infrastructure powers our world-class AI research that includes models and tooling, which continue to progress really well.

Our TPUs continue our leadership in performance, cost and power efficiency for customers like Thinking Machines Lab, Hudson River Trading and Boston Dynamics. As TPU demand grows from AI labs, capital markets firms and high-performance computing applications, we’ll begin to deliver TPUs to a select group of customers in their own data centers in the hardware configuration to expand our addressable market opportunity…

…We expect to begin recognizing a small percent of the revenues from these agreements later this year with the vast majority of revenues to be realized in 2027. It is important to keep in mind that revenues from TPU hardware sales will fluctuate from quarter-to-quarter, depending on when TPUs are shipped to customers…

…On the second question around TPUs, obviously, I would — we do think about it as what are we doing through Google Cloud to help our customers? And that’s the framework with which we think about it. In that context, there are situations where it makes sense. For example, you take customers like capital markets where they are running this highly performant AI workloads. They wanted TPUs in their data centers. So there are — and those trends are true across a diverse set of industries and in certain cases, frontier AI labs, too. And so we are opportunistic about it. But I do think we step back and think about it overall as the opportunity for Google Cloud. A lot of it is providing infrastructure through cloud. At times, it is direct sales of TPU hardwares to a select group of customers. But again, we do take ROIC approach. And some of it helps us get more economies of scale, scale in our overall compute environment as well. And so helps us invest in the cutting edge, which we need to do in the next generation as well.

Alphabet is using Antigravity, the company’s 1st-party agentic coding solution, to manage fully autonomous digital task forces

With Antigravity, we are shifting to truly agentic workflows. Our engineers are now orchestrating fully autonomous digital task forces and building at a faster velocity. Much more to come here. 

Google Search queries are at an all-time high, driven by AI; AI Overviews is driving overall search growth; AI Mode is seeing strong growth in both users and usage globally; management recently shipped agentic experiences in Google Search, such as restaurant booking, to new countries; management recently shipped the multi-modal capability, Search Live (where users can have voice conversations with AI while sharing their phone’s camera feed to study their surroundings), globally; search latency has been reduced by 35% in the past 5 years despite the new AI features introduced in Google Search; management has reduced the cost of responses by AI Overviews and AI Mode by 30% since they were upgraded to Gemini 3

AI continues to drive search usage and queries are at an all-time high. We continue to invest in improvements to AI Overviews, which are driving overall search growth and we are also seeing strong growth in both users and usage of AI Mode globally…

…We also shipped agentic experiences like restaurant booking to new countries and new multimodal capabilities like Search Live globally…

…Even as we have brought new AI features into our results page, we have reduced search latency by more than 35% over the past 5 years. And since upgrading AI Overviews and AI Mode to Gemini 3, we have reduced the cost of core AI responses by more than 30%, thanks to continued hardware and engineering breakthroughs.

Alphabet’s management thinks a key point of Google Cloud’s differentiation is its 1st-party solutions across the enterprise AI stack; Google Cloud’s enterprise AI solutions became Google Cloud’s primary growth driver for the first time in 2026 Q1; revenue from products built on Alphabet’s GenAI models was up nearly 800% year-on-year in 2026 Q1; new customer acquisition doubled in 2026 Q1 from a year ago; the number of $100 million to $1 billion deals doubled year-on-year in 2026 Q1; Google Cloud customers outpaced initial commitments by 45% in 2026 Q1, accelerating from 2025 Q4; Google Cloud recently introduced new capabilities across its vertical AI stack, including a new Gemini Enterprise Agent Platform that helps users build and manage agents; Gemini Enterprise paid monthly active users were up 40% sequentially in 2026 Q1; the partner ecosystem for Gemini Enterprise had 9x year-on-year growth in 2026 Q1 in seats sold by partners and the number of partners using Gemini Enterprise internally; 330 Google Cloud customers processed over 1 trillion tokens each over the last 12 months, with 35 processing over 10 trillion tokens each

Google Cloud is differentiated because we are the only provider to offer first-party solutions across the entire enterprise AI stack…

…Our enterprise AI solutions have become our primary growth driver for cloud for the first time. In Q1, revenue from products built on our GenAI models grew nearly 800% year-over-year. We are winning new customers faster with new customer acquisition doubling compared to the same period last year. We are seeing strong deal momentum, doubling the number of $100 million to $1 billion deals year-on-year and signing multiple $1 billion-plus deals…

…Customers outpaced their initial commitments by 45%, accelerating over last quarter.

At Cloud Next last week, we introduced hundreds of new capabilities across our vertically optimized AI stack that are designed to work together for our enterprise customers. We introduced a new Gemini Enterprise Agent Platform that empowers users to build, orchestrate, govern and optimize agents with the controls that enterprise customers need. Along with new capabilities in Gemini Enterprise app like Projects, Canvas, Long-Running agents and Skills, every employee can build agents.

In Q1, Gemini Enterprise paid monthly active users grew 40% quarter-over-quarter. That includes major global brands like Bosch, Citi Wealth, Merck and Mars Inc. Our partner ecosystem plays an increasingly critical role in driving Gemini Enterprise adoption. We saw 9x year-over-year growth, both in seats sold with partners and in the number of partners adopting it for internal use…

…Over the past 12 months, 330 Google Cloud customers each processed over 1 trillion tokens. 35 reached the 10 trillion token milestone.

Gemini is applied in YouTube for better matching and discovery between brands and creators; Gemini now powers YouTube Creator Partnerships; management has made it easier for advertisers to buy premium advertising space on YouTube; Supergoop! partnered with a YouTube creator for a Shorts and CTV campaign that led to a 93% lift for a product and a 55% overall brand lift

We are applying Gemini to drive better matching and discovery between brands and creators of all sizes. And Gemini now powers YouTube Creator Partnerships, a centralized platform integrated directly into YouTube Studio for creators and Google Ads for advertisers. 

We’ve also made it easier to buy premium ad space in top-tier podcast shows by curating the most watched podcasts into popular genres. For example, Supergoop! partnered with YouTube creator, Liza Koshy on a multi-format shorts and long-form CTV campaign, resulting in a 93% lift for their Glowscreen product and a 55% overall brand lift.

Waymo has so far launched in 6 new cities in 2026 and is currently in 11 major US cities; Waymo is now providing 500,000 rides per week (was 400,000 in 2025 Q4)

Waymo is on a great trajectory. It launched in Nashville a few weeks ago, that makes 6 new cities so far in 2026 and operations in 11 major U.S. cities in total. Waymo also surpassed 500,000 fully autonomous rides per week, doubling in less than a year.

Alphabet’s management is accelerating the deployment of Gemini across the company’s entire advertising infrastructure; the deployment of Gemini has led to new performance breakthroughs in advertising quality, advertiser tools, and new AI user experiences; Alphabet is making significant strides in improving relevance even when there isn’t a direct user query; advertising in Discover is getting better aligned with unique user interests; promoted pins in Maps are deeply relevant to user surroundings, location of interest, history and intent; Alphabet’s advertising relevance has increased by nearly 10%; Gemini is now powering Smart Bidding to more accurately match user intent to an advertiser’s product; management launched AI Max to help advertisers adapt to a new conversational way of searching by consumers; AI Max moved out of beta in April 2026; Hilton EMA used AI Max to capture 33% more clicks at 20% of the spend, and to increase average booking value by 55%; Etsy used AI Max to increase search volume by 10% with 15% of queries being net new; more than 30% of customer search spend now uses AI Max or Performance Max, and advertisers using the tools enjoy more conversions for the same spend; management is reinventing advertising formats for AI-native experiences; direct offers in AI Mode are resonating with users; management is testing a new advertising format in AI Mode that displays retailers who sell recommended products in the AI Mode’s answer to a query; management launched Universal Commerce Protocol (UCP) in January 2026; UCP has new members consisting of major technology companies; brands such as Sephora, Macy’s and Ulta Beauty have already rolled out UCP; Ulta Beauty recently launched agentic commerce experiences in AI Mode and the Gemini app; management has received great feedback on UCP and they think UCP will power a new checkout experience in AI Mode, Search, and the Gemini app

We are accelerating the deployment of Gemini across our entire ads infrastructure to help businesses reach more customers in more places than ever before. This is driving significant improvements across all areas of marketing and continues to fuel new performance breakthroughs across 3 areas critical for our customers’ success, ads quality, advertiser tools and new AI user experiences.

First, ads quality. AI is boosting our ability to deeply understand user intent for a given search query and to find the most relevant ad. Even when we don’t have a direct user query, we’re making significant strides in improving relevance. In Discover, new AI models and classifiers are driving higher relevance by better aligning ads with unique user interests. In Maps, we’re using Gemini to ensure promoted pins are deeply relevant to user surroundings, location of interest, history and intent. This work is improving ads relevance by nearly 10%, leading to significant increase in user engagement. We’re pairing this strengthened prediction-driven relevance with bottom-of-funnel precision. Over the past year, we’ve made over 20 improvements to search and shopping bid strategies. Smart Bidding now uses Gemini to match user intent to an advertiser’s product and services more accurately and further drive performance. This level of granularity was previously impossible to achieve at scale.

Second, on advertiser tools, where Gemini helps advertisers drive more efficient and effective campaigns. People no longer search in fragments. They search conversationally and share more context. We launched AI Max to help advertisers adapt to this new way of searching. And earlier this month, it moved out of beta with improved performance quality across targeting and creative capabilities. Take Hilton EMA, they captured 1/3 more clicks for 1/5 of the spend while simultaneously increasing the average booking value by 55%. And Etsy saw a 10% search volume uplift with 15% of those queries being net new to their business. We see significant opportunity as advertisers continue to make good progress on AI readiness and the adoption of AI tools. For instance, more than 30% of our customer search spend now uses AI-enabled campaigns, AI Max or Performance Max. And these advertisers are seeing more conversion for the same spend.

Third, how we monetize new AI user experiences in search? We aren’t just bringing existing ad formats into AI experiences. We are reinventing ads for this new era. Direct offers in AI Mode are resonating with users and continue to receive positive customer feedback. Gap, L’Oreal and Chewy are just some of the latest partners who have now signed up to test this Google Ads pilot.

We’re also exploring new formats for retailers. AI Mode already surfaces organic product recommendations based on the user’s query and we’re now testing a new ad format that displays retailers who sell those recommended products. In addition, the retail industry is rapidly coalescing around the open source Universal Commerce Protocol, or UCP, we launched in January in partnership with the ecosystem. Last week, we welcomed Amazon, Meta, Microsoft, Salesforce and Stripe as new members to the UCP Tech Council. They joined founding members, Shopify, Etsy, Target, Wayfair and Google to further accelerate the transition towards an agentic future. Partners like Sephora and Macy’s have joined companies like Ulta Beauty, who are already rolling out UCP and can now redefine consumer journeys from discovery to checkout. Ulta Beauty just last week launched agentic commerce within AI Mode and Search and the Gemini app. Shoppers can now review product recommendations, compare options and complete streamlined checkout for eligible purchases directly within AI Mode and Gemini…

…We’ve received tremendous feedback so far from hundreds of top tech companies, payments partners, retailers, really interested in integrating. And it will help power a new checkout experience in AI Mode, in Search and the Gemini app and allowing shoppers to actually check out from select merchants, right as they’re researching on Google and going through this journey.

Google Cloud had 63% revenue growth in 2026 Q1 (was 48% in 2025 Q4) driven by growth in GCP; GCP grew at a much higher rate than Google Cloud’s overall growth; Google Cloud’s growth was driven by AI solutions and AI infrastructure; Google Cloud operating margin was 32.9% (was 30.1% in 2025 Q4 and was 17.8% in 2025 Q1); Google Cloud backlog grew nearly 100% sequentially to $462 billion in 2026 Q1 (was $240 billion in 2025 Q4); most of Google Cloud’s backlog consists of GCP contracts, and just over 50% of the backlog is expected to be recognised as revenue in the next 2 years; Google Cloud’s impressive margin improvement was driven by leverage from revenue growth, and management’s insistence on running an efficient organisation

 Cloud revenues accelerated across all key areas and were up 63% to $20 billion. Revenue growth was driven by strong performance in GCP, which continued to grow at a rate that was much higher than cloud’s overall revenue growth rate. The largest contributor to cloud’s growth this quarter was AI solutions, driven by strong demand for industry-leading models, including Gemini 3. In addition, we had strong growth in AI infrastructure due to continued deployment of TPUs and GPUs and core GCP continues to be a sizable contributor driven by demand for infrastructure and other services such as cybersecurity and data analytics. Workspace again delivered strong double-digit revenue growth, driven by an increase in the number of seats and the average revenue per seat. Cloud operating income was $6.6 billion, tripling year-over-year and operating margin increased from 17.8% in the first quarter of last year to 32.9%.

Google Cloud’s backlog nearly doubled sequentially, reaching $462 billion at the end of the first quarter. The increase was driven by strong demand for enterprise AI offerings and the inclusion of TPU hardware sales that Sundar referenced earlier. The majority of the backlog is related to typical GCP contracts and we expect to recognize just over 50% of the backlog as revenue over the next 24 months…

…[Question] There’s a thesis out there that AI revenues are a lower margin in general but we are seeing margins improve. So more insights on just the cloud business and what’s driving that margin expansion.

[Answer] There are pushes and pulls across the business, including within cloud specifically. And I would start with the top line. When we see this robust strong revenue growth, both in Cloud and Google Services, it does provide leverage all the way down to the bottom line within the income statement. And you know we’ve been working hard to ensure we have — we’re running a productive and efficient organization. And it’s not just how we operate the business but even in areas such as our technical infrastructure, where we are investing the significant CapEx investments in our data centers and servers, we are looking at how we drive scientific process innovation within that organization. And that is reflected both in Cloud and Google Services as we allocate costs based on consumption. In the past, I did talk about the depreciation associated with these investments that is hitting both Google Cloud and Google Services. Google Cloud expanded margin quite significantly from a year ago, as you’ve seen in our numbers that we’ve just previewed. And a lot of it, again, is the top line growth that Google Cloud is providing or producing as well as an incredibly efficient way of running the business.

Alphabet’s management has raised capex guidance for 2026 to $180 billion to $190 billion (was previously $175 billion to $185 billion; 2025’s capex was $91.4 billion, which was itself up 65% from $55.4 billion in 2024, and 2024’s capex was up 69% from 2023); management is seeing unprecedented demand for AI compute; Alphabet’s investments in AI compute are delivering strong growth; management expects 2027’s capex to be much higher than 2026’s; management is investing in capex based on tangible demand signals and a ROIC framework; Google Cloud remains constrained by supply and would have grown faster in 2026 Q1 if supply was higher

…Wiz will be reported in the Google Cloud segment. And second, we expect a low single-digit percentage point headwind to cloud’s operating margin for the remainder of 2026 related to the acquisition…

…We are updating our full year 2026 CapEx guidance range to $180 billion to $190 billion, up from our previous estimate of $175 billion to $185 billion to now include investment related to the acquisition of Intersect, which closed in March.

We are seeing unprecedented internal and external demand for AI compute resources. The investments we are making in AI is delivering strong growth as evidenced by the record revenue and backlog growth in Google Cloud and strong performance in Google Services. Looking ahead, these strong results reinforce our conviction to invest the capital required to continue to capture the AI opportunity. As a result, we expect our 2027 CapEx to significantly increase compared to 2026. In terms of expenses, as we’ve discussed previously, the significant increase in our investment in technical infrastructure will continue to put pressure on the P&L in the form of higher depreciation expense and related data center operations costs such as energy. We also expect to continue hiring in key investment areas such as AI and cloud and are investing in marketing to support our AI products…

…You’ve seen us over the past several years increase CapEx every year. And we have done it very thoughtfully to meet the demand that we are seeing, both from external customers as well as demands across the organization. And you’re seeing the proof point, the ROIC on that in terms of just the growth rate we’re seeing, whether it’s growth rate within search or certainly the cloud business and the opportunity we have within the cloud backlog…

…I do think looking ahead, our ability to invest in this moment and stay at the frontier, I think puts us in a strong position. And I think we are doing it based on tangible demand signals we are seeing. And it’s not just on the revenue side but I’m talking from a ROIC framework and that’s what is helping us navigate this moment responsibly…

…We are compute constraint in the near term. And as an example, our cloud revenue would have been higher if we were able to meet the demand.

Amazon (NASDAQ: AMZN)

AWS grew 28% year-on-year in 2026 Q1 (was 24% in 2025 Q4) and is now growing at its fastest pace in 15 quarters; AWS’s run rate has reached $150 billion (was $142 billion in 2025 Q4); the last time AWS grew at a similar rate, it was half its current size; AI’s growth is unprecedented; AWS’s AI revenue run rate reached over $15 billion in the 1st 3 years of the AI wave, nearly 260x larger than AWS’s run rate in its own 1st 3 years; management thinks customers are choosing AWS for AI for 4 reasons, namely, (1) AWS’s broader capabilities, (2) customers want their AI inference to be where their other applications and data reside, and this happens to be in AWS, (3) customers want to consume non-AI services as they grow their AI usage, and AWS has a broad set of offerings, and (4) AWS has the strongest security and operational performance; AWS has won many new enterprise customers since 2025 Q4’s earnings call, including OpenAI, Anthropic, Meta Platforms, and NVIDIA; AWS continues to see strong growth in non-AI workloads as enterprises focus on cloud migrations; management is seeing customers who want to benefit from AI accelerate their migration to the cloud; management is seeing a strong correlation between customers’ AI spend and core growth in AWS; AWS’s AI revenue is growing triple digits year-on-year; AWS operating income in 2026 Q1 was $14.2 billion, reflecting 37.7% operating margin (was 35.0% in 2025 Q4 and 39.5% in 2025 Q1); AWS’s backlog is $364 billion in 2026 Q1 with significant sequential growth (was $244 billion in 2025 Q4), and the backlog has reasonable breadth and does not include a recent $100 billion deal with Anthropic

AWS growth continued to accelerate, up 28% year-over-year, the fastest growth rate in 15 quarters, up $2 billion quarter-over-quarter, the largest Q4 to Q1 AWS revenue increase ever. AWS is now a $150 billion annualized revenue run rate business. It’s very unusual for a business to grow this fast on a base this large. And the last time we saw growth at this clip, AWS was roughly half the size. We’ve never seen a technology grow as rapidly as AI…

…3 years after AWS launched, it had a $58 million revenue run rate. In the first 3 years of this AI wave, AWS’ AI revenue run rate is over $15 billion, nearly 260x larger.

There are several reasons customers are choosing AWS for AI. First, we’ve built broader capabilities than others…

…Second and another reason customers continue choosing AWS is that as they expand their use of AI, they want their inference to reside near their other applications and data and much more of it resides in AWS than any place else. Third, as customers expand their AI usage, they also want to consume additional non-AI services, and they’re choosing AWS because we’ve built the broadest and most capable core offerings by a wide margin. We offer thousands of features across compute, storage, databases, analytics, security and more, and Gartner consistently recognizes AWS’ leadership across their major cloud evaluation areas. Fourth, AWS is the strongest security and operational performance of any AI and infrastructure provider and start-ups, enterprises and governments continue to choose AWS as the foundation for their most critical workloads…

…Since last quarter’s call, we’ve announced new agreements with OpenAI, Anthropic, Meta, NVIDIA, Uber, U.S. Bank, Fox, Southwest Airlines, U.S. Army, Bloomberg, Cerebras, AT&T, Nokia, Fundamental, The National Geographic Society, PGA TOUR and many more…

…Moving to our AWS segment. Revenue was $37.6 billion and growth accelerated 480 basis points to 28% year-over-year, driven by both core and AI services. We continue to see customers increase cloud migrations and scale their use of AWS core services. Customers seeking the full benefit of AI are accelerating their transition to the cloud. We also see a strong correlation between AI spend and core growth. As customers spend more on AI, we see a corresponding demand increase in core. We expect this to increase over time as customers move more AI workloads into production, strengthening demand for our core services…

…Our AI revenue is growing triple digits year-over-year…

…AWS operating income was $14.2 billion and reflects our strong growth, coupled with our focus on driving efficiencies across the business…

…The backlog for Q1 is $364 billion. That does not include the recent deal that we announced with Anthropic for over $100 billion. There’s reasonable breadth in that as well. It’s not just 1 customer or 2 customers.

AWS’s chips business, including Graviton and Trainium, grew 40% sequentially in 2026 Q1; the chips business is now at a $20 billion annual revenue run rate (was $10 billion in 2025 Q4), and growing triple-digits; if AWS sold its chips as a stand-alone business, its annual revenue run rate would be $50 billion; AWS’s custom silicon business is now one of the top 3 data center chip businesses in the world; Anthropic and OpenAI both recently signed very large multi-year commitments for Trainium; Trainium now has $225 billion in revenue commitments; Trainium 2 has 30% better price-performance than competitor GPUs and is largely sold out; Trainium 3, which only started shipping at the start of 2026, is 30%-40% more price-performant than Trainium 2 and is nearly fully subscribed; Trainium 4 has already been reserved despite being 18 months from broad availability; Amazon Bedrock runs most of its inference on Trainium; Meta Platforms has committed to using tens of millions of AWS’s Graviton cores; Amazon management sees massive demand for CPUs as agentic AI, post-training, and inference scales up; Graviton has up to 40% better price-performance than other x86 CPUs; Graviton is used by 98% of the top 1,000 AWS EC2 customers; AWS is bringing in more Trainium chips than NVIDIA GPUs, but NVIDIA remains an important partner; management expects Trainium to eventually save AWS tens of billions of dollars of capex annually and provide several hundred basis points of operating margin; management believes that people will always want choice in models and chips; management is not yet selling Trainium racks to 3rd-party data centers, but thinks AWS could do so in the next few years

Our chips business continues to grow rapidly and is larger than what a lot of folks thought. We saw nearly 40% quarter-over-quarter growth in Q1, and our annual revenue run rate is now over $20 billion and growing triple-digit percentages year-over-year…

…If our chips business was a stand-alone business and sold chips produced this year to AWS and other third parties as other leading chip companies do, our annual revenue run rate would be $50 billion. As best as we can tell, our custom silicon business is now one of the top 3 data center chip businesses in the world, the speed at which we’ve gotten here is extraordinary…

…We’ve recently shared very large multiyear, multi-gigawatt Trainium commitments from the 2 leading AI labs in the world in Anthropic and OpenAI as well as an increasing number of companies like Uber betting on Trainium. And we now have over $225 billion in revenue commitments for Trainium. Our Trainium2 chip has about 30% better price performance than comparable GPUs and is largely sold out. Trainium3, which just started shipping at the start of 2026 and is 30% to 40% more price performance than Trainium2 is nearly fully subscribed. And much of Trainium4, which is still about 18 months from broad availability has already been reserved. Amazon Bedrock, which is used expansively by over 125,000 customers, runs most of its inference on Trainium and almost 80% of the Fortune 100 companies are using Bedrock.

We also just announced that Meta is committed to using tens of millions of Graviton cores. Graviton is our industry-leading CPU chip, which allows Meta to run the CPU-intensive workloads behind agentic AI with the performance and efficiency they need at their scale. AI is commonly seen as a GPU story, but the rise of agentic workloads, real-time reasoning, code generation, reinforcement learning and multistep task orchestration is driving massive CPU demand as well. As AI systems shift from answering questions to taking actions and as post-training and inference scale up, the compute required pulls heavily on CPUs. That’s why Meta chose Graviton, which delivers up to 40% better price performance than any other x86 processors and is now used by 98% of the top 1,000 EC2 customers…

…While the largest number of AI chips we’re bringing in are Trainium, we continue to have a deep partnership with NVIDIA. We have immense respect for them, continue to order substantial quantities. We’ll be partners for as long as I can foresee, and we’ll always have customers who want to run NVIDIA on AWS, and we will also have a very large chips business ourselves. Customers always want choice. It’s always been true and always will be true…

…At scale, we expect Trainium will save us tens of billions of dollars of CapEx each year and provide several hundred basis points of operating margin advantage versus relying on others’ chips for inference…

…But the one thing you learn over and over again with every technology, it was true in databases, it was true in analytics. It was true in models. It’s true in chips, too, by the way, is that customers want choice. There is not one tool to rule the world, and they want choice…

…On the question about Trainium and the notion of our selling racks over time, I do think that’s very much a possibility. Always, we have to balance — we have such demand right now for Trainium, and we have such demand from various companies who will consume as much as we make that we have to decide how much we’re going to allocate to the existing demand and customers and how much we’re going to save to sell as racks. And for our existing customers that we sell Trainium to, how many will be Trainium plus running on our cloud infrastructure versus just the chips themselves. But I expect over time, there’s a good chance we’re going to sell racks over the next couple of years.
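As a back-of-envelope illustration, the two price-performance claims above can be compounded to estimate Trainium 3's implied advantage over comparable GPUs. This is my own arithmetic on the ratios management cited, not a figure Amazon disclosed:

```python
# Back-of-envelope arithmetic compounding the price-performance claims above.
# The input ratios come from management's statements; the compounded result
# is an illustrative estimate, not a number Amazon disclosed.
t2_vs_gpu = 1.30               # Trainium2: ~30% better than comparable GPUs
t3_vs_t2 = (1.30, 1.40)        # Trainium3: 30%-40% better than Trainium2

implied_low = t2_vs_gpu * t3_vs_t2[0]    # ~1.69x comparable GPUs
implied_high = t2_vs_gpu * t3_vs_t2[1]   # ~1.82x comparable GPUs
print(f"Implied Trainium3 advantage over comparable GPUs: "
      f"{implied_low - 1:.0%} to {implied_high - 1:.0%}")
# → roughly 69% to 82% better price-performance, if the two claims stack
```

In other words, if both claims hold against the same GPU baseline, Trainium 3 would be roughly 69%-82% more price-performant than those GPUs.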

Amazon’s management remains confident in the returns generated by the company’s capex; much of the capex spent in 2026 will be installed in future years; customers have already committed to substantial portions of the 2026 capex; management sees attractive margins and ROIC (return on invested capital) for the 2026 capex; the faster AWS grows, the more short-term capex it has to spend, since AWS needs to pay for land, power, chips, and more, some 6-24 months in advance of monetisation; AWS’s capex funds assets with useful lives measured in years or even decades; AWS’s capex generates attractive cumulative free cash flow and ROIC a few years after being placed in service; Amazon’s free cash flow in the early years of high-growth periods for AWS is limited until the early capacity is monetised and revenue growth outpaces capex growth, and management has seen this cycle in AWS’s first big growth wave and expects similar positive outcomes from the current wave; management expects to continue making significant investments in AI; management has no change to Amazon’s 2026 capex plan (original guidance for 2026 was for $200 billion, up from $128 billion in 2025, and $83 billion in 2024); management first saw the trend of rising input prices for capex in 2025 H2 and has been working with suppliers to secure supply; management is seeing rising memory prices act as a push factor for companies to shift from on-premise infrastructure to the cloud

We continue to be confident in the long-term CapEx investments we’re making. Of the AWS CapEx we intend to spend in 2026, much of which will be installed in future years, we have high confidence this will be monetized well as we already have customer commitments for a substantial portion of it and that it will yield compelling operating margins and ROIC…

…The faster AWS grows, the more short-term CapEx we will spend. AWS has to lay out cash for land, power, buildings, chips, servers and networking gear in advance of when we can monetize it, typically 6 to 24 months before we start billing customers depending on the component. However, these CapEx investments fund assets with many-year useful lives, 30-plus years for data centers, 5 to 6 years for chips, servers and networking gear. The free cash flow and ROIC for these investments are cumulatively quite attractive a couple of years after being in service. However, in times of very high growth like now, where the CapEx growth meaningfully outpaces the revenue growth, the early years’ free cash flow is challenged until these initial tranches of capacity are being monetized and revenue growth outpaces CapEx growth. We’ve been through this cycle with the first big AWS growth wave and like the results. We expect to feel similarly about this next wave with much larger potential downstream revenue and free cash flow…

…We will continue to make significant investments, especially in AI, as we believe it to be a massive opportunity with the potential to drive long-term revenue and free cash flow…

…I don’t have an update on — a new update on capital. Our plan is largely the same…

…Everybody knows that the cost of these components, particularly memory has skyrocketed. And we’re just in a stage where there’s just not enough capacity for the amount of demand. We have worked very closely with our strategic partners. We saw this trend happening early in the kind of the middle of the latter part of last year, and we’ve worked with our strategic suppliers here to get a significant amount of supply. And so we’re working very closely with them. I think the team has been very scrappy. I think we’ve done a good job in making sure that we’re not capacity constrained there, but we’re watching that very closely.

One of the interesting things that we see right now with the change in price and in supply on things like memory is that it is a further impetus pushing companies who have on-premises infrastructure into the cloud. And it’s because, in meaningful part, these suppliers are prioritizing their very largest customers, which cloud providers are. And so we have seen a number of conversations we’ve been having with enterprises for many months, where it had just been slower getting the transformation plan to move to the cloud, accelerate rapidly just because we have a lot more supply than what others have.
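For context on the scale of the capex ramp, the figures in the summary ($83 billion in 2024, $128 billion in 2025, and ~$200 billion guided for 2026) imply back-to-back years of more than 50% growth. A quick sketch of my own arithmetic:

```python
# Year-over-year growth implied by the capex figures cited in the summary:
# $83B (2024), $128B (2025), and ~$200B guided for 2026. Illustrative only.
capex = {2024: 83, 2025: 128, 2026: 200}  # $ billions

growth = {year: capex[year] / capex[year - 1] - 1 for year in (2025, 2026)}
for year, g in growth.items():
    print(f"{year}: {g:.0%} year-over-year")
# → 2025: ~54% growth; 2026 plan: ~56% growth
```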

SageMaker, AWS’s model-building service, reduces training time of models by up to 40%; Bedrock, AWS’s fully-managed service for companies to build upon frontier models, had 170% sequential growth in customer spend in 2026 Q1; Bedrock processed more tokens in 2026 Q1 than all prior years combined; OpenAI’s latest models are already, or will soon be, available on Bedrock; Amazon management recently added the Amazon Bedrock Managed Agents feature, which helps organizations build generative AI applications and agents at production scale; Amazon Bedrock Managed Agents is powered by OpenAI, and OpenAI is seeing unprecedented demand for the product; Amazon management believes companies will derive the most value from AI through agents; Strands, AWS’s open-source AI agents SDK (software development kit), has been downloaded more than 25 million times, with downloads up 3x sequentially in 2026 Q1; AgentCore is used to deploy an agent as frequently as every 10 seconds; AWS has turnkey agentic solutions, including Kiro and Quick; Kiro, AWS’s coding agent, saw users double sequentially in 2026 Q1 and enterprise usage grow nearly 10x; Quick, AWS’s AI assistant, has seen new customers grow 4x sequentially in 2026 Q1; management recently launched the Quick desktop app, which helps improve the productivity of users; Amazon Bedrock now has over 125,000 customers; almost 80% of the Fortune 100 are using Amazon Bedrock; in 2025, AWS delivered a 4x improvement in Trainium 2’s token throughput for Bedrock, leading to more capacity to serve customers; management thinks having OpenAI’s models on Bedrock is a big deal; Bedrock is already serving 3rd-party models from all the non-OpenAI key players; management believes that people will always want choice in models and chips; management believes that most of the work being done with models in the future will be of the stateful variety; Bedrock Managed Agents is a feature unique to AWS

We’ve built broader capabilities than others. That includes model building with SageMaker, which reduces training time by up to 40%, high-performance inference with the leading selection of frontier models in Bedrock, which saw 170% growth in customer spend quarter-over-quarter and processed more tokens in Q1 than all prior years combined.

We’re excited to make OpenAI’s models available in Bedrock. Yesterday, we added OpenAI’s GPT-5.4 model with 5.5 coming soon. Yesterday, we also started the preview of Amazon Bedrock Managed Agents powered by OpenAI, the Stateful Runtime Environment that enables any organization to build generative AI applications and agents at production scale. We believe that modern agentic applications will be stateful, and this new technology will rapidly accelerate agentic AI adoption. OpenAI has said they’re already seeing unprecedented demand for this new product, and we’re seeing heavy customer interest as well.

Most of the value companies derive from AI will be through agents. In AWS customers can build agents with their proprietary data and Strands, which has been downloaded more than 25 million times and saw 3x more downloads quarter-over-quarter. Customers can deploy agents with enterprise scale, security and reliability with AgentCore, which is being used to deploy an agent as frequently as every 10 seconds. We also offer turnkey agents for coding, software migrations, business operations and knowledge workers in Kiro, Transform, Connect and Quick, and they continue to resonate with customers. The number of developers using Kiro more than doubled quarter-over-quarter and enterprise customer usage increased nearly 10x. Customers have used Transform to save over 1.56 million hours of manual effort when migrating and modernizing their workloads. The number of new customers using Quick has grown more than 4x quarter-over-quarter, and we just announced our Quick desktop app yesterday. It’s very compelling as it can query your e-mail, calendar, Slack, local files and several other applications you use every day to flag important communications, retrieve and summarize information, make recommendations, compose and send communications to others and create agents that highlight or automatically do work that you used to have to do yourself. You can easily keep refining your preferences and Quick’s advanced knowledge graph enables its AI agents to automatically learn from your interactions to become more personalized over time…

…Amazon Bedrock, which is used expansively by over 125,000 customers, runs most of its inference on Trainium and almost 80% of the Fortune 100 companies are using Bedrock…

…Bedrock has been a significant growth driver. In 2025, we delivered 4x improvements in Trainium2’s token throughput. And since the majority of Bedrock’s workloads run on Trainium, these efficiency gains directly translate into more capacity to serve customers…

…The fact that we’re going to have all of the OpenAI models available in Bedrock is a big deal. It’s a big deal for customers. And we have — we obviously have a very large amount of AI being done in Bedrock today on the models we have and this is Anthropic and Llama and Mistral and a host of others. But the one thing you learn over and over again with every technology, it was true in databases, it was true in analytics. It was true in models. It’s true in chips, too, by the way, is that customers want choice. There is not one tool to rule the world, and they want choice…

…Most of the model work and most of the AI has been done in these stateless models, kind of tokens in and tokens out. And while I think there will continue to be lots of work done that way, I think the future of using these models is a stateful model, a stateful API. And that’s because when you’re building agents, you’re building AI applications, you don’t want to start a new every time you interact with the model. You want to store state. You want to store identity, you want to store what the conversation or the actions have been, you want to reach out and do a little bit of compute here. You want to have the tools to be able to reach — the models reach out to the different tools to accomplish different tasks. And that only happens if you’re able to store state. And so the Bedrock Managed Agents that we collaborated with and invented with OpenAI that we just announced a preview of yesterday is also — I think that’s the future of how these agents are going to be built. It’s something that nobody else has, and I think it’s very exciting to our customers.
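The stateless-versus-stateful distinction Jassy describes can be sketched in a few lines of generic code. To be clear, this is a hypothetical illustration of the concept only; it does not use the real Amazon Bedrock Managed Agents API, and every name in it is made up:

```python
# A generic illustration of the stateless vs stateful distinction described
# above -- NOT the actual Amazon Bedrock Managed Agents API. All names here
# are hypothetical.

def stateless_call(model, prompt: str) -> str:
    # Tokens in, tokens out: every call starts from scratch.
    return model(prompt)

class StatefulSession:
    """Keeps identity and conversation history across turns, so an agent
    never has to start anew on each interaction."""

    def __init__(self, model, user_id: str):
        self.model = model
        self.user_id = user_id
        self.history: list[str] = []   # stored state across calls

    def send(self, prompt: str) -> str:
        # Each turn sees the accumulated conversation, not just the prompt.
        context = "\n".join(self.history + [prompt])
        reply = self.model(context)
        self.history += [prompt, reply]
        return reply

# Toy "model" that just reports how many lines of context it received.
toy_model = lambda text: f"seen {len(text.splitlines())} line(s)"

print(stateless_call(toy_model, "book a flight"))   # always 1 line of context
session = StatefulSession(toy_model, user_id="demo")
session.send("book a flight")
print(session.send("make it a window seat"))        # second turn sees prior turns
```

The stateless call sees one line of context every time, while the session's second turn sees the whole prior conversation, which is the property Jassy argues agents need.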

Amazon is able to deliver items faster while lowering its cost to serve, and management sees meaningful opportunities to further improve the fulfillment network’s productivity; Amazon’s latest generation of robotics offers a step change in efficiency; management is deploying the latest generation of robotics in both new and existing fulfillment facilities, and early results are positive

Overall unit growth of 15% continues to outpace our cost to operate the fulfillment network as outbound shipping costs grew 12% year-over-year and fulfillment expense grew 9% year-over-year, both on an FX-neutral basis. As our network efficiency improves, we’re able to deliver items faster and improve the customer experience while at the same time lowering our cost to serve. Looking ahead, we see meaningful opportunities to further enhance productivity across our global fulfillment network, all while continuing to raise the bar in delivery speed. We will keep optimizing inventory placement to shorten distance traveled, reduce touches per package and improve consolidation rates.

Alongside these efforts, we deploy robotics and automation, which have been integral to our operations for decades. Our latest generation technologies offer a step change in efficiency, which we’re deploying in both new and existing facilities. All of our U.S. large-format fulfillment center launches in 2026 will have this latest generation technology. We’re seeing early positive results with improved site safety, higher productivity and lower cost to serve.

Amazon management recently launched Health AI, a personal health agent

We launched Health AI, a 24/7 AI-powered personal health agent backed by One Medical clinicians that gives U.S. customers instant clinical guidance and takes action with their permission from booking appointments to managing prescriptions to facilitating medical treatment with a real One Medical provider.

Rufus, Amazon’s AI shopping assistant, saw monthly active users grow 115% year-on-year in 2026 Q1, and engagement increase by 400%; Rufus has improved a lot over the past year

Rufus, our agentic AI shopping assistant continues to resonate with customers. Rufus can research products, track prices and auto buy products in our store when they reach a set price. Monthly active users are up over 115% and engagement is up nearly 400% year-over-year…

…If you haven’t checked out Rufus in a while, it’s really substantially improved over the last year.

Amazon management recently launched a new AI-powered insights experience for sellers within Seller Central; the initial response has been very strong

We recently introduced a new AI experience for sellers in Seller Central that dynamically generates a custom, personalized visualization of data, key insights and scenarios tailored to the sellers’ goals. It’s early, but the initial response and feedback are very strong.

Amazon’s management recently expanded Creative Agent to more countries; Creative Agent is Amazon’s agentic offering that helps advertisers plan and execute the entire advertising creative process; management recently launched Sponsored Products and Brand Prompts in Rufus; nearly 20% of shoppers interacting with Brand Prompts in Rufus carry on the conversation about the brand

Our Ads team also continues to invent and deliver for advertisers with AI. For example, we expanded Creative Agent, an agentic partner that plans and executes the entire ad creative process to Canada, France, Germany, India, Italy, Spain and the U.K. And we recently introduced Sponsored Products and Brand Prompts in Rufus that help brands showcase products and customers make more informed buying decisions. It’s early, but we’re seeing nearly 20% of shoppers who interact with the Brand Prompts in Rufus continue the conversation about that brand.

Amazon’s management recently expanded early access to Alexa+ to Mexico, UK, Italy, and Spain; compared to the previous Alexa, users are completing 3x more purchases on device, streaming 25% more music, and using smart home functionality 50% more 

Alexa+ early access expanded to millions more Prime members in Mexico, the U.K., Italy and Spain. Customers are loving Alexa+, talking to Alexa twice as much and for longer durations across a wider breadth of topics, completing purchases on devices 3x more, streaming music 25% more and using smart home functionality 50% more than Alexa classic.

Amazon’s management continues to be very bullish on agentic commerce; management thinks agentic commerce will be very good for customers and Amazon in the long run; agentic commerce currently drives only a small fraction of the referrals that search engines do; management thinks the user experience with 3rd-party agentic commerce agents is still poor, as pricing and product information are often wrong, and the agents lack personalization data and shopping history; management is working with 3rd-party agent providers to improve the experience; management continues to think that the prevailing agentic shopping assistant will come from existing retailers that customers already have a good relationship with, and management is building Rufus to be that assistant; management thinks agentic commerce will be a great thing for Amazon’s advertising business for 2 reasons: (1) AI tools are making advertising much easier to do, which should drive a greater volume of advertisers, and (2) agentic commerce’s multi-turn conversations provide multiple opportunities to surface relevant products to customers

We are very bullish on what agentic commerce will look like. I think it’s going to be very good for customers in the long term. I think it will be good for us, too…

…We’ll do a lot of work with third-party horizontal agents to try and make that customer experience better. And by the way, I do think today, it reminds me in some ways the stage we’re in of what we saw in the early days of search engines and they’re trying to refer business to e-commerce. It’s never been a giant part of the referrals to our e-commerce business. But over the years, the experience got better. And what you see with agentic commerce is it’s a small fraction of what we see with the search engine referrals, but the experience just hasn’t gotten great with these third-party horizontal agents yet. They’re not often able to get the pricing right or the product information right. They don’t have any personalization data or any shopping history. And so we do want to see that get better with third-party horizontal agents. We’re having conversations with all those folks to try and make that better and find something that works for customers and all the companies.

And then it will be interesting over time which agents customers choose to use. I happen to think that if you’re going to a particular retailer that you’d like to do business with and you like to shop from, if they have a great agentic shopping assistant, you’re going to often start there because it’s where you’re doing your shopping, it’s easier to — they have better product information. They have better information about what other customers like you are buying. You can make all sorts of changes to how your account and your shipping information is working there. And so that’s what we’re aiming to make Rufus be is we’re aiming to have it be the best shopping assistant anywhere, and I think we’re on that path…

…On the Agentic Commerce and how that impacts advertising, I actually believe that we’re going to like this for advertising. I think it’s going to be good for customers, and it’s going to be good for our business. And I think, first of all, the first thing to remember is the way that our ads team has built tools and agents themselves is making it so much easier to do advertising. If you look at small and medium-sized businesses that had to take weeks and months to do creative and to pick the right audience, all of that is just — it’s so much faster and so much easier because of our advertising agentic tools. And you no longer have to take as much time or spend as much money building the creative.

So I think there are going to be a lot more advertisers with the rise of what’s happening in AI. And then if you look at the Agentic Commerce experiences, if you look at any of these agentic experiences, they tend to be multi-turn conversations where you’re not interacting with one search and getting an answer. You tend to find that you’re asking questions, you’re narrowing questions, it’s asking you questions on what you want. And in that process of having multi turns, there are multiple opportunities to surface relevant products to customers, many of which will be organic and some of which will be sponsored. And it also gives rise to opportunities like sponsored prompts.

In the 2025 Q4 earnings call, Amazon’s management said market demand for AI compute looked like a barbell with AI labs on one end spending a lot on compute for just a handful of applications, and with enterprises on the other end using AI for productivity purposes; now, management is starting to see enterprises using AI for brand-new experiences

The AI labs are spending an incredible amount of money on compute at this point and in compute, both on the AI side as well as on the core side. And the models that they’re building and the companies that have successful generative AI applications are certainly spending a lot. And there are several of those labs. But we also see quite a bit of enterprise adoption and usage of AI. As I’ve said before, the largest absolute place that we see enterprises having success is in projects that are around cost avoidance and productivities. These are things like automating customer service or business process automation or fraud or things of that sort. But the number of projects that we’re working with across enterprises and that we’re now starting to see to come to production around brand-new experiences, trying to figure out how to reinvent their current experiences, but using inference and AI to be smarter, also very significant. So we’re seeing the adoption in both of those segments.

Amazon’s management sees AI having a giant impact on how Amazon’s business is run internally; management believes AI will completely reinvent Amazon’s current customer experiences in the fullness of time; management is aware of the innovator’s dilemma that can trap incumbents like Amazon when reinventing customer experiences to be AI-native, and is actively working to avoid the trap; Amazon swapped out the engine of a service while the service was running at full tilt, using a team of just 5 people who used agentic coding tools to build the new engine in 65 days; the rebuild would previously have taken 40-50 people a year

On the use of AI internally and for our current businesses, I think that the shortest first summary I could give you, Colin, is that I do not see a place in any of our businesses or any of the ways that we do work where we’re not going to have giant impact on what we do. I think I’ve long had this belief that while you can add incrementally to a lot of your existing customer experiences, different agentic and AI experiences, I really believe that in the fullness of time, and I don’t know if that’s 3 years from now or 5 years from now or it could be sooner, too, that all of these customer experiences we know are going to be completely reinvented…

…It’s tricky for — if you have an existing business that’s doing well. But you have to look at every single one of your customer experiences and you have to be able to carve off resource for that team to think anew about what would the future customer experience look like if you started from scratch today, and if you had all the technologies like AI available to you when you started. And that is what we’re doing in every single one of our experiences…

…If you look at one of our services, we swapped out the engine of the service while we are also running the service full tilt. And normally, that would have taken 40 or 50 people about a year to do, and we took 5 really smart people, AI forward-thinking people building on agentic coding tools and those 5 people rebuilt it in 65 days. Like that is a very different world of operating. And that’s the world I think we’re heading to over the next few years.

Apple (NASDAQ: AAPL)

The iPhone 17 family contains the A19 and/or the A19 Pro chips, which include neural accelerators to deliver strong AI capabilities

During the quarter, we welcomed iPhone 17E, the newest addition to what is already the strongest iPhone lineup we’ve ever had. It brings outstanding performance and core iPhone experiences at a remarkable value for everyone from enterprise teams to consumers. Across the lineup, this is the most powerful, capable and versatile iPhone family we’ve ever created. That starts with the latest in Apple silicon for iPhone, A19 and A19 Pro, which include neural accelerators in the GPU to deliver a huge boost to AI performance

Apple’s management thinks the Mac is the best platform for AI, with Apple’s in-house chips giving Macs the ability to run advanced AI models on-device; the MacBook Air now comes with the M5 chip, which enables the product to run AI models on device; the MacBook Pro has even more advanced versions of the M5 chip in M5 Pro and M5 Max

From Mac Mini to MacBook Pro and everything in between, Mac is the best platform for AI with Apple Silicon delivering exceptional performance, industry-leading efficiency and the ability to run advanced models locally in ways that simply weren’t possible before…

…We’ve also further improved MacBook Air, already the world’s most popular laptop with M5, making everyday tasks faster and more responsive than ever. MacBook Pro reaches new heights with M5 Pro and M5 Max, delivering extraordinary performance and dramatically advancing what users can do with AI on a portable system…

Apple’s new AirPods Max 2 has Apple’s most advanced active noise cancellation technology; AirPods can now do live translation, thanks to Apple Intelligence

During the quarter, we introduced customers to a new level of audio experience with AirPods Max 2, delivering stunning sound quality and our most advanced active noise cancellation yet…

…AirPods can bridge languages too, thanks to Live Translation powered by Apple Intelligence.

Apple Intelligence now has more powerful capabilities such as visual intelligence and cleanup; management is looking to launch a more personalised Siri later in 2026; Apple Intelligence is powered by Apple’s self-designed chips; management is not treating AI as a stand-alone feature but as an essential, intuitive part of the experience across Apple’s devices

In addition to live translation, Apple Intelligence brings together dozens of powerful capabilities from visual intelligence to cleanup and photos that are seamlessly integrated into the moments that matter most to our users every day. And we look forward to bringing a more personalized Siri to users coming this year. What truly sets Apple apart is how Apple Intelligence is woven into the core of our platforms, powered by Apple Silicon and designed from the ground up to deliver intelligence that is fast, personal, and private. This is not AI as a stand-alone feature, but AI as an essential intuitive part of the experience across our devices. It builds on years of innovation from the neural engine to advanced on-device processing, enabling capabilities that are not only incredibly powerful, but also respectful of user privacy.

Reminder that in 2025, management committed to invest $600 billion over 4 years (was a $500 billion commitment in 2025 Q2; Apple has around $190 billion in gross profit per year, for perspective) in the USA in areas such as advanced manufacturing, silicon engineering and artificial intelligence; Mac mini production is coming to the USA later in 2026, with a new facility in Houston; in March 2026, management brought 4 new companies into Apple’s American manufacturing program; Apple is on track to buy well over 100 million advanced chips from TSMC’s Arizona fab; later in 2026, Apple will open its advanced manufacturing center in Houston to provide hands-on training for students, supplier employees and American businesses

We’re also making great progress in advancing American supply chain innovation. As part of our $600 billion commitment to the U.S., we were pleased to share recently that Mac mini production is coming to America later this year, expanding our factory operations in Houston with a brand-new facility. In March, we were thrilled to welcome 4 new companies to our American manufacturing program to help manufacture essential materials and components for Apple products sold worldwide. These include sensors that support key iPhone features like camera stabilization and integrated circuits essential for features like crash detection and activity tracking. These efforts build on the progress we’ve made in the American manufacturing program, including the work we’re doing to advance an end-to-end silicon supply chain across the U.S. At TSMC’s Arizona facility, for example, Apple is on track to purchase well over 100 million advanced chips.

As we’re accelerating our long-standing support for U.S. innovation, we’re also investing in America’s workforce. We’re looking forward to opening the doors to an all-new advanced manufacturing center in Houston later this year, which will provide hands-on training led by Apple experts and tailor-made for students, supplier employees and American businesses.

The Mac Mini and Mac Studio models are great devices for AI and agentic tools, and so demand was greater than management expected; management thinks the supply constraints with the Mac Mini and Mac Studio will take several months to resolve; management’s guidance for 2026 Q2 already embeds significantly higher memory costs; management thinks memory costs will have an increasing impact on Apple’s business beyond 2026 Q2

You look forward to the June quarter, the majority of our supply constraints will be on several Mac models given the continued high levels of demand that we’re seeing. And we have less flexibility in the supply chain than we normally would. For Mac, in the June quarter, there’s 2 factors that are driving the constraints. One is that on the Mac Mini and the Mac Studio, both of these are amazing platforms for AI and Agentic tools. And the customer recognition of that is happening faster than what we had predicted. And so we saw higher-than-expected demand. The second reason is that the customer response to Mac Neo has just been off the charts, with higher-than-expected demand…

…We think looking forward that the Mini and the Mac Studio may take several months to reach supply-demand balance…

…I’ll go back to December for a moment and just walk you through the chronology. In the December quarter, we really had a minimal impact due to memory, and you can kind of see that in the gross margin results. We said it would be a bit more in the March quarter, and we did see higher memory costs in the March quarter, and they were partially offset by benefits from carry-in inventory that we had. For the June quarter and what’s embedded in the guidance that Kevan went through earlier, we expect significantly higher memory costs. They are also partly offset by the benefit of carry-in inventory. And then where we don’t give color beyond June, I can tell you that beyond the June quarter, we believe memory costs will drive an increasing impact on our business.

Apple’s management has been investing more in AI in both products and services, and this shows up in the company’s operating expenses, specifically in R&D (research and development); the increased investments in AI include building Apple’s own foundation models, and in the collaboration with Google; Apple’s collaboration with Google on foundation models is going well

[Question] As we think longer term, do you think Apple will invest more? Where will Apple invest more heavily over the next several years? And is this at all related to your net cash comments in terms of perhaps building out more infrastructure as we enter an AI-centric world?

[Answer] We are clearly investing more. You can see that in the OpEx numbers. And if you click down on those a step deeper and look at the R&D area separate than SG&A, you’ll find that R&D is even accelerating much higher than the company is. And so we are clearly investing. We’re investing in products and services, and we see opportunities in both of those…

…We believe AI is a really important investment area for Apple, and we’re going to be doing that incrementally on top of what we normally invest in our product road map…

…[Question] Last quarter, you did talk about Apple foundational models and sort of the two-pronged strategy there of the collaboration with Google as well as continuing to internally sort of work on your own models. Hoping you can sort of give us an update in terms of how you’re able to balance those 2 priorities as well as do you feel like you need to double down and invest more to be able to balance those 2 priorities side by side?

[Answer] We are investing more. You can see that in the OpEx numbers. And as I’ve mentioned before, the R&D, in particular, is — has scaled rather significantly on a year-over-year basis. The collaboration with Google is going well. We’re happy with where things are, and we’re happy with the work that we’re doing independently as well.

ASML (NASDAQ: ASML)

ASML’s management is seeing the semiconductor industry’s growth continue to solidify, driven by AI investments, and this applies to both advanced Memory and advanced Logic; management thinks semiconductor supply will not meet demand for the foreseeable future, and this is creating constraints in end markets, including AI; management is seeing ASML’s Memory customers being asked to ramp supply; ASML’s memory customers are sold out for 2026, with supply constraints extending beyond the year; management is seeing ASML’s Logic customers building capacity, including for the 2nm node to meet AI demand and mobile demand; management is seeing ASML’s customers increasing their capital expenditure to ramp up their capacity, and this capacity is supported by long-term commitments from their customers; management is seeing ASML’s Memory customers and Logic customers increase their adoption of EUV and DUV immersion lithography; the level of demand for ASML’s DUV immersion lithography systems in 2025 was significantly lower than what’s currently seen; besides DUV immersion, management is also seeing strength in the DUV dry lithography business; management has seen major adoption of EUV by ASML’s DRAM customers in 2025 because EUV provides better performance; DRAM has been a really good story for lithography intensity in 2025; ASML’s customers have been very open with the company on their expansion plans

We see that the semiconductor industry growth continues to solidify. This is still very much driven by investments in AI infrastructure. So, this translates into a lot of demand for advanced Memory, for advanced Logic. We expect in fact that the supply will not meet the demand for the foreseeable future. So, this is creating a strong constraint in the end markets from AI to mobile and PC. As a result our customers are strongly invited to create more capacity. So if we look at Memory, what our customers tell us is that they are sold out for 2026. And their supply constraints will last beyond 2026. For advanced Logic, we see our customers building capacity for several nodes, while they also continue to ramp 2 nm in order to address the AI products…

…We see our Memory and Logic customers increasing their capital expenditure and trying to accelerate basically their capacity ramp in 2026 and beyond. What’s also very interesting is that a lot of this demand is supported by long-term commitment from their customers. On top of that, we see both Memory customers, DRAM customers and advanced Logic customers continuing to increase their adoption of EUV, but also immersion. So this translates basically into higher lithointensity and a higher litho demand for ASML…

…When it comes to immersion DUV, we actually had a bit of a slow start because in the course of last year, we were looking at a significantly lower demand for immersion. That has now reversed itself…

…I already mentioned what we’re doing on immersion, but also the dry business is doing quite nicely…

… In the Logic business, our customers are adding capacity across multiple advanced nodes to support demand while continuing to ramp the 2-nanometer node in support of next-generation HPC and mobile application…

…We have seen a major adoption of EUV in DRAM in 2025. And you may have noticed that our, I will say, U.S. DRAM customer also made this announcement that they were shifting also pretty strongly on EUV. And the reason for that is, of course, performance, but it’s also capacity because if you are going to use more EUV layers, you are going to need less multi-patterning and multi-patterning takes a lot of space also in the fab. So I think this is also definitely another argument in favor of EUV. I think this was mentioned, by the way, by this U.S. customer in their call. So I would say the first results of that is, first, more adoption of Low NA EUV…

…DRAM has been really a good story when it comes to litho intensity in ’25…

…Customers are very, very open. By the way, that’s also the case on the Logic side. But very — customers are very open to us, and they’re very openly discussing with us also their expansion plans for this year, but also beyond.
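The multi-patterning argument in the quotes above has simple arithmetic behind it: each critical layer done with DUV multi-patterning needs several exposure passes, while EUV resolves the same layer in one. The sketch below illustrates this with hypothetical layer counts; it is not based on any disclosed customer data.

```python
def litho_passes(critical_layers, passes_per_layer):
    """Total exposure passes for a set of critical layers:
    multi-patterning needs several DUV passes per layer,
    while EUV resolves a layer in a single exposure."""
    return critical_layers * passes_per_layer

layers = 5                           # hypothetical critical DRAM layers
duv_quad = litho_passes(layers, 4)   # quad patterning: 20 passes
euv = litho_passes(layers, 1)        # single EUV exposure: 5 passes
print(duv_quad, euv)
```

Fewer passes means fewer tools and less floor space per wafer start, which is the capacity argument for EUV that the US DRAM customer is said to have made.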

ASML’s management does not want EUV systems to be the bottleneck in building compute capacity for AI; EUV systems are not the bottleneck today

We do not want EUV to be the bottleneck. So I think I’d like to say that very, very strongly…

…I know the question of bottleneck comes back very often. I think we don’t feel at all that we are the bottleneck today.

Intel (NASDAQ: INTC)

Intel’s management expects sustained momentum for the company’s Xeon server CPU products in 2026 and 2027, with the Xeon 6 being Intel’s fastest new product ramp in 5 years alongside the Core Series 3 products; Xeon’s momentum is powered by the reinsertion of CPUs as a foundation for AI, where the CPU-to-GPU (accelerator) ratio is swinging back in the CPU’s favour; management thinks the CPU’s resurgence in AI is great news for Intel’s x86 CPU ecosystem; Intel saw strong ASIC growth in 2026 Q1 sequentially and year-on-year; Intel’s DCAI (Data Center and AI) segment signed multiple long-term agreements in 2026 Q1; Xeon 6 was recently selected as the host CPU for NVIDIA’s DGX Rubin NVL8 systems; Xeon remains the most deployed host CPU for AI systems; DCAI recently started a multiyear collaboration with SambaNova to design a next-generation AI inference architecture; management’s confidence in the sustained growth of CPUs for AI is growing; management’s outlook for server CPU demand has improved in 2026 Q1; management expects the server CPU industry to have a strong year of double-digit unit growth in 2026, extending to 2027; the long-term agreements signed by DCAI have volume and pricing terms, and last 3-5 years; Intel’s customers are telling the company that CPUs are more important in AI inferencing and agentic AI than in AI training, with the GPU-to-CPU ratio falling from 8:1 towards parity, and possibly below; management believes Intel’s CPUs will be very effective competitors to the likes of ARM, AMD, and the hyperscalers

Demand continues to run ahead of supply for all our businesses, especially for Xeon server CPUs, where we expect sustained momentum this year and next. Intel 3-based Xeon 6 and Intel 18A based Core Series 3 products are now in full volume production ramp and each represents the fastest new product ramp in 5 years…

…For the last few years, the story around high-performance computing was almost exclusively about GPU and other accelerators. In recent months, we have seen clear signs that the CPU is reinserting itself as the indispensable foundation of the AI era. CPU now serves as the orchestration layer and critical control plane for the entire AI stack. This is not just our wishful thinking, it is what we hear from our customers, and it is evident in the demand profile for our products. Xeon server demand is seeing strong and sustained momentum. Customers are deploying server CPUs along accelerators in the ratio that is moving back towards CPU. The accelerator remains central to Frontier AI, and we will continue to participate, innovate and partner in that category. Our recent announcement with SambaNova Systems is an example of such partnership on heterogeneous compute architectures. But the backbone of AI computing in production remain a CPU anchored architecture. That is good news for the x86 ecosystem. It is great news for Intel…

…We also saw strong ASIC growth with revenue up more than 30% sequentially and nearly doubling year-over-year…

…Within the quarter, DCAI signed multiple long-term agreements, including Google, supporting our view that the current business momentum is sustainable. In addition, Xeon 6 was selected as the host CPU for NVIDIA’s DGX Rubin NVL8 systems, and Xeon remains the most deployed host CPU due to its industry-leading memory, security and networking orchestration. Lastly, DCAI also established a multiyear collaboration with SambaNova to design a next-generation heterogeneous AI inference architecture combining SambaNova’s RDUs and Intel Xeon 6 processors…

…Our confidence in the sustained growth of CPUs driven by the AI infrastructure build-out is growing. Our outlook for server CPU demand has improved over the last 90 days, and we expect a strong year of double-digit unit growth for the industry and for us with momentum extending into 2027…

…Most of these agreements are structured with volume and pricing, and they are usually somewhere between 3 and 5 years…

…The feedback from the customer, CPU is very important when you move from training to inference. Inference side, I think in terms of orchestration, control plane and also managing all the different agent with data, CPU is much more efficient. So I think the ratio of CPU to GPU used to be 1 and 8, and now it’s 1:4 and I think towards parity or even better…

…One statistic that we look at is the ratio of CPUs to GPUs. And if you look at training solutions, they’re generally running in the kind of 7 to 8 GPUs to 1 CPU. As we look into inference, it’s probably getting into like the 3 to 4:1 kind of level. And as you get into agentic and multi-agent, it’s one potentially even flip in the other direction a little bit…
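The ratio arithmetic in the quotes above can be made concrete. The sketch below computes CPU demand for a given GPU fleet at the rough ratios cited on the call; the fleet size and function name are hypothetical, chosen only to show how a shift from 8:1 towards parity multiplies CPU unit demand.

```python
import math

def cpus_needed(gpus, gpus_per_cpu):
    """CPUs required to host a GPU fleet at a given GPU:CPU ratio."""
    return math.ceil(gpus / gpus_per_cpu)

fleet = 100_000                      # hypothetical GPU fleet size
training = cpus_needed(fleet, 8)     # ~8:1 for training -> 12,500 CPUs
inference = cpus_needed(fleet, 4)    # ~4:1 for inference -> 25,000 CPUs
agentic = cpus_needed(fleet, 1)      # parity for agentic -> 100,000 CPUs
print(training, inference, agentic)
```

On these assumptions, the same accelerator fleet needs eight times as many CPUs at parity as at the 8:1 training ratio, which is the demand swing management is describing.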

…[Question] On server CPU competition. So both when we look at competition versus x86 against AMD, do you think you are gaining share? Do you expect to gain share against them? And then broader, I think the competition against Arm because NVIDIA is planning to launch a stand-alone Vera CPU Rack. Recently, we heard Amazon talk up their Graviton option. I think Google yesterday said they would launch Axion and connect it with every TPU. So just kind of near term, how do you look at competition versus AMD and x86?

[Answer] The CPU is a great demand right now. I think we all enjoy that. And then in terms of our product road map, we have been fine-tuning the last year… We are laser-focused on execution. Multithreading, I think we are putting in. So we’re going to have Coral Rapid, have the multithreading that we can compete effectively with AMD. And we try to accelerate that Coral Rapid ahead. And then the other part is we’re also looking at some of the architecture, CPU and GPU architecture… In all, I think we have the team, we have the technology road map. I think we’re going to be — over time, going to be a very effective competitors to them.

Intel’s management sees the semiconductor industry’s addressable market approaching $1 trillion, driven by AI demand, and the company is well positioned to benefit

Driven by tremendous demand for AI, the semiconductor industry TAM is now approaching $1 trillion. Intel is well positioned to benefit from this demand with 3 strategically important assets: our x86 CPU franchise, our advanced packaging technology and our vast manufacturing network.

Intel’s management sees AI moving into the real world, with more distributed inference

Artificial intelligence is now moving into the real world towards a more distributed inference and reinforced learning workloads like agentic, physical AI and robots and edge AI.

Intel’s management is pleased with the progress of the company’s foundry technology development, but it will be a long journey; the manufacturing yields of the Intel 3 and Intel 18A process technologies are now running ahead of management’s projections; Intel continues to make progress in advanced packaging technologies, with additional customer backlog growth in 2026 Q1; Intel’s 14A process technology is now at a higher level of yield compared to 18A at a similar point in time, and the company is developing PDKs (process design kits) with multiple customers; management expects to see design commitments for 14A in 2026 H2 and 2027 H1; the progress of Intel Foundry has driven the company to land more of its own future product tiles on the Intel 14A process; Intel Foundry will be supporting TeraFab, the huge semiconductor project undertaken by Elon Musk’s companies; management wants to work with TeraFab to improve the manufacturing efficiency of semiconductors; rising prices for memory chips and other materials are a headwind for Intel Foundry’s gross margin in 2026 H2; management will continue to utilise a multi-foundry approach for Intel; Intel Foundry’s advanced packaging business is seeing demand in the billions of dollars; Intel Foundry’s advanced packaging is a differentiated offering – it allows customers to use larger reticles – and so it’s getting attractive pricing; Intel Foundry’s 18A yields are going to hit management’s end-2026 targets by the middle of the year; most of Intel Foundry’s supply is for internal demand at the moment, but management expects it to win customers over time

The accelerating deployment of AI infrastructure creates a meaningful opportunity for us as we continue to build our external foundry business. I’m pleased with the progress we have made in foundry technology development over the last year, even though I will continue to remind you this will be a long journey for us. We have made steady progress with Intel 4 and Intel 3 and 18A yields are now running ahead of the internal projections, representing a meaningful inflection in our execution and our factory finished good output.

We also continue to make steady progress on our advanced packaging technologies, including additional growth in customer backlog in the quarter.

Intel 14A maturity yield and performance are outpacing Intel 18A at a similar point in time, and we continue to develop PDKs with multiple customers actively evaluating the technology…

…We expect to see earlier design commitments emerge beginning in the second half of 2026 and expanding into the first half of 2027…

…I’m particularly pleased that our progress today has driven us to land more of our own future product tiles on Intel 14A as well. At a time when advanced wafer capacity is in the short supply, this enables us to have better control over our supply chain…

…As we look to continue challenging the status quo, I can think of no better partners than Elon Musk. We recently announced our partnership with SpaceX, xAI and Tesla to support Terafab. Elon and I share a strong conviction that global semiconductor supply is not keeping pace with the rapid acceleration in demand. We are excited to explore innovative ways to refactor silicon process technology, looking for unconventional ways to improve manufacturing efficiency that will eventually lead to a dynamic improvement in the economics of semiconductor manufacturing…

…Our foundry team is delivering consistent yield and throughput improvements across all process nodes, which will help gross margins. With that said, Intel 18A is still early in its ramp and rising input costs, especially in memory, present growing headwinds in the second half that we need to overcome…

…I’d say the one cautionary concern I have on gross margin in the back half of the year is just some of the materials have gone up in terms of cost, substrates are going up, T glass. We’ve got memory going up, as you know. So those things offset some of the improvements that we’re having through the year…

…TSMC is a very important partner for us. Morris and C.C. have decades of friendship. And then clearly, with our product group will decide which is the best foundry. So I think we’re going to use a multi-foundry approach, our own internal and also external. And so we really have good relationship, continue to build from both sides to benefit the customer…

…[Question] I would love to kind of level set where we are on the advanced packaging front. You talked about rising backlog. Anything you can share in terms of what that number looks like?

[Answer] We have been really pleased with our traction there. And I think maybe naively, I had thought that these opportunities would come in the hundreds of millions of dollars level. But so far, what we’re seeing is that their demand is more in the billions of dollars per year kind of level. So this is going to be a big part of the foundry revenue as we get through this decade. And the good news is advanced packaging really is a differentiated offering for us, and it does a lot for the customer in terms of allowing them to use larger reticles. So there’s real value to the customer. And as a result, we get very attractive pricing relative to some of the other areas of the foundry business…

…18A yields are somewhat a closely guarded proprietary piece of information for us. So we don’t typically — I would just say Lip-Bu had a target as we came into the year for the end of this year, and we’re probably going to hit that probably the middle of this year…

…[Question] As we think about your capacity tightness, the leading edge foundries are also quite tight as well. Has this driven any near- to medium-term share gains?

[Answer] All the supply right now or the lion’s share of the supply is all internal, but we do expect, obviously, to win customers over time.

Intel’s AI-driven businesses are now 60% of revenue, and were up 40% year-on-year in 2026 Q1

AI-driven businesses now represent 60% of revenue and grew 40% year-over-year.

Intel’s management now expects capital expenditures to be flat in 2026, but the actual dollar amount spent on tools will be up 25% in 2026, as management is seeing a lot of demand and wants to catch up on supply

We forecast capital expenditures in 2026 to be flat to last year versus our prior expectation of flat to down, reflecting increased capacity investments to support committed demand and a continued emphasis on improving fab productivity and output. We now expect expenditures to be roughly equal across the year and still to be heavily weighted towards the equipment that directly grows wafer outs to support growth this year and next…

…In the last few years, a lot of our CapEx spending was space. And I think we’re actually in a pretty good position in space. We wanted to have white space available to move into when needed. And I think Lip-Bu and I both feel like we’re in a good place. So we actually will be bringing the space spend down pretty materially, even though the total is flat. And so what that means is the tool spend is actually increasing pretty significantly. In fact, tool spending will be up year-over-year 25% or so. And so that’s, I think, a function of the fact that we just see a lot of demand, and we want to make sure we’re catching up on the supply front.

Intel’s management thinks the ASIC business will be a fast-growing one for the company in the next 5 years; the ASIC business is already at a run rate of more than $1 billion

[Question] On the ASIC business, Dave, I think you said it doubled year-on-year. If you could maybe help us with what is included in that? I believe it’s IPUs, but I just want to get a better sense how big it is.

[Answer] Stay tuned on that one, the next 5 years is going to be a fast growing for us…

…One thing that people have been surprised about is how big the business is already. It’s at a run rate that’s north of $1 billion already.

Intuitive Surgical (NASDAQ: ISRG)

The da Vinci 5 captures real-world surgical data at greater scale and fidelity, enabling deeper surgical insights; the surgical insights captured by da Vinci 5 will be used by Intuitive Surgical for AI-enabled capabilities; management expects to add telesurgery and more automation to Intuitive Surgical’s robotic surgery platforms over the long term; management believes that AI will help Intuitive Surgical to move its Quintuple Aim forward; the data captured by da Vinci 5 includes video, kinematic, and force data; the AI-powered insights that management wants to deliver to customers can take the form of operational guidance, support for the learning of a surgeon or care team, and assistance in the operating theatre itself

da Vinci 5 captures real-world surgical data at greater scale and fidelity, enabling deeper insight into how procedures are performed in practice. That insight paired with clinical context from connected electronic medical records, provides better understanding of variation, workflow and outcomes, and informs current and planned digital and AI-enabled capabilities…

…Collectively, these efforts are foundational to our long-term digital and AI road map where we expect to add telesurgery, deeper decision support and augmented dexterity, including aspects of future automation, all in pursuit of advancing the Quintuple Aim…

…We believe, yes, that AI will be a contributor to moving the Quintuple Aim forward…

…It starts with high-quality data, and that data will exist in video data from surgeries. It will exist in robotic data streams like kinematic data and force data. It will exist in connected electronic medical records, where we’re working with customers to do so. And once we have that high-quality data set, then the job of our AI and our data scientists is to turn that into meaningful insights…

…So there are, I think, ways in which this will show up to the customer. Some will be as operational guidance and assistance as they look at their hospital robotic program and want to increase efficiencies or understand costs. Some of it may show up in the learning of a surgeon and/or a care team. But a lot of it will show up in the operating room and I think show up in the surgery itself. And an example of this kind of first phase might be AI-enabled anatomy identification where you can see AI showing critical structures in the surgical field, showing tissue planes to help assist the surgeon. Then, over time, what we expect is that many of those same foundations that are being established and built in kind of that first phase, if you will, will support more advanced assistance around augmented dexterity and it will include — likely include aspects of automation. There, an example might be helping to control the camera as the surgeon is focused on the procedure.

Intuitive Surgical’s management thinks the company’s differentiation in AI comes from its installed base of da Vinci 5 systems, and the number of procedures performed by the systems annually which generates unique data

How do we sit, how do we exist within the AI ecosystem and how are we differentiated? I think part of that differentiation is around the installed base of systems that we have out there, including about the 1,500 da Vinci 5 systems, the 3 million and more procedures that are being done on an annual basis. And I believe that gives us the foundation to strengthen the differentiation over the next 3 to 5 years. If you look at the industry and you say, what is broadly available, broadly available to everyone, it’s things like edge and cloud compute, the math that underscores much of this, some of the training algorithms. Our advantage, we believe, lies in the unique data sets that are available to us today through something like Force Feedback and will be increasingly available to us as we add capability to da Vinci 5.

Mastercard (NYSE: MA)

Mastercard is working with key players in the agentic commerce ecosystem, including Google, Microsoft, and OpenAI; Mastercard is partnering with OpenAI on Mastercard Agent Pay, which enables agent-to-agent payments; nearly all Mastercards globally are enabled for Mastercard Agent Pay; Mastercard’s management launched Verifiable Intent in 2026 Q1; Verifiable Intent is a tamper-resistant record of the authorisations a user has given to his or her agent; the FIDO Alliance is using Verifiable Intent as a foundation for security standards in agentic commerce; Crossmint, a leading blockchain infrastructure provider, will integrate Mastercard Agent Pay and Verifiable Intent so that it can enable secure Mastercard transactions for agents; Crossmint’s integrations will be launched initially on OpenClaw; management thinks Mastercard’s network will serve agentic commerce with tokenised credentials; management thinks agentic commerce will bring even more incremental opportunity in transactions and services over time; volumes with Mastercard Agent Pay are still low

On Agentic, the ecosystem continues to evolve. Our payment solutions are ready, and we are engaged, shaping what comes next with key players, including Google, Microsoft, OpenAI, and other partners across the ecosystem. We’re deepening our partnership with OpenAI, reinforcing their use of Mastercard Agent Pay, working to enable agent-to-agent payments and collaborating to embed our services across their solutions while using their tools as an enterprise customer. I’m also happy to share that nearly all Mastercards around the world are now enabled for Mastercard Agent Pay…

…In quarter 1, we launched Verifiable Intent, a tamper-resistant record of what a user authorized when an AI agent acts on their behalf. In fact, the FIDO Alliance is now using it as a foundation for setting security standards in this space. And earlier this month, we announced a partnership with Crossmint, a leading blockchain infrastructure platform. Crossmint will integrate Mastercard Agent Pay and Verifiable Intent to enable secure Mastercard transactions for AI agents in its ecosystem. This will initially launch on the OpenClaw platform with plans to expand…

…But as agent-driven commerce gains traction, our network is there with tokenized credentials, powering the payments, bringing the security, and trust, and reach that everyone is looking for. It’s very clear there is even more incremental opportunity in transactions and in services over time…

…[Question] In Mastercard Agent Pay. Michael, you talked about some of the partners and some of the activity on the ground, but can you just give us a little bit more detail on volumes or any surprises with respect to actual activity or actual demand?

[Answer] In terms of where volumes are, we’re still at early stage. So that is also true because a few things were not quite in place yet.

More than 500 customers are already engaged with Mastercard Threat Intelligence, which was launched in 2025 and is powered by Recorded Future’s capabilities (Recorded Future was acquired by Mastercard in 2024 Q4 and it provides AI-powered solutions for real-time visibility into potential threats related to fraud); Mastercard Threat Intelligence has helped customers take down malicious domains responsible for payment card testing impacting over 10,000 e-commerce sites; Recorded Future puts Mastercard in a unique position to provide insights on threats faced by states

Last year, we launched Mastercard Threat Intelligence, bringing Mastercard and Recorded Future capabilities together. In a short period of time, more than 500 customers are already engaged. Using the product, partners have taken down malicious domains responsible for the payment card test impacting over 10,000 e-commerce sites. That’s tangible value…

…Asymmetrical warfare, state actors, all of that is going on, and Recorded Future puts Mastercard in a very unique position to be a trusted partner to provide those kind of insights.

Mastercard has started to launch Mastercard Agent Suite, where Mastercard will design and deploy AI agents within customer environments; management thinks Agent Suite could be a much bigger opportunity than on the consumer side

You heard us talk about Agent Suite, which we started to launch, where we’re going to get into the business of building agents with our customers in the B2B space, et cetera. So early-stage on B2B earlier than on the consumer side, but I would think this is a much bigger opportunity, and it fits right into our focus on commercial payments. So early-stage ecosystem building, covering your basis, that’s what we’re doing.

Meta Platforms (NASDAQ: META)

Meta’s AI research lab, Meta Superintelligence Labs (MSL), has released the first model, MuseSpark, in its Muse family of models; MSL has built what management thinks is the strongest research team in the industry; MSL is already training even more advanced models than Muse; management thinks MuseSpark has already made Meta AI a world class assistant for users in many areas; management has heard very positive feedback on MuseSpark; management thinks Meta’s product team is now able to build products on top of the company’s models because the models are now strong, unlike in the past; management thinks models in the future will have to be able to improve themselves in order for them to be considered leading models; management is not focused on building coding capabilities with Meta’s AI models; coding is not the only ingredient needed for models to be self-improving

Our biggest milestone so far this year has been the release of our Muse family of models and our first model, MuseSpark, along with a significantly upgraded new version of Meta AI. This was the first release from Meta Super Intelligence Labs, and it shows that our work is on track to build a leading lab. Over the past 10 months, we have built the strongest research team in the industry and established the scientific and technical foundations to scale very advanced models. Spark is just one step on that scaling ladder, and we are already training even more advanced models…

…Spark has already made Meta AI a world-class assistant that leads in several areas related to our vision of personal super intelligence, including visual understanding, health, shopping, social content, local, creating games and more. We’re hearing very positive feedback on it so far…

…We have our product team, and that team is now really unlocked to be able to build things on top of our models because we now have a very strong model. So before this, we have been prototyping a bunch of things using other different models, whether it was our previous older models or kind of using the APIs from other companies. And now we’re unlocked to be able to go build things and get them to scale on top of our own models…

…You’re not going to have leading models in the future if your models can’t improve themselves, right? So you’re getting to a point where today, the models are still able to learn from people — and then I think at some point, the models will have to improve themselves. And that’s how the growth is going to — an improvement in the models is going to happen…

…Does that make us a developer tools company? Not necessarily. I mean, I’m not against having an API or coding tools or anything like that. But it’s not our primary focus. But I actually think people conflate coding with self-improvement more than they should. Coding is one ingredient for the model self improving. It’s not the only thing. And we are focused on all of the parts that are going to be necessary for self-improvement in service of the personal super intelligence vision that we have for people and businesses.

Meta AI has seen large increases in usage since MuseSpark was introduced, with double-digit percent increases in Meta AI sessions per user; the Meta AI app has consistently been near the top in app stores; MuseSpark is now powering Meta AI in chat threads in Facebook, Instagram, WhatsApp, and Messenger, as well as in the standalone Meta AI app

We’ve seen large increases in Meta AI use since releasing the updates, and the Meta AI app has consistently been near the top of the app stores as well…

…We’re seeing encouraging results within Meta AI since we began powering responses with the first model from MSL, MuseSpark. In tests we ran leading up to the launch, we saw meaningful engagement gains that accelerated week-over-week with each new iteration of the model. We’re seeing similar gains within Meta AI following the broad rollout of our new model with double-digit percent increases in Meta AI sessions per user. MuseSpark is now powering Meta AI in direct chat threads across our family of apps as well as the stand-alone Meta AI app and website, giving billions of people globally access to our latest model.

Meta’s management has a very different view on AI from others in the industry; management thinks that AI will help people and improve many aspects of their lives; management wants to build AI agents that empower people and businesses; management thinks there are clear monetisation opportunities for personal superintelligence

My view of AI is very different from many others in the industry. I hear a lot of people out there talk about how AI is going to replace people. Instead, I think that AI is going to amplify people’s ability to do what you want, whether that’s to improve your health, your learning, your relationships, your ability to achieve your personal career goals and more. My view is that human progress has always been driven by people pursuing their individual aspirations. And I believe that this will continue to be true in the future. People will be more important in the future, not less. Meta believes in empowering individuals. And those are the kinds of products that we’re going to build, and I believe that they’re going to be some of the most important and valuable products of all time. We are building a personal agent focused on helping people achieve the diverse goals in their lives. We’re also building a business agent focused on helping entrepreneurs and businesses across the world, use our tools and others to grow their efforts, reach new customers and serve existing customers better. These agents will work together to form an ecosystem…

…The focus is on building personal super intelligence, building a consumer agent that can work for you and help you get things done. That right now is a consumer experience that we’re focused on, but we think there will be clear monetization opportunities over time. You can imagine commission structures or a premium offering.

Meta’s management has been testing business AIs and weekly conversations have 10x-ed (from 1 million to 10 million) since the start of 2026; the Meta AI business assistant was recently fully rolled out to all eligible advertisers on supported Meta buying services and performance has been strong, with common account issues being resolved at a 20% higher rate; the business AIs are being tested with SMBs in Latin America and Indonesia on WhatsApp, and in Asia Pacific on Messenger; management will expand access to the business AIs in 2026 Q2; the business AIs are currently free, but management expects to monetise them over time

We’re already testing an early version of business AIs and weekly conversations have grown 10x since the start of this year…

…The Meta AI business assistant has now been fully rolled out to all eligible advertisers on supported Meta buying services, providing personalized recommendations to advertisers, resolving account issues, and surfacing campaign insights to help optimize results. Performance has been strong since we began testing the assistant in Q4 with common account issues being resolved at a 20% higher rate…

…In Q1, we expanded business AIs on WhatsApp to SMBs across Latin America and Indonesia as well as on Messenger in Asia Pacific. We now have more than 10 million conversations each week being facilitated through business AIs, up from 1 million at the start of the year. We’ll further expand access to more countries this quarter while adding more capabilities to the AIs…

…Business AIs today are currently free for most businesses on our messaging apps. But as we make more progress, we expect that we will also work towards establishing a longer-term monetization model.

Meta’s management is working to incorporate MuseSpark in the company’s upcoming models used in its recommendation systems, core apps, and advertising products; the upcoming models will enable Meta to understand more of people’s goals for the first time in the company’s history; in the last few years, Meta has seen increasing returns from improving user engagement, and this has encouraged management to continue investing heavily in this area

We’re also working on using Spark in our upcoming models to improve our recommendation systems and core business in Facebook, Instagram and ads. Right now, our apps primarily help people accomplish 3 important goals: connecting with people, learning about the world and entertainment. But we’ve always wanted our apps to understand more of people’s goals so we can help improve their lives in all the ways that they want. These new AI models will let us understand this in more detail. So instead of just looking at statistical patterns of what types of people engage with what content, for the first time in Meta’s history, we’re going to be able to develop a first principles understanding of what you care about and what each piece of content in our system is about — that way we can show you more useful things for what you’re trying to accomplish. And we’ll also be able to create personalized content specifically for people to help you achieve your goals as well. Since our recommendation systems are operating at such a large scale, we’ll phase in this new research and technology over time.

But the trend over the last few years seems clear that we are seeing an increasing return on the amount that we can improve engagement for people and value for advertisers. This encourages us to continue investing heavily in what we expect will provide increasing value over the coming years as well.

Meta will be rolling out more than 1 GW (gigawatt) of its own custom chips; Meta’s AI compute infrastructure will include a large amount of its own chips and AMD chips, alongside NVIDIA chips; Meta is investing in more compute, partly through multiyear cloud deals; Meta’s contract commitments increased by $107 billion in 2026 Q1; the multiyear cloud deals support both Meta’s training and inference needs; management has consistently underestimated Meta’s compute needs even as the company has been ramping up compute capacity significantly; management expects compute to be even more central for the business going forward

We are rolling out more than 1 gigawatt of our own custom silicon that we’re developing with Broadcom, as well as a significant amount of AMD chips to complement the new NVIDIA systems that we’re rolling out as well…

…We’re also signing cloud deals that will come online over the course of this year and 2027, allowing us to scale more quickly. These multiyear cloud deals and our infrastructure purchase agreements drove a $107 billion step-up in our contractual commitments this quarter. Our investments will support our training needs for future models and most importantly, provide us the inference capacity necessary to deliver personal and business agents to billions of people around the world, along with several other AI product experiences we’re developing…

…Our experience so far has been that we have continued to underestimate our compute needs even as we have been ramping capacity significantly as the advances in AI have continued and our teams continue to identify compelling new projects and initiatives. In addition, there are very compelling internal use cases. So our expectation is that compute will become even more central to the business going forward.

Meta’s AI glasses continue to perform well, with daily users tripling year-on-year in 2026 Q1; the AI glasses continue to be one of the fastest-growing categories of consumer electronics ever; Meta released new glasses for all-day wear in 2026 Q1; Meta has new partnerships and styles for AI glasses coming later this year; all of Meta’s AI glasses are designed to easily update to Meta’s newest AI models and features; Meta’s AI glasses are evolving into a personal agent product; the sales of Meta’s AI glasses have shifted from the prior generation to the latest generation; management is seeing strong interest in the Meta Ray-Ban Display product that comes with neural bands; management thinks the Meta Ray-Ban Display product will be the next generation for how AI glasses evolve

Our AI glasses continue to perform well with the number of people using them daily tripling year-over-year. This continues to be one of the fastest-growing categories of consumer electronics ever. We released Ray-Ban Meta optics this quarter designed for all day wear rather than primarily as sunglasses. And building on our release of Oakley last year, we have some exciting new partnerships and styles that I think are going to have the potential to reach even more people coming later this year. All of our glasses are designed to easily update to use our newest AI models and features. I’m also really excited to see the glasses evolve from being able to answer questions to being able to be a personal agent that’s with you all day long, helping you remember things and achieve your goals…

…We’re seeing sales shift now from the prior generation of Ray-Ban Metas to the latest generation, which I think speaks to the value of improved features like extended battery life and higher-resolution video capture…

…We see strong interest now in the Meta Ray-Ban displays with the Meta neural bands. So that’s an encouraging sign that there is consumer appetite for display glasses, which is kind of the next generation of how this product evolves.

Ranking improvements made in 2026 Q1 drove a 10% increase in time spent on Instagram Reels, an 8% increase in total video time on Facebook globally, and a 9% increase in video watch time on Facebook in the US and Canada; the ranking improvements are driven by a number of things, including (1) the doubling in the length of user interaction sequences for training on Instagram, (2) increasing the speed of indexing new posts by the ranking models, and (3) applying more advanced content understanding techniques; same-day posts are now more than 30% of recommended reels on Instagram and Facebook, up more than 2x from a year ago; management is now using AI to auto translate and dub videos into a viewer’s local language; more than 500 million users are watching translated videos weekly on each of Facebook and Instagram; management continues to invest in Meta’s recommendation capabilities, and the investments include near term ones such as scaling up models in size and complexity and incorporating LLMs, or large language models, to deepen content understanding, and long-term ones such as building foundation models for organic content and ads recommendations, and LLM-based recommendation systems; management thinks there is still a lot of room to continue improving recommendations on both Facebook and Instagram

We’re continuing to see significant gains from our content recommendation initiatives. On Instagram, the ranking improvements that we made in Q1 drove a 10% lift in Reels time spent. On Facebook, total video time increased more than 8% globally in Q1, the largest quarter-over-quarter gain in 4 years. Within the U.S. and Canada, ranking improvements we made drove a 9% increase in video watch time on Facebook in Q1. 

These gains are benefiting from advances we’re making across the full stack. Starting with data, we doubled the length of user interaction sequences we use for training on Instagram in Q1 and increased the richness of how each user interaction is described, enabling our systems to develop a deeper understanding of user interests. Within our models, we’ve significantly increased the speed with which our ranking models index new posts, which is enabling us to recommend them sooner after they are published. We’re also applying more advanced content understanding techniques, which is enabling us to quickly identify posts that may be interesting to someone even if they haven’t engaged with a lot of similar content. These and other improvements have enabled us to increase the diversity and recency of recommended content with same-day posts now representing more than 30% of recommended reels on both Instagram and Facebook, more than double the levels 1 year ago.
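
The sequence-length change described above can be pictured with a small helper that truncates or pads a user's interaction history to a fixed training length. This is only an illustrative sketch — the function, field names, and padding scheme are hypothetical, not Meta's actual pipeline.

```python
# Illustrative only: build fixed-length interaction sequences for a
# ranking model, keeping the most recent events when history is long.

PAD = {"item_id": 0, "action": "pad"}  # hypothetical padding event

def build_sequence(history, max_len):
    """Truncate to the most recent `max_len` events, or left-pad
    short histories so every training example has the same length."""
    recent = history[-max_len:]                  # keep newest events
    padding = [PAD] * (max_len - len(recent))    # pad short histories
    return padding + recent

history = [{"item_id": i, "action": "view"} for i in range(1, 6)]

# Doubling max_len (e.g. 4 -> 8) lets the model see more of each
# user's history per example, at higher training cost.
short = build_sequence(history, 4)  # only the last 4 events survive
long = build_sequence(history, 8)   # all 5 events plus 3 pad slots
```

Longer sequences cost more compute per training example, which is why doubling them is called out as a deliberate investment rather than a free win.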

We’re also using AI to unlock more inventory by auto translating and dubbing videos into a viewer’s local language, enabling us to recommend a more diverse set of content. Over 0.5 billion users on each of Facebook and Instagram are now watching AI-translated videos weekly.

Looking forward, we’re making several investments we expect will deliver more valuable recommendations. This year, we will continue scaling up our models in several dimensions, including their size and complexity, while incorporating LLMs to deepen content understanding across our platform. This will enable us to better match people to a wider variety of content aligned to their interests. At the same time, we are executing on our longer-term efforts to develop the next generation of our recommendation systems. This includes building foundation models that power organic content and ads recommendations as well as developing LLM-based recommender systems. Our focus this year is validating the model architectures and techniques in these domains before we scale them out in future years…

…There is still a lot of room to continue improving recommendations over the rest of the year, and we expect we’ll be able to do that to drive additional engagement on both Facebook and Instagram.

Meta continues to enhance its systems to show advertising to users at the optimal time and location; improvements made to Lattice and GEM (Generative Ads Model) in 2026 Q1 increased conversion rates for landing page view advertising by more than 6%; management expanded coverage of Meta’s new adaptive ranking model, which was rolled out in 2025 H2, to off-site conversions and this drove a 1.6% increase in conversion rates across Facebook and Instagram’s major surfaces; Meta is introducing Meta Ads AI Connectors in open beta, which allow advertisers to connect their Meta advertising accounts directly to an AI agent; more than 8 million advertisers are now using at least one of Meta’s Gen AI advertising creative tools with very strong adoption among SMB advertisers; advertisers using Meta’s video generation feature are seeing more than 3% higher conversion rates in tests; Meta’s value optimisation suite, which maximises the return on advertising spend for advertisers by prioritising the highest value conversions, has seen strong adoption with the revenue run rate reaching $20 billion in 2026 Q1, more than double from a year ago; Meta’s new adaptive ranking model enables the company to leverage LLM-scale model complexity when it previously couldn’t

We continue to enhance our systems to show ads at the optimal time and location…

…In Q1, enhancements we made to Lattice’s modeling and learning techniques, along with advances in our GEM model architecture, drove a more than 6% increase in conversion rate for landing page view ads. In addition, we’ve been investing in more performant inference models for serving ads. In the second half of last year, we began rolling out our new adaptive ranking model, which is an LLM-scale ads recommender model that we use for inference. This model improves our inference ROI by routing requests to more compute-intensive inference models when it determines there is a higher probability of conversion. In Q1, we expanded coverage of our adaptive ranking model to support off-site conversions, which drove a 1.6% increase in conversion rates across the major surfaces on Facebook and Instagram…

…This week, we’re also introducing Meta ads AI connectors in open beta, providing advertisers the ability to connect their Meta ad account directly to an AI agent. We’ve always supported advertisers both on our platform and through tools like the marketing API. And now we’re extending that to AI. So businesses and agencies can analyze and optimize campaigns with the tools they’re already using.

Usage of our ad creative tools is also scaling with more than 8 million advertisers using at least one of our Gen AI ad creative tools and particularly strong adoption among small- and medium-sized advertisers. These tools are benefiting performance as well with advertisers using our video generation feature seeing more than 3% higher conversion rates in tests…

…We also continue to invest in the value optimization suite, which helps advertisers maximize their return on ad spend by prioritizing the highest value conversions rather than optimizing solely for the most conversions at the lowest cost. Adoption by businesses has been strong following performance improvements we’ve made over the past year with the annual revenue run rate of our value optimization suite now over $20 billion, more than doubling year-over-year…

…The inference models are bound by strict latency requirements since they need to find the right ad within milliseconds, and that has, again, historically prevented us from meaningfully scaling up their size and complexity. But in the second half of last year, we introduced a new adaptive ranking model, which enables us to leverage LLM-scale model complexity of 1 trillion parameters, and we made advances in the model architecture and co-designed the system with the underlying silicon, so it maintains the sub-second speed that is required to serve ads at scale. We also developed an approach that intelligently routes requests to more compute-intensive inference models if it determines that there is a higher probability of conversion, and that lets us drive both better performance and increased inference ROI.
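
The adaptive routing idea described above — spend the expensive inference budget only on requests that look likely to convert — can be sketched in a few lines. Everything here is hypothetical (function names, threshold, toy models); it illustrates the routing pattern, not Meta's actual system.

```python
# Illustrative sketch of compute-adaptive ad ranking: a cheap model
# screens every request; only promising ones reach the heavy model.

def route_request(request, light_model, heavy_model, threshold=0.2):
    """Score with a fast, rough model first; escalate to the
    compute-intensive model only when conversion looks likely."""
    p = light_model(request)          # cheap conversion estimate
    if p >= threshold:
        return heavy_model(request)   # expensive, more accurate score
    return p                          # cheap estimate suffices

# Toy stand-ins for real ranking models.
def light(request):
    return request["signal"]

def heavy(request):
    return min(1.0, request["signal"] * 2)

high = route_request({"signal": 0.5}, light, heavy)  # takes heavy path
low = route_request({"signal": 0.1}, light, heavy)   # stops at light model
```

The design trade-off is the threshold: set it too low and the heavy model runs on nearly everything, erasing the latency and cost savings; too high and promising requests never get the more accurate score.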

Microsoft (NASDAQ: MSFT)

Microsoft’s management has 2 priorities to capture the AI opportunity, namely, (1) build the leading cloud and AI infrastructure, and (2) build high-value agentic systems across core domains

We are at the beginning of one of the most consequential platform shifts that will change the entire tech stack as agents proliferate and become the dominant workload. This will drive TAM expansion and change the value creation equation across the entire economy. To capture this opportunity, we are executing against 2 priorities. First, we are building the world’s leading cloud and AI infrastructure for the agentic computing era. Second, we are building high-value agentic systems across core domains such as productivity, coding and security.

Microsoft’s management is optimising every layer of its technology stack and this is producing operational gains; Microsoft’s dock-to-live times for new GPUs have fallen by nearly 20% since the start of 2026; Microsoft has delivered a 40% improvement in inference throughput for Copilot’s most-used models

We’re optimizing every layer of the tech stack, from DC design to silicon to system software to the model architecture as well as its optimization. This is translating into operational gains. We have reduced dock-to-live times for new GPUs in our biggest regions by nearly 20% since the beginning of the year. Our Fairwater data center in Wisconsin came online earlier this month, 6 weeks ahead of schedule, allowing us to recognize revenue earlier. And we delivered a 40% improvement in inference throughput for our most used models across Copilot, driven by our software and hardware optimization work.

Microsoft added 1 gigawatt of GPU compute capacity in 2026 Q1 (FY2026 Q3); Microsoft is on track to double its overall compute footprint in 2 years; management announced new data center investments across 4 continents in 2026 Q1 (FY2026 Q3)  

All up, we added another gigawatt of capacity this quarter and remain on track to double our overall footprint in just 2 years. We are moving aggressively to add capacity aligned to our demand signals we see and we have announced new data center investments across 4 continents.

Microsoft’s AI infrastructure utilises chips from NVIDIA, AMD, and itself (Maia); Microsoft’s Maia 200 chip has over 30% better tokens per dollar compared to the latest silicon in Microsoft’s fleet, and is now live in 2 Microsoft data centers; Microsoft’s Cobalt server CPUs are deployed in nearly half of the company’s data center regions; as Microsoft’s customers scale their AI workloads, they are increasingly using other Microsoft cloud services and are choosing Cobalt to run these services; management is expanding Cobalt’s supply significantly to meet demand

We also continue to modernize our fleet with our first-party innovation alongside the latest from NVIDIA and AMD. Across our fleet, millions of servers are powered by our custom networking security and virtualization silicon, including Azure Boost as well as our first-party CPUs and accelerators. Our Maia 200 AI accelerator, which offers over 30% improved tokens per dollar compared to the latest silicon in our fleet, is now live in our Iowa and Arizona data centers. Our Cobalt server CPU is deployed in nearly half of our DC regions running workloads at scale for customers like Databricks, Siemens and Snowflake. As our largest customers scale their AI deployments, they’re increasingly leveraging other services across our platform and choosing to run those workloads on Cobalt. And we are expanding Cobalt supply significantly to meet this demand.

Microsoft’s management thinks Microsoft offers the broadest selection of models among the cloud hyperscalers; over 10,000 customers have used more than 1 model on Foundry; the number of customers who used Anthropic and OpenAI models doubled sequentially in 2026 Q1, or FY2026 Q3 (was 1,500 in 2025 Q4, or FY2026 Q2); Bayer is using multiple models in Foundry to build its in-house agent platform; over 300 Microsoft customers are on track to process 1 trillion tokens each on Foundry in 2026, up 30% sequentially 

We offer the broadest selection of models of any hyperscaler, so customers can choose the right model for the right workload across OpenAI, Anthropic, open source and more. Over 10,000 customers have used more than one model on Foundry. 5,000 have used open source models, and the number who have used Anthropic and OpenAI models increased 2x quarter-over-quarter…

…Bayer is using multiple models in Foundry to create its own in-house agent platform with more than 20,000 active monthly users. All up, over 300 customers are on track to process over 1 trillion tokens on Foundry this year, accelerating 30% quarter-over-quarter.

Microsoft’s management is building a unified IQ layer for organisational intelligence; the IQ layer initiative is driving acceleration in Microsoft’s data businesses, with Cosmos DB revenue up 50% year-on-year in 2026 Q1 (FY2026 Q3), Fabric customers growing 60% year-on-year to 35,000, and Fabric OneLake data up 4x year-on-year; over 15,000 customers now use both Fabric and Foundry, up 60% year-on-year; Fabric provides agents with operational, analytical, and unstructured data; Microsoft’s Copilot Studio is helping enterprises build agents; nearly 90% of the Fortune 500 have active agents built with Copilot Studio’s low-code and no-code tools; Copilot’s credit consumptive offer is up nearly 2x sequentially in 2026 Q1 (FY2026 Q3); Agent 365 is a control plane for managing agents’ governance, identity, and security; tens of thousands of companies are already using Agent 365 to manage tens of millions of agents

Across Fabric, Foundry, Microsoft 365 and our Security Graph, we are building a unified IQ layer for organizational intelligence. Thousands of enterprises already are accessing context across these IQ layers. And as AI usage grows, so does the context layer, creating a flywheel that continuously improves the grounding, relevance and effectiveness of every agent they use and build, making our IQ layers an unmatched context engine for organizational intelligence. More broadly, our database business accelerated quarter-over-quarter. Cosmos DB alone saw 50% year-over-year revenue growth driven by AI app workloads. We now have 35,000 paid Fabric customers, up 60% year-over-year. And all up, the amount of data in Fabric OneLake data lake increased nearly 4x year-over-year. Over 15,000 customers now use both Foundry and Fabric, up 60% year-over-year as enterprises connect agents to real-time operational, analytical and unstructured data that Fabric brings together…

…We are also helping knowledge workers build agents with tools like Copilot Studio. Nearly 90% of the Fortune 500 now have active agents built with our low-code/no-code tools. And we are seeing fast growth of our Copilot credit consumptive offer, up nearly 2x quarter-over-quarter as customers increasingly extend Copilot with custom agents tailored to their workflows…

…With Agent 365, we offer a control plane that extends a company’s existing governance, identity, security and management frameworks to agents. Tens of thousands of companies are already managing tens of millions of agents in Agent 365, and we expect this momentum to grow significantly as agents will increasingly need tools for identity, governance, security and more.

Microsoft’s management is turning its family of Copilots from synchronous assistance software to asynchronous digital workers; Microsoft 365 Copilot seat adds grew 250% year-on-year in 2026 Q1 (FY2026 Q3), the fastest growth since launch; there are now over 20 million Microsoft 365 Copilot paid seats; the number of companies with over 50,000 Microsoft 365 Copilot seats grew 4x year-on-year in 2026 Q1 (FY2026 Q3); WorkIQ grounds Copilot’s responses with an organisation’s full context; the data residing in WorkIQ now spans 17 exabytes, up 35% year-on-year; users can now access multiple models together in Microsoft 365 Copilot to generate the best responses; monthly active usage of Microsoft’s 1st-party agents in Microsoft 365 Copilot is up 6x year-to-date; Copilot queries per user was up 20% sequentially in 2026 Q1 (FY2026 Q3); weekly engagement of Microsoft 365 Copilot is now on par with Outlook

We are evolving our family of Copilots from synchronous assistance to async coworkers that can execute long-running tasks across key domains. In knowledge work, it was another record quarter for Microsoft 365 Copilot seat adds, which increased 250% year-over-year, representing our fastest growth since launch. Quarter-over-quarter, we continue to see acceleration and now have over 20 million Microsoft 365 Copilot paid seats. The number of customers with over 50,000 seats quadrupled year-over-year and Accenture now has over 740,000 seats, our largest Copilot win to date. And Bayer, Johnson & Johnson, Mercedes and Roche all committed to 90,000 or more seats…

…Work IQ grounds Copilot responses in the full context of an organization, including people, roles, documents and communications, all within the company’s security boundary. The system of work behind Work IQ alone now spans more than 17 exabytes of data growing 35% year-over-year. The liquidity and freshness of that data matters, with billions of e-mails, documents, chats, hundreds of millions of Teams meetings, and millions of SharePoint sites added each day. And that context is getting even richer as Copilot adoption grows, Copilot and Agent conversations and artifacts they create feedback into Work IQ, making it even more context-rich…

…In Microsoft 365 Copilot, you now have access in chat to multiple models by default with intelligent auto routing, in Agents with Critique and Council. You can use multiple models together to generate optimal responses. As of last week, Agent Mode is now default experience across Copilot in Word, Excel and PowerPoint. And with Cowork, you now have a new way to delegate and complete work using Copilot.

All this innovation is driving record usage intensity across Copilot. We have seen a surge in usage of our first-party agents with monthly active usage up 6x year-to-date. Copilot queries per user were up nearly 20% quarter-over-quarter. To put this momentum in perspective, weekly engagement is now at the same level as Outlook, as more and more users make Copilot a habit.

Microsoft’s management is observing a shift in pricing in business software from seat-based models to seat-plus-consumption models because of AI; nearly 60% of Microsoft’s service customers are already buying usage-based credits; HSBC is using pre-built agents to reduce issue resolution time for customer inquiries by 30%; LinkedIn Talent Solutions’ agentic products now have an annualised revenue run rate of more than $450 million; management thinks the pricing model for business software could evolve further to bring business outcomes into the equation

When it comes to biz apps, we are seeing a new pattern emerge as customers shift from traditional seat model to seats plus consumption. The customer service category is at the forefront of this transformation as nearly 60% of our service customers are already purchasing usage-based credits. For example, HSBC uses prebuilt agents with Dynamics 365 to manage customer inquiries across products, markets, and regulatory requirements, reducing issue resolution time by over 30%. And our agentic products in LinkedIn Talent Solutions, which help hirers automate time-consuming tasks like sourcing, screening and drafting messages have already surpassed a $450 million annualized revenue run rate…

…From a customer perspective, they’re going to evaluate it by evals — where are they seeing the value of tokens, as simple as that. So where they see the outcome, the eval and the token, whether it’s improving revenue or improving efficiency, that’s what will refine it. Like when we talk about IT budgets, IT budgets are going to have to be reshaped by a combination of business outcomes making their way into IT budgets and maybe reallocation from other line items on the income statement like OpEx.

GitHub is growing rapidly, driven by agentic coding; nearly 140,000 organisations are using GitHub Copilot; GitHub Copilot enterprise subscribers nearly tripled year-on-year in 2026 Q1 (FY2026 Q3); most users in GitHub Copilot use multiple models; usage of GitHub Copilot CLI (command line interface) nearly doubled month-on-month; management has shifted GitHub Copilot to a usage-based pricing model

GitHub itself is seeing unprecedented growth driven by proliferation of agentic coding, and we are hard at work to scale and meet this demand. We see this even with GitHub Copilot. Nearly 140,000 organizations now use GitHub Copilot and enterprise subscribers have nearly tripled year-over-year. The majority of users leverage multiple models. We’re also seeing rapid adoption of GitHub Copilot CLI with usage nearly doubling month-over-month. And earlier this week, we announced our move to a usage-based pricing model for GitHub Copilot as we align pricing to actual usage and cost.

1/3 of Microsoft’s cloud and AI-related capex in 2026 Q1 (FY2026 Q3) is for long-lived assets that will support monetisation over the next 15 years and more, while the other 2/3 is for CPUs and GPUs; Azure is still capacity-constrained, and management wants to balance Azure’s demand for compute with first-party demand for compute; Azure’s capacity constraint is expected to last through at least 2026

Capital expenditures were $31.9 billion, down sequentially due to the normal variability from cloud infrastructure buildouts and the timing of delivery of finance leases. And this quarter, roughly 2/3 of our CapEx was for short-lived assets, primarily GPUs and CPUs. The remaining spend was for long-lived assets that will support monetization over the next 15 years and beyond. This quarter, total finance leases were $4.7 billion and were primarily for large data center sites. And cash paid for PP&E was $30.9 billion, roughly in line with capital expenditures as the impact from finance leases was partially offset by differences between the receipt of goods and payment…

…In Azure and other Cloud Services, revenue grew 40% and 39% in constant currency against a prior year that included accelerating growth. Results were ahead of expectations as we delivered capacity earlier in the quarter, enabling increased consumption across both AI and non-AI services. Strong customer demand across workloads, customer segments and geographic regions continues to exceed available capacity…

…Broad and growing customer demand continues to exceed supply, and we continue to balance the incoming supply we can allocate here against our other high ROI priorities, first-party applications, R&D and end-of-life server replacement…

…Even with these additional investments and continued efforts to bring GPU, CPU and storage capacity online faster, we expect to remain constrained at least through 2026.
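The 1/3 vs 2/3 capex split quoted above implies rough dollar amounts that can be sketched with quick arithmetic (an illustration only; the split and the $31.9 billion total are from the quote above):

```python
# Rough dollar split of Microsoft's quarterly capex, per the 1/3 vs 2/3
# breakdown quoted above (figures in US$ billions).
total_capex = 31.9
short_lived = total_capex * 2 / 3   # GPUs and CPUs
long_lived = total_capex * 1 / 3    # assets monetised over 15+ years

print(f"Short-lived (GPUs/CPUs): ~${short_lived:.1f}B")  # ~$21.3B
print(f"Long-lived (15+ years):  ~${long_lived:.1f}B")   # ~$10.6B
```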

Azure grew revenue by 40% in 2026 Q1 (FY2026 Q3) (was 39% in 2025 Q4); Azure’s revenue growth was better than expected because capacity was delivered earlier in the quarter; Azure continues to be constrained by capacity and the constraint is expected to last through at least 2026; management wants to balance Azure’s demand for compute with first-party demand for compute; as Microsoft’s customers scale their AI workloads, they are increasingly using other Microsoft cloud services; Azure’s AI business has better margins than the non-AI business had when it was at a similar age

In Azure and other Cloud Services, revenue grew 40% and 39% in constant currency against a prior year that included accelerating growth. Results were ahead of expectations as we delivered capacity earlier in the quarter, enabling increased consumption across both AI and non-AI services. Strong customer demand across workloads, customer segments and geographic regions continues to exceed available capacity…

…Broad and growing customer demand continues to exceed supply, and we continue to balance the incoming supply we can allocate here against our other high ROI priorities, first-party applications, R&D and end-of-life server replacement. As a reminder, year-over-year Azure growth rates can vary quarter-to-quarter based on capacity, timing and contract mix…

…Even with these additional investments and continued efforts to bring GPU, CPU and storage capacity online faster, we expect to remain constrained at least through 2026…

…As our largest customers scale their AI deployments, they’re increasingly leveraging other services across our platform and choosing to run those workloads on Cobalt…

…We’ve been talking about sort of where this AI business of ours has been in the cycle compared to even the cycle we saw with the cloud, which now seems very long ago. And how margins were actually better and they remained better in our AI business versus where we saw in the cloud transition, looking back.

Microsoft’s management has gained more confidence over the past 1-2 years that the economics of AI’s addressable market lie in areas where the company has structurally strong positions

One of the things that we have learned even in the last, whatever, 2 years or so in AI and also build more conviction and confidence on is where is the TAM and the category economics of the TAM. And so this, I mean, it’s fascinating that here we are in 2026 and the most exciting things are plug-ins in Word or Excel or CLIs in coding or — and so when you see that, that means we have a structural position in knowledge work, coding, security, which are the big TAMs.

Microsoft’s management continues to feel good about partnering with OpenAI after the recent change to the 2 companies’ agreement; Microsoft has full IP rights to OpenAI’s frontier models all the way to 2032; OpenAI remains a large customer of Microsoft

We feel good about our partnership with OpenAI. I’m always very, very focused on any partnership and ensuring that there’s a win-win construct at all times. I mean that’s how you can remain with partners. In this case, it starts with, quite frankly, IP, Amy referenced this. We have a frontier model, royalty-free with all the IP rights that we will have access to all the way to ’32, and we fully plan to exploit it…

…They’re a large customer of ours, not just on the AI accelerator side, but also on all the other compute side, and so we want to serve them well.

Netflix (NASDAQ: NFLX)

Netflix has been using generative AI to improve content recommendations for members; management is also leveraging generative AI to provide better tools for filmmakers; Netflix acquired InterPositive, a company providing AI-powered filmmaking tools, in March 2026; management thinks Netflix has significant and unique data for applying AI; management thinks even with AI tools, only great artists can make great art; Netflix’s content creation partners have been leveraging AI tools for many purposes, and these tools also help improve on-set safety; InterPositive contains proprietary technology created specifically for filmmakers and for filmmaking, so it’s different compared to other generative AI video apps; management is already seeing momentum around adopting InterPositive’s tools among Netflix’s content creation partners; management has been working on content recommendation and personalisation for many years, but they think generative AI provides plenty of opportunity for Netflix to continue improving in those areas; management thinks AI can be applied in Netflix’s advertising suite to make it easier to create new formats, customise ads, and improve contextual relevance 

We’ve been using machine learning and AI for many years, and as the technology advances with GenAI, we continue to find new opportunities to deliver an even more seamless experience for members and expand possibilities for storytellers. This includes using GenAI to improve recommendations for members through deeper content understanding so we can recommend the right title at the right moment, test conversational discovery experiences, and improve the breadth and quality of our promotional assets. Leveraging GenAI, we are enabling our creative partners with more and better tools to help them tell their stories, with the potential to make our single largest area of spend—content—even more impactful. To accelerate this opportunity, in March we announced our acquisition of InterPositive, the filmmaking technology company founded by Ben Affleck that develops AI‑powered tools built by and for filmmakers…

…Given our technology DNA, we have significant and unique data assets here. We have tremendous scale. So we see that as all great opportunities to leverage new technical capabilities across every aspect of the business. So I think AI is going to deliver benefits for our members, for creators and for our employees…

…It takes a great artist to make great art and AI won’t change that. But AI will give those artists better tools to bring those visions to life in ways that we’re just scratching the surface on. So today, our talent leverages these tools for things like set references, pre visualization, visual effects, sequence prep, shot planning. All of these things, by the way, also improve on-set safety, which is something that’s not talked about enough…

…With our acquisition of InterPositive, we think it accelerates our GenAI capabilities because it’s a proprietary technology that was created specifically for filmmakers and specifically for filmmaking and that’s different than other GenAI video applications. So while our ownership of InterPositive is very new, we have generated a bunch of interest with our creators who spent time with the tools, and we’re seeing real momentum build around adoption…

…We’ve been in personalization and recommendation for 2 decades, but we still see tremendous room and opportunity to make it even better by leveraging some of these newer technologies. We see that recommendation systems based on these new model architectures, not only improve the current personalization, but it also allows us to iterate and improve more quickly to improve that velocity. Things like adding support for different content types going forward, that’s much more quick, much more efficient…

…We really see an opportunity to leverage AI within our Netflix ad suite. Makes it easier to design new creative formats, custom ads, improved — that improve contextual relevance. And the technology stack just allows us to roll them out more quickly, more effectively and allow partners to leverage those things in an easier manner.

Taiwan Semiconductor Manufacturing Company (NYSE: TSM)

TSMC’s capital expenditure is always in anticipation of growth in future years; management expects capex for 2026 to be near the high end of its previous guidance of US$52 billion to US$56 billion (growth at the high end would be ~37% from 2025’s capex of US$41 billion); management now expects TSMC to grow revenue by more than 30% in USD terms in 2026 (previous guidance was for growth to be nearly 30%); TSMC’s capex in the last 3 years was ~US$100 billion, and capex in the next 3 years is expected to be much higher, although management does not expect a sudden surge in capital intensity; management thinks the AI accelerators business will have a 2024-2029 CAGR towards the high end of the previously released growth forecast of a mid-to-high-50% CAGR

At TSMC, a higher level of capital expenditures is always correlated with higher growth opportunities in the following years…

…We now expect our 2026 capital budget to be towards the high end of our range of between USD 52 billion and USD 56 billion, as we continue to invest heavily to support our customers’ growth…

…We maintain strong confidence for our full year 2026 revenue to now grow by above 30% in U.S. dollar terms…

…In the past 3 years, our total CapEx was $101 billion. This year, we’re already seeing it towards the high end, which is $56 billion, which is already over 50% of the past 3 years in total. So we have a strong conviction in the AI megatrend. So we expect the CapEx in the next few years, in the next 3 years will be significantly higher than the past few years…

…Now therefore, we do not expect in the next several years, a sudden surge in capital intensity…

…But again, let me say that it is toward the higher 50s of the CAGR that we observe.
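The capex growth figures in the summary above can be sanity-checked with quick arithmetic (a sketch; the 37% figure assumes the high end of guidance, and the dollar amounts are from the quotes above):

```python
# Sanity-check TSMC's capex growth figures (figures in US$ billions).
capex_2025 = 41.0           # 2025 capex
capex_2026_high = 56.0      # high end of 2026 guidance
past_3_years_total = 101.0  # total capex over the past 3 years

growth = capex_2026_high / capex_2025 - 1
share = capex_2026_high / past_3_years_total

print(f"2026 capex growth at the high end: {growth:.0%}")   # ~37%
print(f"2026 capex vs past 3 years' total: {share:.0%}")    # ~55%, i.e. over 50%
```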

TSMC has been sourcing helium (an element whose supply has been affected by the conflict in the Middle East) from different regions, and it has safety stock in hand; TSMC has been working with Taiwan’s government to secure power, and Taiwan has sufficient LNG supply through at least May; management does not expect any near-term impact to TSMC’s operations from the Middle East conflict in terms of materials and power supply

About the materials and energy supply update given the recent situation in the Middle East. TSMC operates a well-established enterprise risk management system to identify and assess all relevant risks and proactively implement risk mitigation strategies. In terms of material supply, TSMC’s strategy is to continuously develop multi-source supply solutions to build a well-diversified global supplier base and to improve the local supply chain. For specialty chemicals and gases, including helium and hydrogen, we source from multiple suppliers in different regions and we have prepared safety stock inventory on hand. We are also working closely with our suppliers to further strengthen the resiliency and sustainability of our supply chain. Thus, we do not expect any near-term impact on our operations for material supply.

In terms of energy, TSMC worked closely with Taipower and the Taiwan government to ensure a stable and sufficient energy supply. With the recent situation in the Middle East, the Taiwan government has announced it has secured sufficient LNG supply through at least May. The government has also said it is actively working on securing further LNG supply, diversifying sourcing to other regions and other power backup plans. Therefore, we do not expect any near-term disruption or impact to our operations.

TSMC’s management sees very robust AI-related demand, as the shift from generative AI and queries (chatbots) to agentic AI is leading to a step-up in token consumption; management is seeing very strong signals and positive outlooks from TSMC’s customers’ customers, who are the cloud service providers; management’s conviction in the AI megatrend remains high

AI-related demand continues to be extremely robust. The shift from generative AI and the query mode to agentic AI and command and action mode is leading to another step-up in the amount of token being consumed. This is driving the need for more and more computation, which supports the robust demand for leading edge silicon. Our customers and customers of customers, who are mainly the cloud service providers, continue to provide us with a very strong signal and positive outlook. Thus, our conviction in the multiyear AI megatrend remains high, and we believe the demand for semiconductors will continue to be very fundamental.

TSMC’s management intends to ramp up new technology nodes in Taiwan because of the need for tight integration between production and R&D; TSMC’s N2 node entered high-volume manufacturing in 2025 Q4 in Taiwan with good yield; N2’s ramp is supported by strong demand from both smartphone and HPC AI applications; management believes that N2, N2P, and A16 will lead to the N2 family becoming another large and long-lasting node for TSMC; management has decided to add capacity for N3 even though TSMC has historically not added capacity to a node once it has reached its target capacity, because of the strong demand for N3 in AI applications; management is seeing robust multiyear demand for N3 nodes from end markets such as smartphone, HPC AI, and more; TSMC is adding a new N3 fab to its giga fab cluster in Tainan, with volume production expected in 2027 H1; TSMC is continuing to convert N5 tools to support N3 capacity in Taiwan; management is focusing on flexible capacity support among the N7, N5, and N3 nodes; the upcoming A14 node has a 10-15% speed improvement at the same power or a 25-30% power improvement at the same speed, and a nearly 20% chip density gain; the A14 node is on track and progressing well; management is seeing a high level of customer interest and engagement for A14; volume production of A14 is expected for 2028

Our practice is to prioritize the land in Taiwan to support the fast ramp of our new node due to the need for tight integration with R&D operations. Today, our new node, N2, has already entered high-volume manufacturing in the fourth quarter of 2025 with good yield. N2 is ramping successfully in multiple phases at both our Hsinchu and Kaohsiung sites, supported by strong demand from both smartphone and HPC AI applications. With our strategy of continuous enhancement such as N2P and A16, we expect our N2 family to be another large and long-lasting node for TSMC.

Historically, we do not add additional capacity to a node once it reached its targeted capacity. However, as a foundry, our first responsibility is to provide our customers with the most advanced technologies and necessary capacity to unleash their innovations. Based on our assessment, to meet the strong demand in AI application, we are stepping up our CapEx investment to increase our N3 capacity. Thus, we are now executing a global capacity plan to support the robust multiyear pipeline of demand for 3-nanometer technologies, which are used by smartphone, HPC AI, including HBM base die, automotive and IoT customers.

In Taiwan, we are adding a new 3-nanometer fab to our giga fab cluster in Tainan. Volume production is scheduled for the first half of 2027…

…In addition to all the new fabs, we continue to convert 5-nanometer tools to support 3-nanometer capacity in Taiwan…

…We are also focusing on capacity optimization across nodes, which includes flexible capacity support among the N7, N5 and N3 nodes…

…Featuring our second-generation transistor structure, A14 delivers another full-node stride from N2, with performance and power benefits to address the insatiable need for high performance and energy efficient computing. Compared with N2, A14 will provide a 10% to 15% speed improvement at the same power or a 25% to 30% power improvement at the same speed, and close to 20% chip density gain. Our A14 technology development is on track and progressing well. We are observing a high level of customer interest and engagement from both smartphone and HPC applications. Volume production is scheduled for 2028. Our A14 technology and its derivatives will further extend our technology leadership position and enable TSMC to capture the growth opportunities well into the future.

TSMC’s 2nd Arizona fab will utilise N3 technologies; the N3 nodes in the 2nd Arizona fab will begin volume production in 2027 H2; management has gained a lot of experience in Arizona, and expects to improve the cost structure of the Arizona fabs

In Arizona, our second fab will also utilize 3-nanometer technologies. Construction is already complete and volume production will begin in the second half of 2027…

…We already gained a lot of experience in Arizona. And so now we have much more confidence than last year that we can make good progress, moving aggressively forward, and we expect we can improve the cost structure, of course.

TSMC’s management now plans to utilise N3 technology in the company’s 2nd fab in Japan; volume production is scheduled for 2028

In Japan, we now plan to utilize 3-nanometer technology in our second fab and volume production is scheduled in 2028.

TSMC’s management is open to including CPUs in its HPC (high-performance computing) AI calculation, but will not do it right now, because TSMC is not able to tell where the CPUs it manufactures go

[Question] TSMC’s definition of AI revenue includes GPU, AI accelerator, HBM base die and maybe a few others, but it specifically excludes data center CPU; I think you’ve made that definition very clear for a couple of years now. But with the CPU, there’s more and more conversation about CPU now becoming part of the AI infrastructure, especially for agentic workflows. Any chance for TSMC to maybe provide us revised numbers for AI revenue and maybe the AI revenue growth projection going to 2029, 2030, and maybe hopefully give us some sense of what the historical AI revenue numbers would have been if some of the data center CPU numbers, especially for agentic AI workloads, are included there.

[Answer] Certainly, CPUs become more and more important in today’s AI data center. But actually, let me share with you, this is a good question, by the way. Let me share with you that we are not able to identify which CPU goes to where, right? Is it a PC, or a desktop, or an AI data center? So today, we still do not include the CPUs in our AI HPC calculation. Someday later, we might consider it.

TSMC is working with NVIDIA for its next-generation LPU (language processing unit); the LPU comes from NVIDIA’s recent acqui-hire deal with Groq; Groq’s LPUs have historically been manufactured by Samsung

[Question] NVIDIA, of course, recently added more CPU content to the overall platform, but I think most people are focusing on that brand-new LPU they recently added. We understand and appreciate that TSMC is in a very strong position and will definitely participate in that upside in CPU. But the LPU business, the acquired business, for historical reasons is still at your competitor Samsung Foundry. And I think investors are looking at that and thinking that maybe Samsung Foundry finally made its first inroads into AI. So any thoughts from TSMC’s side on how we should think about whether and how TSMC will win back that LPU business, or any future business coming from your customers?

[Answer] We are working with our customers for their next-generation LPU anyway. And we are very confident in our technology position, and we will work hard to capture every piece of business possible.

Tesla (NASDAQ: TSLA)

Tesla’s management is going to increase the company’s capital expenditure significantly, partly for AI-related investments; the increase in capital expenditure will last for a few years; management expects Tesla’s capex to be over $25 billion in 2026, causing the company to have negative free cash flow for the rest of the year

We’re going to be substantially increasing our investments in the future so you should expect to see significant — a very significant increase in capital expenditures, but I think well justified for a substantially increased future revenue stream…

…We’re investing in and improving our core technologies, battery powertrain, AI software, AI training, chip design, manufacturing — laying the groundwork for significantly increased manufacturing and production. We are also strengthening our supply chain across the board, batteries, energy, AI, silicon, everything, and laying the groundwork, like I said, for what we expect to be a significant increase in vehicle production in the future and, of course, a very significant increase — well, actually releasing Optimus…

…We are in a very big capital investment phase, which is going to start now and would last a couple of years. So based on that, our current expectation for 2025 — 2026 is over $25 billion of CapEx. And just to remind you, we are paying for 6 factories which were going to go into operation. Some have already started, some would go into operation later part of this year. We’re further increasing our investment in AI-related initiatives, including the AI infrastructure to support Robotaxi and the launch of Optimus. We’ve already started placing orders for the research semiconductor fab in Austin and for solar manufacturing equipment. While this may seem a lot and will have the impact of negative free cash flow for the rest of the year, we believe this is the right strategy to position the company for the next era.

Tesla’s management thinks Optimus can be useful outside of Tesla sometime in 2027; management continues to think Optimus will be the biggest ever product made; Tesla is preparing its Fremont factory for production of Optimus later this year; the production S-curve of Optimus will be very slow at the start, before ramping significantly in 2027; Tesla is building a 2nd Optimus factory, with production scheduled for mid-2027; v3 of Optimus (Optimus 3) is almost ready to be demonstrated, but management is hesitant because they have found competitors trying to copy Optimus’s design (in the 2025 Q4 call, management said Optimus 3 would be ready in a few months); management thinks Optimus can start production in July/August 2026, but it will take tremendous work to get there; management does not know what the production rate for Optimus will be in 2026; the production rate for Optimus will be limited by the slowest part in the entire Optimus supply chain; management wants to place a lot of intelligence locally in Optimus in the event that the robot loses wireless data; management thinks Optimus would need an orchestrator-AI and a voice AI, both of which can be Grok (a foundation model from one of Elon Musk’s companies, xAI)

But increasing our internal production for testing and then probably being able to have Optimus be useful outside of Tesla sometime next year. As you’ve heard me say a few times, I think, Optimus will be our biggest product — not just Tesla’s biggest product ever, but probably the biggest product ever. And I remain convinced of that conclusion…

…We’re preparing Fremont for start of production later this year with Optimus. Again, totally new supply chain, totally new technology. So therefore, the production S-curve is always very slow in the beginning, but it will ramp up to significant numbers next year. And we’re constructing a second Optimus factory in — at our Giga Texas location. And that will probably start production around summer next year.

The V3 Optimus design is almost ready to demonstrate. I think we want to just make sure it’s like polished. Like it works functionally, but there’s some aesthetic elements that need to be finalized. And I think probably middle of this year, we should be able to show it off. We’re also a little hesitant to show V3 off because we find our competitors do a frame-by-frame analysis whenever we release something and copy everything they possibly can. So I think there’s some value to not showing new technology until it’s close to production…

…We want to push the Optimus 3 unveil maybe closer to production. Start of production is — we’re assuming is somewhere around the late July, August time frame…

…The last S, X production will be in early May. But you have to look at the entire upstream portion of the production line. So you have to start with cells, battery packs, motor production, all the parts production. And so we’ve been dismantling the S, X production line from the more base-level parts — more basic level parts to — as you get to more larger subassemblies, you start dismantling the line from the small parts first, not from the final assembly first. So the final assembly line will — that will be dismantled next month and after the last of the S, X vehicles is done. You can’t dismantle some gigantic production line like overnight. It takes at least a few months to do so. And then you’ve got to install a new production line, and you’ve got to provide all of the wiring and communication, test out the machines of the new production line for Optimus. So that also takes several months. So frankly, if we’re able to go from stopping production on one line, dismantling that entire line, reinstalling a whole new line and turning that on in a matter of 4 months, that is an insanely fast speed. I don’t think any other company on earth has ever done that before…

…I don’t know what the production rate of Optimus will be this year. It is impossible to predict these things…

…when you have a brand-new product in an entirely new production line and you have 10,000 unique items, all of which have to go right into ramp production, it will move as fast as the least lucky, slowest, dumbest part in the entire 10,000. And this is a — Optimus is a completely new product with completely new production line. So it’s just literally impossible to predict, except that I think it will be quite slow at first as we iron out the 10,000-plus unique items that have to be solved for Optimus to reach volume production…

…We think we can put a lot of intelligence locally in the robot, and it certainly needs to be enough intelligence that if the robot gets disconnected, like if it’s a bad cellular signal or there isn’t WiFi, Optimus can’t just get stuck. It needs to have enough local intelligence that it can still do useful things even if it loses connection, kind of like a car…

…You can think of like Optimus needs kind of a manager to tell it what to do, broadly speaking, like if — otherwise it’s going to keep doing the same thing it did before. So I think you need kind of an orchestration AI, which Grok would be good for orchestration. And then for Optimus’ voice, having a low-latency intelligent voice AI, Grok is actually very good for that. So if you want to talk to Optimus and have kind of a Grok-level conversation, you kind of need to connect to a Grok-level AI for that.

All Tesla cars are autonomy-ready; supervised full self-driving is getting really good; v14.3 (version 14.3) of FSD was a major architectural update; management has a pipeline of improvements for FSD that they think will lead to unsupervised full self-driving being available globally; v15 of FSD is coming by end-2026 or early-2027; v15 of FSD will be a complete software architecture overhaul; v15 of FSD will run on Tesla’s AI4 chip; management thinks v15 of FSD will increase the safety level of FSD to way above human level; FSD now has 1.3 million paid customers globally (1.1 million in 2025 Q4); most of the growth in FSD customers in 2026 Q1 came from subscriptions, as management removed the upfront-purchase option in some markets during the quarter; FSD recently received approvals in the Netherlands; management is looking for EU-wide approval for FSD in 2026 Q2; FSD has received some approvals in China, although broader approval has yet to arrive; management hopes FSD can be fully approved in China by 2026 Q3; management has changed Tesla’s sales strategy to emphasise FSD as the product; management hopes to have unsupervised FSD in a dozen states by end-2026; management thinks unsupervised FSD revenue will not be material in 2026 but will be material in 2027; management thinks unsupervised FSD will reach customer cars by 2026 Q4, but the release will be gradual; the FSD software deployed in the Netherlands has the same exact architecture and training procedure as the US version, but with more Europe data; management believes that the way Tesla solves full autonomy in the US can be applied to all parts of the world if Tesla can add data from local regions; the Tesla customer fleet is close to reaching 10 billion cumulative miles driven on FSD within the next few weeks; management thinks v14.3 of FSD is the last piece of the puzzle to enable unsupervised FSD; most Tesla drivers with Hardware 4 are already using FSD; FSD’s churn rate has improved

It’s always, I think, worth noting that a Tesla car is incredibly — incredible value for money, and they’re all autonomy-ready, depending on what part of the world you’re in. The supervised full self-driving is getting extremely good…

…For full self-driving and Robotaxi, version 14.3 was a major architectural update. And we have a whole pipeline of major improvements to full self-driving that, we believe, will lead to unsupervised full self-driving being available anywhere in the world that it is legal to do so. And then there’s a version 15, hopefully later this — hopefully by the end of this year, but certainly by early next year. And that will be a complete overhaul of the software architecture, and will run on AI4. That’s — and at that point, we’re really just increasing the safety level of FSD above human safety level, even more. Meaning, I think, even within version 14, we’re significantly safer than human, but v15 will take that to another level…

…On the FSD adoption front, we continue to see improvement, reaching nearly 1.3 million paid customers globally. The bulk of the growth came from subscriptions, while upfront purchases only increased 7% as we remove the purchase option in some markets in Q1.

We recently received approvals for FSD in the Netherlands. This sets us up well for an EU-wide approval later in Q2, and we’re just gated by how the regulators go about it. Additionally, we’ve also received approvals in China. The broader approval is still not there, but we’re working with the regulators in the country, and we’re hoping that we can get approval by Q3…

…We have evolved our vehicle sales strategy, where we now emphasize FSD as a product and vehicle as only the delivery mechanism…

…We certainly hope to be — have unsupervised FSD/Robotaxi operating in, I don’t know, a dozen or so states by the end of this year…

…I think probably unsupervised FSD or Robotaxi revenue would not be super material this year. But I do think it will be material — it will be material probably in a significant way next year…

…[Question] When do you expect FSD unsupervised to reach customer cars?

[Answer] I’m just guessing here, but probably in the fourth quarter. It’s difficult to release this like to everyone everywhere all at once because we do want to make sure that there are not unique situations in a city — like a particularly complex intersection — or actually, they tend to be places where people get into accidents a lot because they’re just — perhaps there’s — and like I said, an unsafe intersection or bad road markings or a lot of weather challenges. So I think we would release unsupervised gradually to the customer fleet as we feel like a particular geography is confirmed to be safe…

…From a technology standpoint, what we deployed in Netherlands and Europe is the same exact architecture and the training procedure and so on, except it had more Europe data. And I suspect that same thing will be true for unsupervised FSD as well. Whatever we use to solve in the U.S. will work in other places and the rest of the world, too, provided we were able to add the data from the local regions…

…We are simultaneously solving the long tail of safety by monitoring the metrics across the entire Tesla customer vehicle fleet, which is close to driving 10 billion miles on FSD in the next few weeks…

…I think 14.3 is the last piece of the puzzle for unsupervised FSD. Now the question is like degrees of safety. Like how — safety and convenience, I suppose…

…[Question] You have 180,000 new users, paying users this quarter, and I compare that to your overall installed base. It might be 15%, but then if I shrink that to the U.S. or to North America where most of them are, it’s probably more like 30%, 35%. And I’m trying to — and I compare that to what you sold, about 100,000 cars in North America in the quarter. So you’re winning twice more FSD users than you’re selling cars. And then if I add to that picture the fact that, I guess, it’s mostly Hardware 4 owners who subscribe to FSD, it sounds like most drivers in North America who have Hardware 4 would already be using FSD. Is that the right way to think about it and the kind of like success FSD is meeting today?

[Answer] You’re thinking about it the right way…

…We are actually seeing churn of subscribers also coming down, which again is a reflection that the product is getting better.

Tesla has started production of Cybercab, an autonomous vehicle for the company’s Robotaxi fleet; the production ramp of Cybercab will be a stretched-out S-curve, picking up only towards end-2026; the Robotaxi service has been expanded to Dallas and Houston; the expansion of the Robotaxi service is limited by management’s desire for really high safety levels; Robotaxi has, to date, not had a single accident or injury; management hopes to have unsupervised Robotaxi in a dozen states by end-2026; management thinks Robotaxi revenue will not be material in 2026 but will be material in 2027; Robotaxi is currently running on FSD v14.3; Cybercab is a 2-person vehicle; management thinks most of Tesla’s future vehicle production will be Cybercab; Tesla’s vehicles in the Robotaxi fleet sometimes get stuck because they are programmed for maximum safety; the vehicles in the Robotaxi fleet can sometimes be stuck in infinite loops

We have just started production of Cybercab…

…Whenever you have a new product with a completely new supply chain, new everything, it’s always a stretched out S-curve. So you should expect that initial production of Cybercab and Semi will be very slow, but then ramping up and going kind of exponential towards the end of the year and certainly next year…

…We’ve expanded Robotaxi to Dallas and Houston using the same software source in the Bay Area. And the limiting factor for expansion is really rigorous validation, making sure things are completely safe. We don’t want to have a single accident or injury with the expansion of Robotaxi. And we have, to the credit of the team, not had a single one to date…

…We certainly hope to be — have unsupervised FSD/Robotaxi operating in, I don’t know, a dozen or so states by the end of this year…

…I think probably unsupervised FSD or Robotaxi revenue would not be super material this year. But I do think it will be material — it will be material probably in a significant way next year…

…So far, we have 0 incidents, and that’s what the NHTSA filing also shows…

…The version of Robotaxi that’s running in Austin, Dallas, Houston, et cetera, those are essentially 14.3 variants, and it’s obviously safe that, that’s why we’re able to launch in those cities…

…Cybercab is a compact vehicle. It’s actually — I mean, it’s very roomy, but it’s a 2-person vehicle. And we do think probably most of our production long term will be Cybercab because 90% of miles driven are with 1 or 2 people…

…A lot of what limits wider deployment of Robotaxi are actually not safety issues, but convenience issues or the car basically gets paranoid and gets stuck. Like sometimes it gets — because it’s programmed for maximum safety, so the problem is that then it sometimes just gets scared to do things. So like sometimes it gets scared to cross railroads, for example, or it’ll get stuck at a light or where there’s — the light never changes from red or, I mean, there was one kind of amusing situation where a whole bunch of Robotaxis got stuck in the left turn lane in Austin because, I kid you not, a Waymo had crashed into a bus. And so they could not turn left because the Waymo had crashed into the bus. And so you have this like long line of like, I don’t know, a dozen or more Tesla Robotaxis that were waiting for the bus to move, but the bus was never going to move because the Waymo crashed into the bus…

…We’ve also had literal infinite loops where the car might want to make a turn into a road, but there’s construction, and then it goes around the block, tries to turn into the road with construction, goes around the block, tries to turn into the road, and so you have to stop the infinite looping, the literal infinite looping.
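In the abstract, the looping behaviour Musk describes is a classic cycle-detection problem: if the planner keeps revisiting the same (location, intent) state, something upstream has to intervene. Below is a minimal conceptual sketch of that idea; every name in it is hypothetical and it has nothing to do with Tesla’s actual software stack:

```python
# Minimal sketch: break a route-planner loop by detecting repeated states.
# All names here are hypothetical illustrations, not Tesla's actual code.

def drive(planner_steps, max_revisits=2):
    """Follow planner steps, escalating to remote assistance if the
    same (location, intent) state repeats too many times."""
    seen = {}
    for state in planner_steps:
        seen[state] = seen.get(state, 0) + 1
        if seen[state] > max_revisits:
            return f"escalate: stuck looping at {state}"
    return "route completed"

# A planner that keeps retrying the same blocked left turn would loop
# forever in the real world; here the watchdog cuts it off.
looping_plan = [("Main&5th", "turn_left")] * 5
print(drive(looping_plan))  # escalate: stuck looping at ('Main&5th', 'turn_left')
```

The point of the sketch is only that "stop the infinite looping" is a detectable condition, not a safety failure: the car is behaving correctly at each step, so the fix lives a level above the per-step planner.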

Tesla has taped out its AI5 chip; management thinks the AI5 chip will be the best AI chip for inference at the edge, and will be the best value-for-money AI chip; Tesla is already designing the AI6 chip and is working on Dojo 3; management expects AI5 to go into Optimus and Tesla data centers, because AI4 is currently sufficient to achieve autonomy that is much safer than human drivers, so AI5 is not needed in the vehicle fleet; management thinks it will make sense at some point in the future to put AI5 into Tesla vehicles; management is planning to increase the memory and compute capacity of AI4, but the progress partly depends on Samsung (the fab for the chip)

Congratulations to — again to the Tesla AI chip team for taping out AI5. That’s going to be a great chip. I think probably the best AI inference chip for edge compute that exists. And certainly, I think, unequivocally the best value for money. The team did a great job. And we already have a lot of momentum for designing AI6, and we’ve begun to discuss ideas for Dojo 3…

…I do expect that AI5 will go into Optimus and into the data center because it’s looking like we’ll be able to achieve unsupervised self-driving with AI4 that is far greater than human safety levels. So — which means it’s not — certainly not immediately needed in the car. At some point, I think it will make sense for us to switch to AI5 in the car, but that’s — but there’s not a pressing issue to do so. So — but at some point, the AI4 hardware is going to get like so old that it’s like, okay, the only reason they’re keeping the factory open is for AI4.

We are planning an AI4 upgrade to use newer generation RAM. So it will go from 16 gigabytes to, I think, 32 gigabytes per SoC. So a total of 64 gigabytes, and probably a 10% increase in compute in sort of into — trillions of operations per second and in memory bandwidth. So that’s AI4.1 or AI4+, probably goes into production middle of next year, I think, depends. It depends on — Samsung is doing the modifications for us. So it sort of depends on when they’re able to finish that — finish those modifications and bring it to production.

Tesla’s management now thinks that Tesla vehicles with Hardware 3 will not be able to run unsupervised FSD; Hardware 3 has only 1/8 the memory bandwidth of Hardware 4, and high memory bandwidth is needed for the unsupervised FSD software to run; management is offering a trade-in for Tesla Hardware 3 vehicles to upgrade to Hardware 4; management is also considering setting up small factories to upgrade Hardware 3 on existing vehicles to Hardware 4

Unfortunately, Hardware 3 — I wish it were otherwise, but Hardware 3 simply does not have the capability to achieve unsupervised FSD. We did think at one point, it would have that, but relative to Hardware 4, it has only 1/8 of the memory bandwidth of Hardware 4. And memory bandwidth is one of the key elements needed for unsupervised FSD. And it’s just generally a thing that’s needed for AI. If you’re doing autoregressive transformer memory bandwidth, this is the choke point. So for customers that have bought FSD, what we’re offering is essentially a trade in — like a discounted trade-in for cars that have AI4 hardware. And then we’ll also be offering the ability to upgrade the car, to replace the computer, and you also need to replace the cameras, unfortunately, to go to Hardware 4.

So to do this efficiently, we’re going to have to set up like kind of micro factories or small factories in major metropolitan areas in order to do it efficiently. It’s — because if it’s done just at the service center, it is extremely slow to do so and inefficient. So we basically need like many production lines to make the change. And I do think, over time, it’s going to make sense for us to convert all Hardware 3 cars to Hardware 4 because that’s what enables them to enter the Robotaxi fleet and have unsupervised FSD.
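Musk’s point about memory bandwidth being the chokepoint for autoregressive transformers can be seen with a back-of-envelope calculation: each decoded token must stream roughly every model weight from memory once, so the decode ceiling is approximately bandwidth divided by model size in bytes, regardless of raw compute. The numbers below are purely illustrative, not Tesla hardware specs:

```python
# Back-of-envelope: why autoregressive inference is memory-bandwidth-bound.
# Each decoded token streams (roughly) every weight from memory once, so:
#   peak tokens/sec ~= memory bandwidth / model size in bytes.
# Illustrative numbers only; these are not Tesla hardware specifications.

def max_tokens_per_sec(model_params_billions, bytes_per_param, bandwidth_gb_s):
    model_bytes = model_params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Same hypothetical 7B-parameter fp16 model on two memory systems:
slow = max_tokens_per_sec(7, 2, 100)   # 100 GB/s of bandwidth
fast = max_tokens_per_sec(7, 2, 800)   # 800 GB/s of bandwidth
print(round(slow, 1), round(fast, 1))  # 7.1 57.1
```

An 8x bandwidth gap yields an 8x gap in the decode ceiling, which is why a chip with 1/8 the memory bandwidth struggles to run the same model at the same quality and speed.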

Tesla’s research fab for the TeraFab project will begin construction this year at the company’s Giga Texas campus; management is still working out details on TeraFab, which is a joint venture between Tesla and other Elon Musk-related companies (xAI and SpaceX); the construction of the research fab will see Tesla spend around $3 billion, and the research fab is for Tesla to try out new ideas; SpaceX will be in charge of the initial phase of the scaled-up TeraFab; Intel will be partnering with Tesla on some of the core manufacturing technologies for TeraFab; TeraFab will utilise Intel’s 14A process, which is leading-edge but currently not fully mature; the TeraFab will house lithography mask creation, logic, memory, and advanced packaging all under one roof, whereas the broader fab industry has separate facilities and companies for the different activities; management wants TeraFab to house all the different activities because they think it’s the fastest way to conduct R&D, but they are also aware it’s a long shot; management sees TeraFab as the only way to produce a sufficient quantity of AI chips for Tesla’s needs, not as a mechanism to pressure 3rd-party fabs on pricing; the TeraFab is also a great way for management to test out the radical ideas they have for improving AI chips

We’ve also finalized plans for the chip fab — the research chip fab on the Giga Texas campus, and we’ll start construction of that this year…

…We’re still working out the details of the Terafab deployment. In the near term, Tesla will be building the research fab on our Giga Texas campus. This is something we expect to be probably a $3 billion-ish initiative and capable of maybe a few thousand wafers per month, but it’s really intended to try out ideas, the research fab, both in terms of maybe — we have some ideas for improving the fundamental technology of how chips are made and some of the — there’s some new physics we’d like to test out. But we also want to test out the ability to see if something is working in production. So you need kind of like a few thousand wafer starts a month to make sure that a production process is sound. And then SpaceX is going to take care of like the initial phase of the scaled up Terafab. And that’s what we’ve figured out thus far…

…Intel is excited to partner with us on some of the core manufacturing technologies. So we plan to use Intel’s 14A process, which is state-of-the-art and, in fact, not yet totally complete. So — but given that by the time Terafab scales up, 14A will be probably fairly mature or ready for prime time. 14A seems like the right move…

…I think this will be unique in the world, or at least I’m not aware of any — a place where you have the lithography mask creation, the — and then logic, memory and packaging under one roof in one building. That’s about the fastest I could possibly imagine doing recursive research and development and being able to try out some pretty radical ideas, some of which have — it’s kind of long-shot stuff, but if some of these long shots pan out would be radical improvements in the way chips work…

…Terafab is not some sort of mechanism to generate leverage over our chip suppliers. It’s just literally we don’t see a path to having enough — a sufficient quantity of AI chips down the road as we scale production to high levels. Just the rate at which the industry is growing in logic, but even more so in memory, it just doesn’t — we just anticipate hitting the wall if we don’t make chips ourselves…

…I think that we do have some ideas for how to make maybe radically better AI chips. And these are kind of research ideas there — which means like long shot, but if long shot pays off, it’s maybe a giant improvement. And it’s just easier to do that if we have our own research fab and are developing our own production technologies. So — and if you look sort of long term at, say, having AI satellites, making chips for those. There’s just no way in hell the existing industry can keep up with that. It’s impossible.

Visa (NASDAQ: V)

Visa’s management believes agentic commerce will expand Visa’s market opportunity in 4 ways, namely, (1) accelerating the digitisation of commerce, (2) creation of significantly more transactions by agents, especially in a new category of commerce characterised by micro transactions, (3) accelerating the digitisation of B2B payments, with virtual cards and tokens becoming a preferred way to pay and be paid, and (4) accelerating overall GDP growth by 80-150 basis points

We believe AI and agentic commerce will expand our addressable market in 4 important ways. 

First, like eCommerce and mobile commerce before it, agentic commerce will accelerate the digitization of commerce around the world. And just like the acceleration from eCommerce and mobile commerce, Visa will benefit.

Second, agents will create significantly more transactions. Agents will intelligently split purchases across multiple transactions, optimizing price, timing and value to the buyer. And importantly, in some use cases, we expect agents will pay for their own data and resource consumption transaction by transaction and event by event, which creates an entirely new category of commerce with micro transactions.

Third, we will see accelerated digitization of B2B payments, where there is still enormous friction that AI agents can help remove. They will be able to automate payment initiation directly from invoices and contracts and manage approvals autonomously. In this context, virtual cards and tokenization will become a preferred way to pay and be paid.

And lastly, just like the advent of eCommerce and mobile commerce, agentic commerce will increase economic growth generally. Third parties estimate we are looking at a boost of 80 to 150 basis points of incremental GDP growth from AI and when GDP grows, spending grows and digital payments transactions grow.
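The second point above, agents paying for their own data and resource consumption transaction by transaction, can be sketched in miniature. Everything below is a hypothetical illustration of the idea, not a real Visa API:

```python
# Illustrative sketch of "agents paying for their own data and resource
# consumption transaction by transaction". All names are hypothetical;
# this is not a real Visa API.

from dataclasses import dataclass

@dataclass
class MicroCharge:
    resource: str
    units: int
    unit_price_cents: float

    @property
    def amount_cents(self) -> float:
        return self.units * self.unit_price_cents

def settle(charges):
    """An agent accumulates one charge per event, then settles in aggregate."""
    return sum(c.amount_cents for c in charges)

# One shopping errand might generate many tiny, event-by-event charges:
events = [
    MicroCharge("weather-api-call", 3, 0.05),
    MicroCharge("image-generation", 1, 2.00),
    MicroCharge("price-lookup", 10, 0.01),
]
print(round(settle(events), 2), "cents")  # 2.25 cents
```

The interesting commercial detail is in the `settle` step: charges far too small to process individually become viable when the network aggregates them, which is the "entirely new category of commerce" the quote points at.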

Visa’s management believes the company is well positioned to win in agentic commerce for 3 reasons, namely, (1) the massive scale of Visa’s network, which means plenty of proprietary data to work with, (2) the tight security of Visa’s network, and (3) the high level of trust in it; Visa is a proven leader in tokenisation, and management believes tokens will become an essential element in agentic transactions; management thinks people will want their agents to pay with cards, just as they prefer to use cards for physical and online payments; management recently launched Intelligent Commerce Connect, a network-protocol and token-vault agnostic on-ramp for agentic commerce; management is seeing early growth in agentic commerce transactions performed with Visa agentic tokens; management thinks the CLI (command line interface), which is effectively a chat box, is becoming a commerce platform, and cards will continue to have strong value in CLI-driven commerce transactions of all sizes; management recently launched Visa CLI as a proof-of-concept for developers to use their Visa credentials to make payments; early feedback for Visa CLI has been very positive; management thinks agents will soon realise that no payment method other than Visa cards offers the combination of ease of use, broad acceptance, privacy, easy liquidity management, issuer KYC, user security protections, and rewards; management thinks the limiting factor for agentic commerce is currently trust, which also means users will fall back on payment methods they already trust

Visa is extraordinarily well positioned to win in agentic for 3 important reasons. Our network, security and trust. Our network has enormous scale, more than 175 million seller locations, 5 billion credentials in 200 countries and territories with nearly 14,500 financial institution clients who have opted in to using this network. Payment security is only going to become more difficult and more valued. With our scale comes over 300 billion transactions annually, equating to an average of about 900 million transactions per day, and all of the data that comes with it. Visa has proven it knows how to manage transaction risk, identity risk and fraud, all enabled by this transaction data. And trust. Visa has well-established trust grounded in our standards and brand. We’ve set the standards that enable trusted payments in the digital and emerging agentic ecosystem.

And a big part of our network, security and trust are Visa tokens. Visa is a proven leader in tokenization, which is foundational in eCommerce and is set to become an essential element of trusted transactions in an agentic world.

People overwhelmingly choose to pay with cards face-to-face and online, and they will prefer their agents to pay with cards. And merchants want this, too. We recently launched Intelligent Commerce Connect, which acts as a network protocol and token vault agnostic on-ramp to agentic commerce for agent builders, merchants and enablers. Now while it’s early, we are seeing growth in agentic shopping and the emergence of early agentic commerce, real transactions with Visa agentic tokens.

And AI continues to evolve. With the AI landscape, we are seeing that Claude code and other agentic coding assistants will allow anyone to become a developer. It’s that easy to work in simple command-style tools like the command line interface, or CLI. These agentic coding assistants are a great example of how we see AI and agentic commerce increasing economic growth as they enable anyone to bring their new business ideas to life. We see a world where we will all design, build and launch digital products and experiences ourselves, engage with digital platforms and buy digital services using the CLI, or a slick consumer-friendly version of one as our interface. The CLI itself is becoming a commerce platform, and we believe that the preference and value of cards will be equally strong for all sizes of transactions, including micro transactions. A key to making this happen is enabling safe, simple and easy payments that are widely accepted by all API endpoints. We recently launched Visa CLI as a proof of concept, which shows how easy it is for a developer, soon all of us, to use their Visa credential to pay for digital services like an image, a website builder or more via the CLI. The early feedback we have been receiving from developers is very positive. And as we move forward, we plan to enable CLI commerce at scale, which means scaling the availability of command line tools and card acceptance by promulgating standards, products, rules and pricing…

…In all of these use cases, Visa cards are providing significant value. They’re easy to use, broadly accepted, integrated into the transaction flow, offer privacy, unlike most stablecoins, offer a way to manage liquidity in aggregate rather than funding millions of real-time micro transactions, offer an issuer KYC, user security protections if something goes wrong, and in many cases, cards offer rewards and benefits. We see no other payment method on earth that delivers all of these features. Buyers know this, sellers know this and soon so will agents. We expect more transactions, more value-added services and therefore, more revenue in the years ahead from agentic…

…I think the limiting factor for agentic commerce is trust. I think when we all think about ourselves as buyers and we all think about ourselves having agents go out and transact on our behalf, we are going to fall back on payment methods that we, as users, trust…

…When you think about yourself as a user, when you think about kind of who you’re going to trust your agent to make payments on your behalf, whether those are macro transactions, average transactions or micro transactions, we feel really good about our ability to win those transactions for our users using all of those capabilities.
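Tokenisation, which management calls foundational above, is straightforward to illustrate in outline: the merchant or agent only ever holds a surrogate token, while the real card number stays in a vault on the network side. The toy sketch below is a conceptual illustration with made-up names, not Visa’s actual token service:

```python
# Toy sketch of payment tokenization: the merchant or agent only ever
# sees a surrogate token; the real card number (PAN) stays in the vault.
# Conceptual illustration only; not Visa's actual token service.

import secrets

class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> real PAN, held network-side only

    def tokenize(self, pan: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the network maps the token back at authorization time.
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
assert t != "4111111111111111"                    # agent never sees the PAN
assert vault.detokenize(t) == "4111111111111111"  # network still can
```

The security property this buys is that a leaked token is useless outside the network’s authorization flow, which is why tokens are a natural fit for handing payment credentials to autonomous agents.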

AI is making Visa’s value-added services better; Visa’s new Large Transaction Model, which can deliver up to a 5x increase in fraud value capture, is starting to act as the foundational model for a variety of AI-powered fraud and risk services at the company; management has been integrating AI features across Visa’s VAS solutions; management thinks AI makes Visa’s VAS business even more differentiated; a variety of AI-driven products within the VAS portfolio have helped the business perform well

Across Visa, AI is making what we do even better, especially for our value-added services. Our new Visa Large Transaction Model is beginning to act as the foundational model for a variety of AI-powered fraud and risk services at Visa. Early results have shown that it can power up to a 5x increase in fraud value capture. Our team has been integrating new AI-enabled features across our suite of VAS solutions, including the recent release of 6 dispute resolution capabilities. In fact, across all of our services, client adoption has been the fastest among AI embedded services such as Smarter Stand-In Processing and Visa Provisioning Intelligence…

…Our value-added services are highly differentiated and even more so in an AI world…

…We’ve been shipping new, especially AI-driven products in the issuing solutions space. We outperformed in the quarter in our AI-driven stand-in processing platform. We outperformed in our Visa supplier payment services platform. Those are two of the service — issuing solution platforms. In the acceptance side of the business, our Visa account updater platform outperformed. That’s one that allows merchants to automatically update stored credentials when you might have had fraud on your account and it was reissued or something like that. Look at our Risk and Security Solutions area, we saw outsized performance in VCAS, our Visa Consumer Authentication Service, or also in our VAA and VRM platforms, Visa Advanced Authorization and Visa Risk Manager. These are all products that we’ve been deploying in market, largely AI-driven products, and they’ve been driving broad-based out-performance across the value-added services portfolio.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Alphabet, Amazon, Apple, ASML, Intuitive Surgical, Mastercard, Meta Platforms, Microsoft, Netflix, TSMC, and Visa. Holdings are subject to change at any time.

What We’re Reading (Week Ending 03 May 2026)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the readership of The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 03 May 2026:

1. Oracle’s Deluge of AI Debt Pushes Wall Street to the Limit – Peter Rudegeair and Berber Jin

Banks including JPMorgan Chase struggled for months to spread the risk of billions of dollars in loans they made to build data centers leased to Oracle in Texas and Wisconsin, people familiar with the matter said. Many financial institutions that would ordinarily buy those loans face restrictions on how much exposure they can have to a single counterparty, and the sheer size of these debt packages pushed them to the limit with Oracle. As a result, bank balance sheets got clogged, constraining the financing prospects of future projects tied to Oracle and OpenAI.

For example, lenders balked at financing the expansion of a data-center complex in Abilene, Texas, if Oracle were the tenant, according to people familiar with the matter. That led the developer, Crusoe, to lease it to Microsoft instead…

…Lenders grew more comfortable with Oracle-related projects after the company said it would raise all the money it needed for 2026 by issuing roughly $50 billion in stock and bonds. Oracle said in a post on X last week that each data center it is developing for OpenAI is moving forward on time.

But even after it raises that amount, Oracle still has additional cash funding needs of $100 billion or more for 2027 and the first half of 2028, according to Morgan Stanley credit analysts. “We’ve pondered how [Oracle’s] considerable funding needs over the next three years may test the depths of different fixed-income markets,” the analysts wrote in February…

…Oracle, though, is in a comparatively weaker financial position than big tech rivals. It has a lower investment-grade credit rating, more debt and is burning cash. Much of its future revenue is tied to a money-losing startup that is facing growing competitive pressure. The cost of protecting Oracle’s bonds against a potential default via credit-default swaps roughly quadrupled between late September and late March, though it has fallen slightly since then…

…Much of the borrowing tied to the OpenAI megacontract was done by projects involving data center developers working with Oracle. The debt was structured as short-term construction loans meant to be syndicated among a group of banks and other institutions. Oracle is the tenant and OpenAI is the subtenant on the deals, but the debt doesn’t sit on Oracle’s balance sheet.

2. OpenAI Misses Key Revenue, User Targets in High-Stakes Sprint Toward IPO – Berber Jin

Chief Financial Officer Sarah Friar has told other company leaders that she is worried the company might not be able to pay for future computing contracts if revenue doesn’t grow fast enough, according to people familiar with the matter. 

Board directors have also more closely examined the company’s data-center deals in recent months and questioned Chief Executive Sam Altman’s efforts to secure even more computing power despite the business slowdown, the people said…

…OpenAI missed an internal goal of reaching one billion weekly active users for ChatGPT by the end of last year, according to people familiar with the goals. The company still hasn’t announced that milestone, unnerving some investors. It also missed its yearly revenue target for ChatGPT after Google’s Gemini saw massive growth late last year and ate into OpenAI’s market share, the people said. The company has also struggled with defection rates among subscribers, according to people familiar with those figures.

OpenAI missed multiple monthly revenue targets earlier this year after losing ground to Anthropic in the coding and enterprise markets, people familiar with its finances said.

3. If AI is so great, why isn’t it working? – Vas M.

AI is working for one group of people right now, at scale, because it’s the group of people that rely the least on business logic. It’s software engineers. The biggest winner from 18 months of AI improvement, by miles, has been engineers writing code in Cursor, Claude Code, Codex, etc. Some stats for you if for some reason you still don’t believe in agentic engineering:

  • GitHub’s 2024 study clocked Copilot users at 55% faster on real tasks. 1 hour 11 minutes vs 2 hours 41 minutes on the same work.
  • Anthropic ran an internal study in August 2025 across 132 engineers and 100,000 real Claude conversations. AI cuts developer task completion time by roughly 80%.
  • Sundar Pichai said at the start of 2026 that 75% of new code at Google is AI-generated and engineer-approved. That number was 30% in April 2025.

Yes, the tools still overpromise on the hard stuff: security review, complex distributed systems, novel debugging. Caveat very real and noted. But the bread-and-butter productivity gain on shipping code is the biggest jump engineering has had since the IDE…

…So why does AI work for engineers and not for any of these? What’s different about engineers? As a former software engineer, I can tell you that engineering work has four properties that basically no other enterprise function has. Yes, there are nuances, but these are directionally correct; please relax in the comments.

  • It’s bounded. A function takes inputs and returns outputs. The scope of “fix this bug” lives inside a file or a module. The dependencies are explicit and importable.
  • It’s checkable. Compilers tell you in milliseconds whether the code parses. Tests tell you whether it works. Type systems catch entire classes of error before runtime. Feedback loop: seconds.
  • The substrate is structured. Code lives in files, in version control, with a deterministic build pipeline underneath. Same input, same output. You can replay any state.
  • The output is verifiable. A pull request is a discrete artifact. A reviewer can look at the diff in 10 minutes and say yes or no.
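That “checkable” property, with its feedback loop measured in seconds, is easy to see concretely. Here is a trivial sketch of the write-then-verify cycle that AI coding tools exploit; the function and its checks are my own illustrative example, not from the article:

```python
# The fast feedback loop that makes code AI-friendly: write the change,
# run the checks, and know within seconds whether it is correct.

def median(xs):
    """Return the median of a non-empty list of numbers."""
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    # Odd length: middle element. Even length: mean of the two middles.
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

# The entire "is it correct?" question collapses to running these checks.
assert median([3, 1, 2]) == 2
assert median([4, 1, 3, 2]) == 2.5
print("all checks pass")
```

A finance close has no equivalent one-line verifier: “the close was clean” takes two senior accountants two days to establish, which is exactly the contrast the author draws next.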

When you point a capable AI at work that’s bounded, checkable, structured, and verifiable, the leverage is enormous. Cursor and Claude Code are the proof. And if we’re being honest, the biggest reason is that the AI labs (OpenAI, Anthropic, Cursor) poured every single ounce of resources they had into figuring out software engineering. If they can make their own engineers better, they can make the models better, faster, and achieve the ever-elusive “AGI”, which will then make every other task on the planet (Finance, Sales, Operations, Marketing, etc) much easier downstream.

But contrast software engineering with a finance close.

Finance involves AP, AR, intercompany reconciliations, FX, accruals, journal entries, and exception handling that spans NetSuite, Concur, three banks, two ERPs from acquisitions, a custom intake form, and a Slack channel where the controller flags “weird stuff she sees.” The “process” is documented in an SOP that doesn’t match what actually happens. The output is “the close was clean,” which takes two senior accountants two days to verify.

Sales ops involves a CRM, an outbound tool, a calendar, a notes platform, an enrichment vendor, an attribution tool, and a Slack channel where the AE is asking the CRO whether to discount this deal. None of those systems share state cleanly. The process for qualifying a lead is different across reps, even on the same team.

This is what every ops function looks like in every company Varick has ever audited. None of it is bounded, checkable, structured, or verifiable the way code is. And trying to wrangle generic AI onto functions that are incredibly specific to your company and its processes is a fool’s errand.

Pointing an LLM at this work gives you negative ROI. The operator was doing the work in 30 minutes. Now they’re doing the work in 30 minutes plus another 30 minutes correcting the AI’s mistakes. Most, if not all, vendors’ “AI for [department]” products have the same arc: a nice flashy demo showing how great it works for startups, then a big Series A, then a quiet death after it fails to work for enterprise…

…OK, so what do the 5% that ship and stay in production do consistently that makes them so good:

1. They audit before they build. Four weeks (often longer) of mapping the actual workflow before anyone touches a model. The audit produces a digital twin: a live map of how work moves through the org, where the conformance gaps are, what’s pattern-matchable, and what genuinely needs human judgment. The document itself matters less than the alignment it forces between the AI team and the operators. Make sure everyone is aligned on what the bottlenecks are, what the optimal state should be, and what is going to be done to fix it.

2. They decompose the work until most of it is deterministic. LLM goes ONLY where judgment is absolutely required, while plain code goes everywhere else. Most production systems we ship at Varick end up as 5-10 deterministic steps with maybe one or two model calls in specific places. Boring in production is genuinely the goal, and is how we’ve seen the most success.
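The decomposition pattern can be sketched in a few lines. This is a minimal illustration of the "mostly deterministic steps, one model call" shape; every name here (`fetch_invoices`, `call_llm`, the field names) is hypothetical, and the model call is stubbed:

```python
# Sketch: deterministic steps everywhere, an LLM only where judgment is needed.

def fetch_invoices(records):
    """Deterministic: filter rows pulled from a source system."""
    return [r for r in records if r.get("type") == "invoice"]

def match_to_purchase_orders(invoices, pos):
    """Deterministic: exact-key join against known purchase orders."""
    po_index = {po["po_number"]: po for po in pos}
    matched, unmatched = [], []
    for inv in invoices:
        (matched if inv.get("po_number") in po_index else unmatched).append(inv)
    return matched, unmatched

def call_llm(prompt):
    """The ONE judgment step. Stubbed here; a real system would call a model API."""
    return "needs-human-review"

def process(records, pos):
    invoices = fetch_invoices(records)
    matched, unmatched = match_to_purchase_orders(invoices, pos)
    # Only genuinely ambiguous records ever reach the model.
    decisions = {inv["id"]: call_llm(f"Classify exception: {inv}") for inv in unmatched}
    return matched, decisions
```

The design choice is that failures in the deterministic steps are debuggable like any other code, and the model's blast radius is confined to the exception path.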

3. They build a single orchestration layer that sits on top of the existing software stack. At Varick, we call this the single pane of glass. Finance, sales, ops, and engineering agents all live on the same platform, share the same context, and can talk to each other when they need to. Every new use case lands as configuration on top of the platform. In turn, sprawl is dead on arrival.

4. They stay model-agnostic. Abstractions get built at the task level, not at the model level. Each step routes to the best-fit model at any given moment. When OpenAI deprecates a model or Anthropic ships something dramatically better, the routing layer absorbs the change and your workflow keeps running without anyone noticing.
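Task-level routing, as described, can be as simple as a lookup table between workflows and models. The sketch below is illustrative only; the model names and the `call_model` stub are made up, not real provider APIs:

```python
# Minimal model-agnostic routing: abstractions live at the task level,
# so swapping a deprecated model is a config change, not a workflow rewrite.

ROUTES = {
    "extract_fields": "fast-cheap-model",
    "draft_summary": "frontier-model",
}

def call_model(model_name: str, prompt: str) -> str:
    # Stub standing in for a real provider SDK call.
    return f"[{model_name}] response to: {prompt}"

def run_task(task_type: str, prompt: str) -> str:
    model = ROUTES[task_type]  # the workflow never names a model directly
    return call_model(model, prompt)

# When a provider deprecates "fast-cheap-model", only the routing table
# changes; every workflow that calls run_task keeps running untouched.
ROUTES["extract_fields"] = "newer-cheap-model"
```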

5. They treat the deployment as continuously evolving infrastructure. There is a real team responsible for ongoing tuning, retiring agents that aren’t earning their keep anymore, and shipping improvements every quarter. The deployments that pay off over five years are the ones that get tuned every quarter if not every month, not the ones declared “done” at go-live. You have to get over this fact if you want to succeed with AI. 

4. Software Is Eating the World (But Actually This Time) – Siddharth Ramakrishnan

In 2011, software ate the world. At least that’s what Marc Andreessen told us. But if that’s true, then why does the Bay Area still exist? If software really ate everything, wouldn’t we all have moved to New York or Miami by now?

Well, let’s look at what software actually ate: banks got apps, retail got websites, hospitals got EHR systems, and taxis got dispatched with a few taps instead of a phone call at 2am when you maybe don’t remember exactly where you are.

Software ate the interfaces, but the actual work? That mostly stayed human.

A customer calls about a billing dispute and software routes the call, pulls up the account screen, and then logs the resolution afterward. But a person is still the one listening, figuring out whether the refund policy applies, deciding what to do, and actually talking to the customer. A loan officer reviewing an application gets the credit score surfaced by software and the documents pulled up on screen, but they’re the one reading those documents and making the judgment call. For 15 years, software has been really good at the plumbing while humans kept doing the actual work.

Now, AI can actually do the work! A customer service call is becoming an agent loop where the system handles speech recognition, looks up the account via API, pulls the relevant policy, reasons about whether the customer qualifies, triggers the refund, and responds with text-to-speech. An insurance claim is becoming document intake followed by coverage checks, fraud flags, reserve calculations, and settlement workflows, all running as code. A coding task is already 30 rounds of reading files, editing code, running tests, and revising with no human involved at all…

…I think most people dramatically underestimate how much inference these converted workflows actually consume. They’re picturing one model, one call, one response, and some hallucinations along the way, but the reality is very different.

Take a voice support agent handling something simple but real, like rescheduling a medical appointment. To the customer, it feels like one conversation. Under the hood, it is a small autonomous system running continuously. As the caller speaks, a speech recognition model transcribes audio in real time. An orchestration model then reasons over the transcript, pulls the patient record, checks scheduling constraints, looks up provider availability, decides what to ask next, and calls the relevant tools. Once it has enough information, it synthesizes the result into a response, and a text-to-speech model turns that back into natural audio. In parallel, other models may be monitoring sentiment, checking compliance, or deciding whether the call should be escalated.

The system is doing all the work itself: listening, retrieving, deciding, tool-calling, verifying, and responding in a loop. An 8 minute call might contain only ~3k tokens of raw transcript, but the orchestration layer can easily consume ~40k tokens once you account for repeated reasoning over the growing conversation, retrieved context, and tool outputs, on top of continuous ASR and TTS inference running for the duration of the call. “One AI phone call” is really a multi-model inference stack operating continuously…
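The ~3k-transcript-to-~40k-orchestration gap falls out of simple arithmetic, because each orchestration step re-reads the growing conversation. The numbers below are illustrative assumptions chosen to match the article's rough figures, not measurements:

```python
# Back-of-envelope inference cost for one AI phone call. All figures are
# assumed, but the structure is general: re-reading a growing transcript
# makes total tokens scale roughly quadratically with conversation length.

TURNS = 15             # caller/agent exchanges in an ~8 minute call
TOKENS_PER_TURN = 200  # transcript tokens added per turn -> ~3k total
RETRIEVED = 500        # patient record / policy context per step
TOOL_OUTPUT = 300      # scheduling lookups etc. per step
RESPONSE = 200         # generated output per step

transcript_total = TURNS * TOKENS_PER_TURN
# At turn k the model re-reads all k turns so far:
reread_cost = sum(TOKENS_PER_TURN * k for k in range(1, TURNS + 1))
per_step_extras = TURNS * (RETRIEVED + TOOL_OUTPUT + RESPONSE)
orchestration_total = reread_cost + per_step_extras

print(transcript_total)    # 3000
print(orchestration_total) # 39000 -- ~13x the raw transcript
```

And this still excludes the continuous ASR and TTS inference running for the full duration of the call.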

…In customer support, a basic FAQ bot in 2023 might have consumed around 3,500 tokens for a ticket, better retrieval pushed that higher, then tool use and reasoning pushed it higher again, and now full voice support stacks are higher still. Coding follows the same pattern, just more violently: what used to be tens of thousands of tokens for a bounded coding task has become hundreds of thousands or even well over a million as agents became capable enough to handle real debugging, refactoring, and multi-file work. Each useful task now justifies much more inference than it did a year or two ago, because the model can actually finish the job.

This is a subtle version of Jevons paradox. The sticker price per token has actually been rising for frontier models, not falling. But the value per million tokens has gone up much faster: a frontier model today can complete a workflow in one coherent session that would have required dozens of brittle attempts a year ago, or simply could not have been done. Effective cost per useful outcome is dropping even as nominal cost per token climbs. And that dynamic is what opens up entirely new categories: complex insurance claims, broad code refactors, long-running research tasks, multi-step back-office processes. These were not meaningfully part of the inference market two years ago because the models could not stay coherent long enough to do them.
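The "effective cost per outcome drops even as cost per token climbs" claim is worth making concrete. The prices and token counts below are hypothetical round numbers for illustration, not actual model pricing:

```python
# Illustrative arithmetic: per-token price doubles, yet cost per completed
# workflow halves, because one coherent session replaces many brittle attempts.

old_price_per_m = 5.0    # $/million tokens, hypothetical older model
new_price_per_m = 10.0   # $/million tokens, hypothetical frontier model

old_attempts, old_tokens_each = 20, 60_000   # dozens of brittle retries
new_attempts, new_tokens_each = 1, 300_000   # one coherent session

old_cost = old_attempts * old_tokens_each * old_price_per_m / 1_000_000
new_cost = new_attempts * new_tokens_each * new_price_per_m / 1_000_000

print(old_cost)  # 6.0 dollars per completed workflow
print(new_cost)  # 3.0 dollars per completed workflow
```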

The aggregate numbers suggest this is already happening. OpenAI’s API is processing more than 15B tokens per minute as of April 2026, up from 6B half a year earlier. Google went from 9.7T tokens per month to 480T in a year, about 50x growth. OpenAI says reasoning token consumption per enterprise organization grew 320x year over year. Anthropic’s latest reported annualized revenue of $30B (up from $10B to start the year…) speaks for itself, especially given the main driver is Claude Code and their API…

…As models commoditize, the durable application companies will be the ones that see the real work: the tool calls, retries, escalations, corrections, and edge cases that never show up in a benchmark. That is where the system learns how a specific workflow actually runs, and where proprietary context starts to accumulate. Over time, the advantage is not just access to a model. It is knowing how this insurer handles claims, how this hospital works denials, how this codebase breaks, how this finance team closes. The apps that capture that messy operational data will be the ones that improve fastest and defend their position longest.

5. Nike and the Arithmetic of Durability – Andrew Chou

As of April 2026, Nike stock sat below US$45 – a market capitalisation of US$68 billion, its lowest level in over a decade, and a fall of more than 75% from the US$280 billion the company commanded at its 2021 peak.

How does what was once considered one of the widest consumer brand moats in the world, built over half a century, erode over the course of a few short years?

A good starting point is January 2020, when John Donahoe took over as Nike’s new CEO. The board wanted a digital-first operator, and Donahoe had the résumé – ServiceNow, eBay, and Bain – even if he was one of the few leaders in Nike’s history not to have risen through its operating ranks…

…Under Donahoe, Nike began systematically pulling back from these wholesale relationships. The logic was straightforward: move more volume through direct channels, control the brand experience, and capture more margin.

By September 2021, Nike had exited roughly half its retail partners. Big names like Foot Locker, Zappos, Dillard’s, and Big 5 Sporting Goods saw their allocation of the most sought-after models shrink in favour of Nike’s directly owned stores. Gross profit margins expanded immediately.

The vacated shelf space was quickly and eagerly filled by competitors: Adidas, New Balance, Puma, Hoka, On, Brooks, and Salomon all suddenly found themselves with prime real estate in the stores Nike had walked away from…

…That same model of deep, sport-specific immersion was eventually replicated across basketball, football, tennis, and dozens of other categories. Teams embedded in each discipline accumulated years of insight about athletes, usage patterns, and the fine distinctions that matter in performance products. This kind of expertise accumulates slowly—through proximity to athletes, coaches, biomechanics, and the subtle demands of each sport.

Under Donahoe, Nike restructured around a simpler model: Men’s, Women’s, and Kids. The rationale was familiar—less duplication, cleaner accountability, more consistency across segments—and the resulting redundancies left the org chart looking tidier on paper. Overhead expenses came down immediately.

What it also did was dissolve the sport-by-sport expertise and institutional knowledge accumulated over decades. Product lines that had once been shaped by deep category knowledge were now filtered through broader consumer-demographic lenses…

…Nike has long been famous for marketing that built meaning before it chased sales. The ability to turn a product into a cultural moment was arguably Nike’s most valuable and least replicable asset.

The Banned Air Jordan story is perhaps the purest illustration. In 1984, Michael Jordan wore black-and-red sneakers that violated the NBA’s uniform rules. The league threatened fines. Nike’s response was not to comply—it was to lean in. The company shot a television commercial showing the shoes blacked out by censorship bars, declaring that the league had thrown them out of the game but could not stop you from wearing them. That single ad helped sell 50,000 pairs almost immediately…

…Under the new model, marketing spend shifted from broad, culture-shaping storytelling into programmatic digital advertising designed to drive traffic to Nike’s own e-commerce channels. Performance marketing has direct, measurable KPIs – but by its nature, it harvests existing demand rather than creating it.

Anyone can pay for web traffic, but doing so does not build a competitive advantage. Just ask the direct-to-consumer startups built on performance marketing in the 2010s that failed to sell to a large incumbent with real distribution before the music stopped…

…Nike shares climbed from around $100 when Donahoe took over to an all-time high of $179 in November 2021 – a company valued at roughly $280 billion. The “transformation” was working.

But these gains came from somewhere. They were, in effect, the monetisation of business value painstakingly built over decades: the distribution footprint Knight and his team had cultivated since the 1960s; the product expertise and institutional knowledge that Bowerman’s culture had embedded across dozens of categories; the brand equity that campaigns like the Banned Air Jordan and Just Do It had compounded over generations.

Most business decisions sit on a spectrum between maximising long-term net present value and maximising short-term accounting profit. When the asset being spent is the moat itself, the spending does not show up as a cost. Each of Nike’s three shifts boosted reported profitability immediately and reduced the long-run NPV of the franchise meaningfully. The trajectory of the income statement and the moat moved in opposite directions – but only the income statement was visible quarter to quarter.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google). Holdings are subject to change at any time.

The View On Consumer Spending From The Largest Payments Companies (2026 Q1)

Mastercard and Visa can feel the pulse of consumer spending – what are they seeing now?

Mastercard (NYSE: MA) and Visa (NYSE: V) are two of the largest payments companies in the world. As a result, they have a great view of the consumer spending taking place around the world. With both companies reporting their earnings results for the first quarter of 2026 earlier this week, the bottom line is that consumer spending remains strong in the USA and other parts of the world, although there’s some near-term uncertainty because of the current conflict in the Middle East. Here’s what they are seeing.

*What’s shown in italics between the two horizontal lines below are quotes from Mastercard and Visa’s management teams that I picked up from their earnings conference calls.


From Mastercard

1. Mastercard’s management sees consumer and business spending, and the labour market, remaining healthy, although the economic backdrop is uncertain, driven by geopolitical tensions in the Middle East that have affected cross-border travel and global energy supply

Looking at the macro picture, the economic foundation remains generally supportive, with healthy underlying consumer and business spending. However, the backdrop remains uncertain, driven by geopolitical tensions, which has put some pressure on cross-border travel. Overall, labor markets continue to be balanced and wages are still outpacing inflation in most major markets…

…Despite elevated geopolitical risks, the macro economy has remained largely supportive, with healthy, underlying consumer spending and the fundamentals of our business remain strong. With that said, we are operating in a period of heightened uncertainty magnified by the ongoing conflict in the Middle East. Since the outbreak of the conflict at the end of February, we have seen restrictions on travel and a reduction in the world’s energy supply. And as I noted earlier, we are seeing impacts from that in our cross-border travel metrics.

2. Worldwide GDV (gross dollar volume) was up 7% year-on-year in constant-currency basis; cross-border volume was up 13% globally in constant-currency, driven by both travel and non-travel cross-border spending (cross-border volume growth was 14% in 2025 Q4); cross-border volume in 2026 Q1 was affected in March because of impacts on cross-border travel from the conflict in the Middle East; switched transactions was up 9% year-on-year; card growth was 5% in 2026 Q1, with Mastercard ending the quarter with 3.7 billion cards in circulation (there were 3.7 billion cards in 2025 Q4, and year-on-year growth was 6% then); on currency-neutral basis, domestic assessments were up 6%, cross-border assessments were up 18% and transaction processing assessments were up 15%

I’ll speak to the growth rates of our key volume drivers for the first quarter on a local currency basis. Worldwide gross dollar volume, or GDV, increased by 7% year over year. In the US, GDV increased by 4%, with credit growth of 8% and debit growth of 1%. Excluding the impacts from the migration of the Capital One debit portfolio, our US debit GDV growth would have been 7%…

…Outside of the US, volume increased 9% with credit growth of 9% and debit growth of 8%. Overall, cross-border volume increased 13% globally for the quarter, reflecting continued growth in both travel and non-travel related cross-border spending. As one would expect starting in March, we began to see some impact on cross-border travel from the conflict in the Middle East.

…Switched transactions grew 9% year-over-year in Q1. Excluding the impacts from the migration of the Capital One debit portfolio, our switched transaction growth would have been 10%…

…Card growth was 5%. Globally, there are 3.7 billion Mastercard and Maestro branded cards issued…

…All growth rates are described on a currency neutral basis, unless otherwise noted. Looking quickly at each key metric, domestic assessments were up 6% while worldwide GDV grew 7%. The difference is primarily driven by mix, partially offset by pricing. Cross-border assessments increased 18%, while cross-border volumes increased 13%. The five point difference is driven primarily by pricing in international markets. Transaction processing assessments were up 15%, while switch transactions grew 9%. The six ppt difference is primarily due to favorable mix and pricing, slightly offset by lower revenue from FX volatility.

3. In 2026 Q1, Mastercard’s operating metrics had good year-on-year growth and were stable sequentially; in April 2026 so far, Mastercard’s operating metrics continue to be strong with worldwide switched volume growth of 8% (5% in the USA, and 10% outside of the USA), switched transactions growth of 9%, and cross-border volume growth of 9%; cross-border travel volume declined sequentially in April 2026 from 2026 Q1 because of an acceleration in the impact of the Middle East conflict; in all, management continues to see healthy consumer and business spending

Let me comment on the operating metric trends for Q1 and the first 4 weeks of April. As we look across Q1 and April, growth rates of our operating metrics were impacted by timing of holidays, namely Ramadan and Easter. March would have seen the benefits from the timing, while February and April saw a negative impact. Looking at the Q1 operating metrics on a sequential basis, switched metrics were generally in line with Q4 and underlying spend remains stable. Of note, U.S. switched volume was flat sequentially as the strength in consumer and business spend offset the impact from the migration of Capital One’s debit portfolio in the quarter. Excluding Capital One, on a like-for-like basis, U.S. switched volume growth was over 1 ppt higher in Q1 as compared to Q4. Now on to switched transactions; excluding the migration of the Capital One debit growth — sorry, excluding the migration of Capital One debit, growth was generally in line with Q4.

Moving to our cross-border metrics. Our overall cross-border volume remains healthy with growth at 13% in the first quarter. Cross-border card-not-present ex-travel grew at 18% and remained strong. And the sequential decline in cross-border travel was due primarily to the conflict in the Middle East and portfolio shifts.

Now looking specifically at cross-border travel for the first 4 weeks of April, the sequential decline from Q1 is due to an acceleration of the impact of the conflict, the portfolio shifts and the negative impact from the timing I just mentioned. None of these factors relate to any fundamental change, and underlying consumer and business spend remains healthy.

From Visa

1. US payments volume growth was good at 8%, with e-commerce growing faster than physical spend, and it reflected resilience in consumer spending; there was good growth in both US credit and debit volumes; growth across consumer spend bands improved from 2025 Q4 (FY2026 Q1) with the highest spend band continuing to grow the fastest; both discretionary and non-discretionary spend remained strong; management did not see a deterioration in spend in the lower bands.

U.S. payments volume grew 8% year-over-year, up almost 1.5 points from Q1, reflecting resilience in consumer spending. E-commerce spend outpaced face-to-face spend. Both U.S. credit and debit demonstrated broad-based spend improvement, and we believe both were helped in part by higher tax refunds. Debit grew 7%, up almost 1 point from Q1 and credit grew 10%, up more than 2 points from Q1, with strong travel spend in both consumer and commercial.

Growth across consumer spend band saw incremental improvement from Q1 with the highest spend band continuing to grow the fastest. Across our volume, both discretionary and nondiscretionary spend remains strong. We do not see signs of the lower spend consumer weakening in our volumes.

2. Visa’s cross-border volume growth remained strong in 2026 Q1 (FY2026 Q2) at 11%, and was the same as in 2025 Q4 (FY2026 Q1)

Q2 total cross-border volume was up 11% year-over-year, consistent with Q1. Cross-border eCommerce volume was up 13%, 1 point above Q1. While crypto continued to be a slight drag, the improvement was primarily driven by U.S. inbound volume. Travel-related cross-border volume was up 10%, generally consistent with Q1, led by continued strength in commercial and improved U.S. inbound volume that generally offset the impact in the Middle East that was most pronounced in March.

3. Payments volume on Visa’s network continues to grow in April 2026, with US payments volume up 9%, cross-border volume up over 9%, e-commerce volume up 14%, and processed transactions up 8%; management is seeing near-term uncertainty in cross-border travel spend in the CEMEA (Central Europe, Middle East, and Africa) region because of the Middle East conflict

Now let’s look at drivers through April 21 with volume growth in constant dollars. U.S. payments volume was up 9%, with credit up 10%, and debit up 8% year-over-year. For constant dollar cross-border volume, excluding transactions within Europe, total volume grew 9% year-over-year with eCommerce up 14% and travel up 5%. The step down in travel from March was driven by both the impact from the Middle East conflict and Ramadan timing. When you normalize for Ramadan timing, the total April cross-border volume growth was in line with February levels. Processed transactions grew 8% year-over-year…

…The Middle East conflict has introduced some near-term uncertainty, in particular to cross-border travel spend in the CEMEA region.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I currently have a vested interest in Mastercard and Visa. Holdings are subject to change at any time.

What We’re Reading (Week Ending 26 April 2026)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 26 April 2026:

1. Pancreatic cancer mRNA vaccine shows lasting results in an early trial – Kaitlin Sullivan, Marina Kopf and Anne Thompson

Nine days later, Gustafson had surgery to remove the Stage 2 cancer from her pancreas. The day before she was supposed to start chemotherapy, her doctors told her about a clinical trial exploring the use of personalized messenger RNA vaccines for cancer. It was February 2020 — months before mRNA vaccines for Covid would become one of the world’s hottest commodities. Very soon after, Gustafson was the first person to get one for pancreatic cancer.

“It was a no-brainer,” Gustafson said of joining the trial. “I knew that statistically, the odds were against me.”

Less than 13% of people diagnosed with pancreatic cancer live for more than five years, making it one of the deadliest cancers. There is no routine screening for pancreatic cancer, such as colonoscopy or mammogram, and symptoms typically don’t show up until the disease is advanced. Once detected, there are few options for treatment. Only about 20% of cases are operable, which is currently required for someone to be eligible to join a pancreatic cancer vaccine trial…

…The vaccines work as a type of so-called immunotherapy, harnessing a person’s immune system to fight cancer cells. The goal is not to eliminate existing tumors, but instead to stamp out lingering, undetected cancer cells, and later any new cells that form before they can cause a recurrence.

…Pancreatic cancer is the poster child for these difficult-to-treat cancers, Balachandran said, and experts have long believed that people with pancreatic cancer could not generate an immune response against tumors. But after nine doses of the personalized vaccine, Gustafson is one of eight people in the 16-person Phase 1 trial who did just that, producing an army of immune cells called T cells that seek out and destroy tumor cells.

“This is one of the hardest cancers to generate any immune response, let alone such a potent one,” Balachandran said.

Balachandran and his team published the results of the Phase 1 clinical trial last year. At the time, the patients, all of whom had early-stage disease before they joined the trial, had only been tracked for just over three years, and it was unclear whether the immune response would last and lead to the patients living longer, he said. New data collected during the trial’s six-year follow-up period shows that it may.

Six years after treatment, Gustafson and six others who responded to the treatment are still alive, along with two of the eight people who did not respond. Two of the responders, including the one who died, had a cancer recurrence; Gustafson’s cancer has not come back.

“The most important finding here is that the people who mount a response to the vaccine live longer than those who do not,” said Dr. William Freed-Pastor, a physician-scientist at Dana-Farber Cancer Institute, who was not involved with the trial. He cautioned, however, that the results come from a very small group of patients. More research is still needed…

…Earlier research tested mRNA vaccines to treat people with advanced cancer, with disappointing results, “so we thought we didn’t have a vaccine that would work,” said Dr. Robert Vonderheide, the president-elect of the American Association for Cancer Research and director of the Abramson Cancer Center at the University of Pennsylvania.

In reality, newer research like this Phase 1 trial suggests the immunotherapy may work in less advanced cancer.

2. Brad Setser on the War in Iran and the Future of the US Dollar (Transcript Here) – Tracy Alloway, Joe Weisenthal, Brad Setser

Tracy Alloway: Why don’t we start with that historic analogy—the 1970s oil shock. Lots of ink is currently being spilled on whether or not that’s the correct parallel for our current crisis. In your view, how much does this particular oil shock resemble that of 50 years ago?

Brad Setser: There’s the obvious parallel in the sense that the 1970s oil shocks—’73 was a function of the Yom Kippur War and the Arab nations’ reactions to it. The second oil shock in 1979 was a function of the Iranian revolution. Same geographic region, but different in the sense that the US and Israel are the instigators, and different in that, so far at least, the magnitude of the shock is not at all comparable. It’s not at all comparable in price terms. In ’73 and then in ’79, oil doubled or tripled, and by the end of the decade oil had gone up six or seven times in dollar terms, less in real terms. We’ve only gone up maybe 50% max from spot oil for Brent and WTI and next month’s future. It’s a little higher for delivery in Asia, but we are not yet at the magnitude of the shock we saw in the 1970s.

The obvious point is that our economy as a whole—for the US and for the world—is a little less oil-dependent, but I wouldn’t push that too far. The main distinction is that we sort of started it—the US and Israel—and we in theory can end it, although we would only end it if Iran finds its own equilibrium that allows other countries’ oil to pass through the strait. At least so far, the market has not anticipated that this will need the same kind of jump in price to balance supply and demand. That could change. If you look at it in terms of physical interruption of the flow of oil, some of your guests have noted we’re similar, maybe even worse. So we’re in this weird world where the physical interruption is bigger but the price reaction is smaller.

Joe Weisenthal: I’m glad you brought this up. You talk to the commodity guys like we do, and they’re saying, “This is crazy, this is the biggest shock ever.” Guys like me—I’m an efficient-markets guy, I just see what’s on the screen, and it looks like it’s not that big of a deal. You be the third-party arbiter here. How do you make sense of the gap between what we see on our screen versus the shortfall in physical barrels—20 million every single day that aren’t coming to the market?

Brad Setser: It’s not quite 20. You’ve got the East—there’s been some rerouting. It’s somewhere between 10 and 15, which happens to be between 10 and 15% of global supply and between 20 and 30% of global traded oil. It is still a massive, massive shock, and my elasticities would imply a much bigger increase in price if that was a sustained, expected interruption.
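Setser's point about his elasticities can be made concrete with a back-of-envelope calculation. This is only a sketch: the -0.1 short-run demand elasticity below is an assumed ballpark for oil, not a figure from the interview.

```python
# Back-of-envelope: the price move needed to clear a sustained supply loss.
# Assumes a hypothetical short-run oil demand elasticity of -0.1; the real
# figure is uncertain and is NOT taken from the interview.
def implied_price_change_pct(supply_change_pct, demand_elasticity=-0.1):
    """%dQ = elasticity * %dP  =>  %dP = %dQ / elasticity."""
    return supply_change_pct / demand_elasticity

# A sustained 10-15% loss of global supply implies oil roughly doubling
# or more -- far beyond the ~50% move Setser describes in the market.
print(round(implied_price_change_pct(-10)))  # 100
print(round(implied_price_change_pct(-15)))  # 150
```

Under these assumptions, a sustained 10% supply loss implies something like a 100% price rise, which is why the observed ~50% move suggests the market does not expect the interruption to persist.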

You end up dealing with the reality that oil is close to being a perfectly fungible commodity, but it is not a perfectly fungible commodity. A North Atlantic barrel can only get to China or Japan with a long trek around the world, so there’s an extra shipping cost. A lot of the barrels in the North Atlantic are sweet and light—”light” is a measure of the weight of the oil, “sweet” means less sulfur. A lot of the refiners in Asia were set up to refine medium sour. For some things you want heavier grades of oil because you get more diesel out of the heavier grades. Refiners are configured for different grades of oil. When you interrupt the flow—fundamentally the flow from the Gulf countries to Asia—there’s no immediate, instantaneous substitution using barrels from the North Atlantic. That’s the first point.

The second point is that what people think of as traded oil is not actually oil for delivery tomorrow. It is the futures contract for the next month, and the month after that, the world could look completely different. The US has within its ability the capacity to pull back. If the US pulls back—and maybe the Iranians insist on a toll—there is no shortage of oil that could come out. It’ll take a little longer now because of the physical destruction of some of the export facilities in the Gulf, but if you don’t have this particular choke point strangled, the old global oil market was very well supplied. So the futures market has to balance between one possibility—that there is plenty of oil two or three months out and oil is on a trajectory, not immediately because of the damage, back to $60—and another possibility where this persists and oil is at $150 or above. The market’s had trouble figuring that one out…

…Joe Weisenthal: There are obviously differences, but how did the ’70s reshape the world? You had these oil shocks, and people then started talking about “petrodollars”—a word that came into existence. What kind of legacy did those shocks leave on the global financial system?…

…Brad Setser: Americans are very unhappy—if you remember, in the 1970s it was not good for President Carter. When the Iranian revolution came, there were the hostages, but the oil shock did not help his popularity. Americans in general are very unhappy when oil prices are high—it’s one of our national quirks.

In the short run at that time, there was a huge windfall into the Gulf states. The Gulf states piled up dollars, and they were dollars. Most oil—oil was priced in dollars before 1973. It didn’t take a deal to price oil in dollars. The US had been the biggest producer of oil in the 1930s. We were the supplier of oil to the Brits and others during World War II. It was only over the course of the 1950s and ’60s that other parts of the world caught up with US oil production, but the oil industry, in a deep sense, was born in the United States and was always priced in dollars. Saudi Aramco was originally a joint venture with an American company—or maybe even fully owned by an American company, I forget—so it was natural that it was priced in dollars. It wasn’t like in the 1970s you had to do a new deal to price oil in dollars rather than something else. Oil was in dollars.

Those dollars piled up, and it was a period of difficulty in the international monetary system. The US was going off the gold standard; Bretton Woods was breaking down; high inflation was not well contained after the first oil shock. There was an effort to convince the Saudis to keep their large stock of new petrodollars in dollars—not buy euros—and to use them at least in part to buy Treasuries. Even then, the Saudis were a little reluctant to visibly buy Treasuries. Some Bloomberg reporters several years ago went through this history, and the US started masking who was buying Treasuries at the request of the Saudis. The Saudis essentially said, “You guys are supporting Israel, we don’t really want to be seen buying your bonds directly, can you hide it?” And we agreed. Because there was still residual tension between the US and many parts of the Arab world, a lot of the dollars did not flow into the Treasury market. They flowed into bank accounts in London—effectively offshore eurodollars originating from petro-states. Those got recycled and lent in no small part to oil-importing emerging economies, and that is viewed as the start of the buildup of vulnerabilities that led to the Latin American debt crisis in the 1980s.

There’s another part of this whole story that I think people forget, which is sort of irritating me lately. After 1979–80, the Saudis had built up huge stocks of dollars—a great decade for the Saudis in the 1970s. In the ’80s, in order to keep prices high they had to cut production, and eventually that wasn’t enough and the oil price collapsed. By the end of the 1980s, and certainly by the middle of the 1990s, all the dollars that had been built up in the 1970s had been spent. The Saudi cumulative current account balance went back to basically being neutral or in deficit by ’95 and certainly by 2000. So in some sense the petrodollar boom came and it went. By the time of the Asian financial crisis, oil prices were very low—in the $20s—and there were no flows of petrodollars nor a very large stock of petrodollars. There’s sometimes a tendency to think the ’70s just continued and continued, but the reality is that, setting aside the really rich Emirates and Kuwait, the rest of the oil exporters were not in a position to continuously build up and save over most of the period after 1980 until the big run-up in oil from 2003 to 2014…

…Brad Setser: The last point, and this is just to be provocative because I’m tired of people blabbering about the dollar as the global reserve currency and how that’s the foundation of everything: an international large-cap equity portfolio will have a US share of roughly two-thirds—65 to 70%. The Saudi Public Investment Fund—my friend Alex Etra has done some work on it—its international portfolio has a dollar share of 80%, and that’s probably typical, because most private equity funds are going to be pretty dollar-heavy. A typical global reserve portfolio is now at 57% dollars. So the notion that reserves are the source of inflows into dollars is a bit dated. A reserve portfolio will typically have a lower dollar share than a standard return-seeking equities fund, which just because of the outperformance of US large caps will be more overweight dollars…

…Brad Setser: A quarter of global reserves, to a first approximation, are in China. China still manages its currency against the dollar, but China as a matter of policy brought its formally disclosed dollar reserve share down to 55%, from 79% in 2005. They did not like the optics of financing their strategic rival and holding a lot of Treasuries in visible ways. That’s a bit misleading, because the dollar share of the portfolios of the state banks—which now have a very large share of the total state portfolio—is much higher, around 70%. If you actually net out the offshore liabilities of the state banks and just look at the net, the euro offshore portfolio is matched by euro offshore liabilities. The dollar offshore portfolio is matched by dollars onshore. In a sense, the BOP flow through the state banks was, setting aside some of the CNY lending which has gone up, almost 100% dollars…

…Brad Setser: Now, we are in a world where an enormous share of the world’s financial wealth—both people looking for safety in reserve assets, people looking for a bit more yield than you can get out of a safe G10 government bond, the private credit/CLO world, and people wanting the equity home runs—all those investors globally are now quite overweight US assets. As a result, the dollar is quite strong. To me, the core question is not really whether geopolitics will change things, assuming we don’t get into a full-on blow-up with Europe, which would accelerate some shifts. The real question is: is this intense overweight in the dollar sustainable when we have fairly reckless policies? The answer so far has been yes.

3. Token Cost Conundrums – Abdullah Al-Rezwan

Each model has its own tokenizer that decides how many tokens your prompt becomes. Feed the exact same prompt to GPT-5.4 and Claude Opus 4.7, and Claude might slice it into 2–3x as many pieces. So even if the headline price were exactly the same, you’d pay 2–3x more for identical content…

…”We sent identical inputs through each provider’s official token counting API and normalized against OpenAI’s…

…”The differences are dramatic. On tool-heavy workloads, claude-opus-4-7 costs 5.3x more than gpt-5.4 even though their list prices are only 2x apart. The rankings also flip depending on what you’re sending: Gemini is the cheapest option for text and structured data, but becomes 46% more expensive than OpenAI on tool definitions.

The only way to know what you’re actually paying is to measure it.”…
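The arithmetic behind these numbers is simple to replicate. A minimal sketch, with made-up model names, token counts, and list prices (the 2.65x token ratio and 2x price gap below are assumptions chosen to reproduce a 5.3x effective gap like the one quoted):

```python
# The cost of a prompt is tokens x per-token price, so tokenizer differences
# compound with list-price differences. All numbers here are hypothetical.
def prompt_cost(tokens, usd_per_million_tokens):
    return tokens * usd_per_million_tokens / 1_000_000

same_text_tokens = {"model_a": 1_000, "model_b": 2_650}  # 2.65x the tokens
usd_per_mtok = {"model_a": 5.00, "model_b": 10.00}       # only 2x the price

cost_a = prompt_cost(same_text_tokens["model_a"], usd_per_mtok["model_a"])
cost_b = prompt_cost(same_text_tokens["model_b"], usd_per_mtok["model_b"])
print(round(cost_b / cost_a, 2))  # 5.3 -- the effective gap, not the 2x list gap
```

The point is that neither ratio alone tells you what you pay: the effective cost ratio is the token ratio multiplied by the price ratio, which is why measurement on your own prompts matters.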

…Similarly, after understanding these nuances, I think any enterprise would be really imprudent to standardize on just one model developer. This is because the customer loses bargaining power, a benchmark, and the ability to distinguish real quality differences from billing artifacts. If the seller controls both the meter and the service, and the buyer has no parallel benchmark, the buyer is highly likely to end up paying more over the long term. Even if the model developer isn’t sneakily charging you a higher price, without any benchmark, how will the customer press the model developer to lower their price, or even understand that they’re paying too high a price?…

…Nonetheless, the smart move does seem to be multi-model capability (even if 95% of volume goes to one vendor) plus internal benchmarks run on your actual prompts. That gives you the optionality to switch and, more importantly, the negotiating leverage to push back at contract renewal. Given this context, I believe it is exceptionally unlikely that enterprise AI will ever be dominated by one model developer. Anthropic may be dominating enterprise AI today, but OpenAI and Google will also likely have plenty of opportunities to gain further ground.

4. Elite law firm Sullivan & Cromwell admits to AI ‘hallucinations’ – Sujeet Indap and Kaye Wiggins

Sullivan & Cromwell told a US federal bankruptcy court that a major filing it made in a high-profile case contained multiple “hallucinations” made by AI software…

…The case in question revolves around S&C’s representation of liquidators appointed by legal authorities in the British Virgin Islands who are pursuing actions against Prince Group and its owner Chen Zhi.

US federal prosecutors last year charged Zhi with wire fraud and money laundering, accusing him of “directing Prince Group’s operation of forced-labour scam compounds across Cambodia . . . that stole billions of dollars from victims in the United States and around the world”.

In a separate action, US prosecutors also filed a civil forfeiture complaint seeking to seize nearly $9bn worth of bitcoin that the US authorities said represented the proceeds of the Prince Group crimes. Zhi was arrested earlier this year in Cambodia and extradited to China after a request from Beijing.

Prince Group is incorporated in the British Virgin Islands and the Chapter 15 proceeding in the US court system is designed to get the US government to formally recognise the powers of the BVI liquidators to represent creditors and victims in the US legal proceedings, liquidators told the court.

In multiple instances, S&C in the April 9 filing erroneously summarised the conclusions made in other cases, according to a list of strike-through corrections the firm submitted to the judge.

S&C has an enterprise licence for ChatGPT according to multiple people familiar with the firm’s operations. According to S&C’s website, at least five high-level partners have been assigned to the Prince Group bankruptcy case.

5. Anthropic’s Mythos Model Is Being Accessed by Unauthorized Users – Rachel Metz

A handful of users in a private online forum gained access to Mythos on the same day that Anthropic first announced a plan to release the model to a limited number of companies for testing purposes, said the person, who asked not to be named for fear of reprisal. The group has been using Mythos regularly since then, though not for cybersecurity purposes, said the person, who corroborated the account with screenshots and a live demonstration of the model.

Anthropic has said Mythos is capable of identifying and exploiting vulnerabilities “in every major operating system and every major web browser when directed by a user to do so.” As a result, the company has taken pains to ensure that the technology is only available to a select batch of software providers through an initiative called Project Glasswing, with the goal of allowing those firms to test and safeguard their own systems from potential cyberattacks…

…The users relied on a mix of tactics to get into Mythos. These included using access the person had as a worker at a third-party contractor for Anthropic and trying commonly used internet sleuthing tools often employed by cybersecurity researchers, the person said. The users are part of a private Discord channel that focuses on hunting for information about unreleased models, including by using bots to scour for details that Anthropic and others have posted on unsecured websites such as GitHub…

…The group is interested in playing around with new models, not wreaking havoc with them, the person said. The group has not run cybersecurity-related prompts on the Mythos model, the person said, preferring instead to try tasks like building simple websites in an attempt to avoid detection by Anthropic.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google). Holdings are subject to change at any time.

Some Signs of AI Froth

Companies are seeing their stock prices surge many-fold just by adding “AI” to their name.

The late French writer Jean-Baptiste Alphonse Karr coined the phrase “Plus ça change, plus c’est la même chose”, which translates to “the more things change, the more they stay the same.” This aptly describes the financial markets.

During the heady days of the Dotcom Bubble in the late 1990s, companies saw their stock prices surge simply by changing their name to include a reference to the internet. In a 2002 academic finance paper, A Rose.com by Any Other Name, Michael Cooper, Orlin Dimitrov, and Raghavendra Rau wrote (emphasis mine):

“We document a striking positive stock price reaction to the announcement of corporate name changes to Internet-related dotcom names. This “dotcom” effect produces cumulative abnormal returns on the order of 74 percent for the 10 days surrounding the announcement day.”

There have been recent rhymes in the stock market, but of the AI (artificial intelligence) variety.

On 15 April 2026, Allbirds announced a financing agreement along with changes in its business direction (laughably, from consumer footwear to providing compute for AI) and name (to NewBird AI). In response, its stock price surged 582% to US$17 on the day of the changes. NewBird AI’s stock price has since retreated to US$8, but it is still significantly higher than the pre-name-change price of less than US$3.

Later that same day, Myseum added “AI” to its name to highlight “the Company’s core technology platform that will integrate proprietary privacy-first artificial intelligence (AI) into its secure messaging and social media platforms.” The company’s stock price jumped 129% the next day to close at US$3.30; at the day’s peak of US$5.77, Myseum AI’s stock price was up 300%. The stock price is currently hovering near US$3.

The acclaimed investor Howard Marks has a great investing quote: “We may never know where we’re going, but we’d better have a good idea where we are.” And where we are right now, from what I see, is a bubbly place in AI-land.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 19 April 2026)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 19 April 2026:

1. A Bakery, a Fortress, and Three Fired Central Bankers – Thomas Chua

Between 1991 and 1995, Croatia fought for independence as Yugoslavia dissolved. At its core, it was a war between a Croatian state seeking independence and Serbia wanting all territories where Serbs lived to be under Serbian control. Serbs were roughly 12% of Croatia’s population, but backed by the Yugoslav army, they pushed for roughly one third of the land.

An estimated 250,000 to 300,000 Croats were expelled from their homes, their houses looted or destroyed…

…But of all the stories I heard across Croatia, the most impactful came from our guide in Trogir.

Her grandmother believed one of her sons (the tour guide’s uncle) had been killed in the war. Heartbroken, this woman, living in a rural village, took her entire life savings and set out to find her son’s body so she could bring him home for a proper burial.

She couldn’t find him.

Eventually, she walked into a bakery and asked if anyone had seen her son’s body. They said no. She placed all her life savings on the table and told them: this is yours if you can find my son’s body. Please let me know.

The people at the bakery refused the money and said they would help, but not for the money. The grandmother left it on the table regardless.

Months later, her son came home. Alive. With her life savings in his hand. The bakery had found him and passed the money back.

In the middle of a war where Croats and Serbs were killing each other, where homes were being bombed and families torn apart, the people at that bakery who helped this grieving mother find her son were Serbs.

Not everyone supports the war. There can still be kindness across enemy lines…

…The tour guides all shared something similar. The pain never fully goes away, even if their rational minds tell them to let bygones be bygones. But they all said the same thing about the next generation: the children don’t carry the same weight. And that gives them hope that pain from the war will heal…

…I sat down at a casual spot and ordered a kebab. Nothing fancy. The bill came to 300 lira. I checked the Google reviews for the same place, and photos from a few years back showed kebab prices around 25 to 35 lira. That’s not a typo. Prices here change so fast that some of the menus had white stickers plastered over the old prices, one layer on top of another. Some restaurants had just given up on the lira entirely and started quoting in euros instead.

Our tour guide shared how prices had spiralled out of control over the past few years, and how the government is almost certainly underreporting the real inflation rate. The official numbers are bad enough.

Turkey’s official annual inflation rate was around 20% in 2021. By October 2022, it had hit 85%. It’s come down since, to around 31% as of March 2026, but independent analysts believe the real numbers are significantly higher.

Meanwhile, the Turkish lira went from about 8 per US dollar in early 2021 to around 44 per dollar today. That’s over 80% of its value gone in five years…

…How did this happen? President Erdogan holds an unconventional economic belief: that high interest rates cause inflation, not the other way around. This is the opposite of mainstream economics, where central banks raise rates to cool an overheating economy. Erdogan has called himself an “enemy of interest rates” and has also cited Islamic beliefs against usury as part of his reasoning…

…Between 2019 and 2021, Erdogan fired three central bank governors in roughly two years. The most dramatic was in March 2021, when he sacked Governor Naci Agbal just two days after the bank hiked interest rates to 19% to curb inflation. Agbal had been on the job less than five months and had been winning investor confidence. His replacement did exactly what Erdogan wanted: slashed rates from 19% down to 14%. The lira lost 44% of its value in 2021 alone.

And they kept cutting. By late 2022, the central bank had pushed rates down to 9%, even as inflation was running above 80%. The lira went into freefall. Ordinary Turks watched their purchasing power evaporate.

After winning re-election in 2023, Erdogan quietly reversed course. A new economic team was brought in and interest rates were hiked aggressively, eventually reaching 50% by March 2024. It was an implicit admission that the previous policy had failed, though Erdogan has never said so publicly.

The lesson is straightforward: when the central bank loses its independence, the consequences are severe and they fall hardest on ordinary people. A president who fires central bankers for doing their job, who replaces them with loyalists willing to cut rates into the teeth of 80% inflation, isn’t just making a policy error. He’s destroying the institutional credibility that takes decades to build and years to repair.

2. The coming El Niño of 2026 – Michael Fritzell

But first, let me explain what El Niño is. It’s essentially a climate pattern that drives global temperatures to rise, leading to droughts across Asia and Africa.

In normal years, winds blow from the eastern Pacific Ocean near South America to the western Pacific Ocean near Asia. These winds push warm water towards Asia. In normal years, this warm water causes clouds to form and rain to fall in Asia.

And since the warm water moves away from South America, the remaining water close to South America tends to be cool.

The so-called El Niño weather cycle disrupts this pattern. Instead of winds moving west, the warm water stays in the middle of the Pacific, or even moves east.

This causes:

  • Less rainfall in Asia, leading to droughts in Australia, Southeast Asia and even parts of Africa
  • More rainfall in the Southern United States and South America, leading to flooding in those regions…
  • …The US National Oceanic and Atmospheric Administration gives a 61% chance of El Niño emerging by July 2026.
  • Roughly half of the team at the European Centre for Medium-Range Weather Forecasts expect temperatures in the main El Niño region in the Pacific Ocean to exceed 2.5 degrees Celsius above the seasonal average by October 2026, which would make it one of the most intense El Niños of the past century.

…First, droughts will negatively impact palm oil yields for Malaysian and Indonesian plantation companies, perhaps by as much as 10-20%. That’s how much output was impacted by the unusually strong El Niño of 1997…

…Droughts in Asia tend to reduce hydroelectric output, boosting the demand for coal in India and Indonesia. So coal prices could be heading higher, all else equal. And Indonesian coal miners stand to benefit…

…There have been a few instances, such as 2017, when key weather agencies forecasted an El Niño, yet none materialised.

However, I think there’s an asymmetry here, given that investors are not yet prepared for the potential of a super-El Niño, which could rival the one we saw in 1997.

3. China shock 2.0: the flood of high-tech goods that will change the world – Ryan McMorrow, Sam Fleming, Peter Foster, and Joe Leahy

Twenty years ago the global economy was shaken by a first “China shock” as a wave of low-cost goods destroyed the business models of manufacturers in advanced economies, displacing millions of workers and feeding discontent that fuelled populist politicians including US President Donald Trump.

Now a second shock is under way — one that is even more threatening to China’s trading partners: an assault on high-end manufacturing.

Vicious domestic competition, coupled with vast industrial scale, ample pools of engineering talent and some of the highest subsidies in the world, has generated world-beating Chinese champions in EVs, solar panels, batteries, wind turbines and a lengthening list of advanced manufacturing sectors…

…After racking up a record trade surplus in goods that surpassed $1tn in 2025, China boosted exports by nearly 15 per cent year on year in the first three months of 2026…

…BYD, the world’s largest EV maker, saw its average selling price per car fall from Rmb143,100 in 2021 to Rmb119,223 last year. Nio, one of China’s premium EV brands, has lowered the price of its flagship ES8 SUV by about 20 per cent since its 2018 debut, despite packing much more technology into the car.

Chief executive William Li says cutting costs has been a focus as they have redesigned the car. “For the first-generation ES8, the vehicle structure used 97.4 per cent aluminium, which was very expensive,” he says. “Today, we can achieve the same strength with less aluminium.”

Li adds that the group has brought the manufacture of components such as semiconductors in-house and localised the sourcing of parts such as the air suspension, which was once imported from Germany…

…“There is an ideological hardwiring at the top of the Chinese hierarchy to favour production over consumption,” says Daleep Singh, a former White House adviser under Joe Biden who is now chief global economist at PGIM, the asset management group.

“China will continue to rely on the rest of the world to absorb their excess production because the domestic political cost of empowering their own consumers is too high.”…

…The surge in Chinese exports in the first three months of 2026 was driven by shipments to the EU, up 21.1 per cent, and to south-east Asia, up 20.5 per cent year on year — even as exports to the US fell…

…A further, critical factor is the Chinese currency. Lower inflation relative to Chinese trading partners has led to a real exchange rate devaluation in the past three years, helping boost net exports and the current account surplus, which stood at 3.7 per cent of GDP last year.

The IMF estimates the country’s real effective exchange rate — which measures the real value of the currency against a basket of competitors — is undervalued by around 16 per cent, fuelling the competitive advantage enjoyed by Chinese exporters.

China has kept exports competitive by buying dollars and depreciating the currency, accumulating “shadow reserves” through a complex web of state-owned banks.

Then, crucially, there is Beijing’s industrial policy.

China has a ream of policies to help companies get off the ground, with local governments in particular battling with each other to offer the best subsidies, cheap land, financing and tax breaks to lure in manufacturers and seed new industries on their turf.

The competition between localities can be so great that some businesses move from one place to the next as they chase subsidies and investment. They have become known as “migratory bird enterprises”…

…The way the Chinese system works, local officials have every incentive to protect their companies.

Value added tax generates nearly 40 per cent of China’s tax revenue, and the central government splits the receipts with the localities where products are made, giving them a direct stake in keeping factories running.

Adding local production capacity also creates the growth that officials are largely judged on, and any large-scale lay-off could threaten social stability, Beijing’s overriding priority.

“Officials are scared of missing their GDP targets. Nobody is scared of overcapacity,” says another founder, who asks to remain unnamed. “As long as you’re manufacturing, there’s VAT revenue. Whether you sell [a product] or make a profit, that doesn’t really affect them.”…

…Recent OECD analysis underscores the role of subsidies. Company-level analysis of Chinese industry by the 38-member organisation estimates that Chinese businesses are subsidised at between three and nine times the rate of their rich-world counterparts.

As well as grants and tax breaks, the OECD data finds that the biggest subsidies come in the form of loans from Chinese state banks offering below-market rates to Chinese companies that undercut international competition.

While such dynamics have helped Chinese groups dominate globally, profits are vanishing. In the solar industry, overcapacity has led to vast losses, which China’s top six publicly traded solar groups indicated would cumulatively total Rmb43bn for 2025.

Yet the subsidies continue. One of those six companies, Jinko Solar, received Rmb1.3bn in subsidies in the first half of 2025 but still lost Rmb3bn in the period…

…As Chinese factories rushed into solar, production capacity skyrocketed. The country has the ability to manufacture 1,200GW of solar panels annually, roughly double the 647GW installed worldwide last year, according to the China Photovoltaic Industry Association and energy think-tank Ember.

“Why was it possible to build capacity exceeding global demand by double in such a short time?” asked Li Dongsheng, the chair of television and solar conglomerate TCL. “The key reason is the distortion of resource allocation and inappropriate local government participation,” he said in an interview with local media last month.

4. Corporate dark arts gone awry: how executive incentives can destroy shareholder value $NNBR $GME $HAIN – Andrew Walker

A comp scheme that could encourage management to destroy value to maximize their own payout.

Gamestop (GME) serves as a perfect example here. In January, they gave their CEO a huge option package: the CEO got >171m options struck at $20.66/share (the stock’s closing price). The options don’t expire for 10 years, and they only vest if the company hits certain market cap and EBITDA targets…

…You can certainly see the logic behind the award: GME’s market cap is <$10B, and their 2026 EBITDA was ~$345m. This comp package is encouraging massive market cap and EBITDA growth in order to even begin vesting.

Corporate governance ninjas can probably already see the issue with this package: it encourages any growth in market cap and EBITDA, not per share numbers. That incentive carries a host of issues. To take it to the most extreme: the CEO could easily hit all of his targets by issuing stock like a wild man in order to boost the company’s market cap. He could then take all of that cash and go on an acquisition spree in order to drive the company’s EBITDA up. It doesn’t matter whether the acquisitions create value for shareholders; if they boost EBITDA, they help from a vesting perspective…

…A comp package could actually disincentivize management from maximizing shareholder value.

Why does this one scare me? Because I’m so focused on incentives, and I’m always worried I’ll be lured into a situation where the incentives look positive but are actually insidious.

A live example will show this best: consider NNBR. In 2023, the stock was trading for just over $1/share, and they recruited a new CEO with a contract that would give him up to 2.5m shares if the stock price could hold $11/share…

…Fast forward to today, and things haven’t gone that well. The stock is back down to $1.50/share (though some early strength in the stock resulted in the $2 and $3 tranches vesting), and the company is reviewing strategic alternatives. Imagine you’re the CEO and had two choices right now: sell the whole company for $3/share, or max out the company’s credit line, head to Vegas, plop down at a roulette table, and bet it all on lucky #13.

If we ignored the fact that option #2 would result in some jail time, the CEO is actually incentivized to pursue that “lever up and risk it all” option. Why? Selling the company doesn’t help him vest more units, so he’s not super incentivized to pursue a sale (particularly because it puts him out of a job). In contrast, if he got lucky with the “lever up and risk it all” strategy all those PSUs would go in the money and he’d grab a multi-million dollar windfall…

…Someone highlighted COOK’s pay to me recently, and I’d be remiss if I didn’t mention it. COOK’s financial performance for 2025 missed all of their executive team’s performance goals, resulting in their stock declining >50% during the year and “no payments under the program to the Company’s named executive officers”…. but “the Board decided to award Jeremy Andrus, the Company’s Chief Executive Officer, and Michael Joseph (Joey) Hord, the Company’s Chief Financial Officer, discretionary cash bonuses equal to $956,250 and $270,938, respectively, due to their significant contributions to the Company in 2025 and to promote retention.” Well done guys; if I was a shareholder I know I’d be thrilled with that decision!

5. Letter to the 20-year-old investor – Chin Hui Leong

If you are closer to 20, you have an edge that no amount of money can buy. More on that later…

…I actually started investing much earlier, in 2002. Back then, there weren’t many choices. I bought the only unit trust available that tracked the US-based S&P 500…

…But when I bought my first individual stock in 2005, things changed. I actually felt more comfortable holding individual stocks than I did when holding index funds…

…Since 1928, the S&P 500 has fallen 10 per cent (or more) roughly every 1.8 years and 20 per cent every five years or so. When that happens, if you’re watching the index too closely, you’ll be upset.

You’ll start looking for reasons why it declined; my advice is don’t.

The S&P 500 is made up of 500 stocks…

…Trying to figure out why all 500 – or even 30 – stocks fell at once is too much work…

…When I held individual stocks, whenever a stock price fell, I could look at how much cash the company had. I could check whether its products were still selling. I could see whether it was generating profits and free cash flow…

…Between 2005 and 2010, the S&P 500 peaked in 2007, only to fall spectacularly during the global financial crisis. While the US market recovered starting from 2009, the index ended 2010 roughly where it started five years earlier…

…During what was rated as one of the deepest recessions in 70 years, I started noticing that certain companies were thriving.

The companies included Apple, Amazon, Booking Holdings, and Netflix. They were among the 25 stocks I bought and held for a decade or more…

…Through it all, there were benefits I did not expect. I had a window into the future. I knew that online streaming was coming before it happened. I knew that same-day delivery was possible back in 2009…

…Amazon is up 39 times from when I bought it in 2010. Netflix has grown over 313 times. Booking Holdings is up 21 times. And Apple, which people thought had saturated its market a decade ago, is up 26 times…

…If you started investing at 20, or even earlier, time is on your side.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Amazon, Apple, and Netflix. Holdings are subject to change at any time.

What The USA’s Largest Bank Thinks About The State Of The Country’s Economy In Q1 2026

Insights from JPMorgan Chase’s management on the health of American consumers and businesses in the first quarter of 2026.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Because of this status, it is naturally able to feel the pulse of the country’s economy. The bank’s latest earnings conference call – for the first quarter of 2026 – was held earlier this week and contained useful insights on the state of American consumers and businesses. The bottom-line is this: the US economy remains resilient, but the risks are growing in complexity.

What’s shown between the two horizontal lines below are quotes from JPMorgan’s management team that I picked up from the call.


1. The US economy remained resilient in 2026 Q1; consumers and businesses continue to spend; there are multiple tailwinds supporting the economy’s resilience, but the risks to the economy are growing in complexity; consumer spending growth in 2026 Q1 is faster compared to a year ago; energy is just 3% of the typical consumer’s expenditure, so they are not significantly affected by higher energy prices; the strength of the American consumer is the result of a strong labour market, so if the labour market were to weaken for any reason, the American consumer will also weaken

The U.S. economy remained resilient in the quarter, with consumers still earning and spending and businesses still healthy. Several tailwinds are supporting this resiliency, including increased fiscal stimulus, the benefits of deregulation, AI-driven capital investment and the Fed’s asset purchases. At the same time, there is an increasingly complex set of risks— such as geopolitical tensions and wars, energy price volatility, trade uncertainty, large global fiscal deficits and elevated asset prices. While we cannot predict how these risks and uncertainties will ultimately play out, they are significant and they reinforce why we prepare the Firm for a wide range of environments…

…Notwithstanding the recent volatility in market and gas prices based on our data, consumers and small businesses remain resilient with consumer spend growth continuing above last year’s pace…

…[Question] How resilient is consumer spend and credit if energy prices remain high? And are there any signs of cracks that you’re seeing at all?

[Answer] There really is not anything new or interesting to say this quarter. We’ve looked at it through every angle. Early roll rates, delinquency rates, cash buffer, spend, discretionary spend, non-discretionary spend, it all looks consistent with prior trends and fundamentally, healthy. So let me add maybe just a little bit of nuance in the context of energy prices and what’s going on this quarter. So I think gas or energy cost is something like 3% of the typical consumer’s expenditure, at least in our portfolio. So it’s not nothing, but it’s not overwhelming. We’ve looked to see if there’s kind of evidence in there of people trading, decreasing other discretionary spending to adjust for higher gas prices, but it’s just kind of not enough yet to be visible.

I would caution, though, I think it remains fundamentally the case that the biggest single reason that the consumer credit performance is healthy is that the labor market is strong. And if you get bad outcomes in the Middle East, much higher energy prices or other problems that sort of do eventually track what has been, I think, from many people’s perspective, a surprisingly resilient American economy and a very resilient U.S. consumer, and that winds up having knock-on effects on the labor market, then you will see that come through, clearly. But right now, in the end, the story remains the same, which is resilient consumer that’s doing fine despite higher gas prices.

2. Net charge-offs for the whole bank (effectively bad loans that JPMorgan can’t recover) were flat at US$2.3 billion compared to a year ago

Credit costs of $2.5 billion with net charge-offs of $2.3 billion and a net reserve build of $191 million.

3. Management thinks there have been no recent changes in real-world systemic risk

It’s important to understand that under the current rule, the surcharges for almost all of the G-SIB banks are scheduled to increase meaningfully over the next 2 years, simply as a result of recent growth in the system despite, in our view, no change in real-world systemic risk.

4. Mortgage loan originations had strong growth in 2026 Q1, driven by refinancing of mortgages 

In Home Lending, originations of $13.7 billion increased 46% year-on-year predominantly driven by refi performance.

5. JPMorgan’s investment banking fees were up 28% in 2026 Q1 from a year ago because of strong performance in mergers & acquisitions (M&A) and equity underwriting; management sees a strong pipeline for capital markets activities, barring significant deterioration in the ongoing Middle Eastern conflict; the sentiment of companies for capital markets activities has been surprisingly resilient

IB fees were up 28% year-on-year, driven by strong performance across M&A and equity underwriting, partially offset by lower debt underwriting. Looking ahead, client engagement and pipelines remain healthy, but of course, developments in the Middle East could have an impact on deal execution and timing…

…On the question of overall sentiment on the pipeline, I would describe it as resilient, maybe surprisingly resilient, given everything that’s going on. But I also think the time lines in the Middle East are kind of quite short. There are deadlines or negotiations. I think it’s reasonable for people to kind of proceed with their plans in the hope or maybe expectation that we get relatively quick resolutions. But if things start getting derailed, I would be surprised if you don’t see some impact on sentiment and on deal decision-making. But for right now, it seems quite resilient.

6. Management continues to expect credit card net charge-offs for 2026 to be 3.4% (was around 3.3% in 2025); management expects JPMorgan’s credit card loans to grow 6% in 2026

The adjusted expense outlook continues to be about $105 billion and the Card net charge-off rate continues to be approximately 3.4%…

…What we said about Card loan growth expectations at Company Update, which is that we said we expected 6% or maybe a little bit more, and that hasn’t really changed. That’s still kind of our core expectation. 

7. Management thinks that there will not be systemic issues for banks even if the private credit industry experiences a default cycle because the private credit industry is still relatively small compared to the overall loans market banks are participating in; the credit quality within private credit portfolios has not gotten much worse

[Question] do you think that if we do have a default cycle in private credit, that it will be systemic?

[Answer] Private credit leverage lending is like $1.7 trillion, high-yield bonds are something like $1.7 trillion, bank syndicated leveraged loans are like $1.7 trillion, investment-grade debt is $13 trillion, mortgage debt is like $13 trillion, and there’s a lot of other stuff out there. And I pointed out that I think there’s been some weakening in underwriting and not just by private credit elsewhere. And there will be a credit cycle one day. And I think when there’s a credit cycle, losses will be worse than people expect relative to the scenario. I don’t think it’s systemic. It almost can’t be systemic at that size relative to anything else. But when recessions happen and values go down and people refi at higher rates, there will be stress and strain in the system. Are people prepared for that? I can’t speak for other banks, but these are — most of these things are on top of — you have to have very large losses in private credit before at least it looks like banks are going to get hit or something like that. So it doesn’t mean you won’t feel some stress and strain, and that you might have to do something about it, but I’m not particularly worried about it…

…We always had what we call marking rights to look at the underlying collateral, and that’s just a right that protects you and gives you certain rights, things like that. Obviously, if you ever see credit getting worse, and it’s gotten not terribly worse, the actual credit which a lot of these private equity — private credit guys are pointing out, the actual credit hasn’t gotten that much worse. There are pockets where it has. And credit spreads themselves haven’t gotten much worse in general, but there are pockets where it has. So we’ll be watching it closely. We think we’re okay on all of that.

8. Management thinks corporate and consumer debt are not too high, whereas government debt is high

Corporations in general, the debt is not too high. Consumers, in general, the debt is not too high. Most of the excess debt is in government debt at this point. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 12 April 2026)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 12 April 2026:

1. America’s AI Build-Out Hinges on Chinese Electrical Parts – Emily Forgash and Akshat Rathi

Almost half of the US data centers planned for this year are expected to be delayed or canceled. One big reason is the shortage of electrical equipment, such as transformers, switchgear and batteries. They are needed not just for powering AI, but also for building out the grid that is seeing increased consumption from electric cars and heat pumps. US manufacturing capacity for these devices cannot keep up with demand, and the scarcity has caused data center builders to rely on imports…

…Data centers consuming as much as 12 gigawatts of power are supposed to come online in 2026 in the US, according to analysts at market intelligence firm Sightline Climate, who will be releasing a new report in the coming weeks. However, only a third of that is currently under construction, Sightline estimates…

…Electrical infrastructure adds up to less than 10% of the total cost of the data center, but it’s impossible to build the operation without it. “If one piece of your supply chain is delayed, then your whole project can’t deliver,” says Andrew Likens, Crusoe’s energy and infrastructure lead. “It is a pretty wild puzzle at the moment.”…

…Though few companies are eager to talk about it, the US has been outsourcing its manufacturing to other countries, primarily China, for decades. That has contributed to a significant shortage of electrical components in the US, says WoodMac’s Boucher…

…While most of the US’s transformers come from Canada, Mexico and South Korea, US utilities imported more than 8,000 high-power transformers in 2025 through October from China, up from fewer than 1,500 imported in all of 2022, estimates WoodMac’s Boucher. This build-out “is going to be highly dependent on the import market,” he says.

Once transformers lower the voltage of electricity so it can be used in data centers, it then needs to be distributed across the data center safely. That’s done through switchgear, which includes circuit breakers and fuses. There too, data center developers are seeing delivery delays – though not as extreme as the timelines for transformers.

Equinix Inc.’s solution is to commit at least $350 million to support Hanley Energy’s new manufacturing facility in Ireland, which will make switchgear and other data center components. Equinix expects to achieve 10% to 15% faster lead times as a result.

Crusoe’s answer to that shortfall has been to pre-order lots of the equipment. That means spending many millions of dollars on supplies before the company even knows it has an order to fill, but it’s proved a winning strategy. Recently, Crusoe also began manufacturing their own switchgear…

…The share of US imports of transformers and switchgear from China has declined steadily in recent years – although for specific types of equipment that share is still hovering around 30%. The Chinese share of battery import volumes remains stubbornly above 40%.

China dominates the supply of electrical equipment because it controls so many parts of the supply chain, from materials to processing to manufacturing, and the gulf between China and the US is set to widen. In its new five-year plan, the Asian giant revealed last month that it will double down on building out its grid with renewables, while the Trump administration has dismantled policies to deploy solar and wind power.

2. “Founder Mode” Complacency – Abdullah Al-Rezwan

When DeepMind was plotting to extricate themselves from Alphabet almost a decade ago, Pichai was prescient enough to foresee AI’s paramount importance in their core business…

…As these negotiations became more tense over time, all the big guns of Alphabet planned to meet to resolve the issue at hand. Alas, some big guns didn’t seem to appreciate what was at stake. From the book:

When the two sides met again, the conversation underscored the gulf between them. Hassabis and Suleyman argued that DeepMind did not fit under Google’s umbrella: Its mission was AGI, not consumer‑internet products. Pichai objected that AI was central to his vision for Google, and that he would not allow his scientific bench to be depleted. Hassabis had hoped that Larry Page would weigh in on his side and push the Alphabet plan to a conclusion. But Page showed up for the meeting two hours late, and Sergey Brin was even later. Their version of what later came to be known as “founder mode” was that they were nowhere to be found, disproving the Silicon Valley mantra that founders deserve the right to control their companies indefinitely. With Page and Brin effectively checked out, Pichai was the man DeepMind had to deal with.

I have been thinking about the aforementioned excerpt for the last couple of days. If you glanced at my portfolio, it’s not difficult to see that I drank my fair share of the “founder mode” kool-aid. Perhaps fittingly, the “founder mode” propaganda originated from a founder himself: Brian Chesky. The more I ruminated over “founder mode”, the more I came to the conclusion that there is a glaring aspect missing from the “founder mode” mantra: complacency.

It is telling that Chesky proudly recalls, every chance he gets, how he figured out during Covid that Airbnb doesn’t need to do search advertising; as an investor I was actually a bit alarmed that he was running Airbnb pre-pandemic without paying close attention to whether his advertising dollars were being deployed with appropriate ROAS guardrails. I can guarantee you that despite operating in “Manager Mode”, Glenn Fogel at Booking was looking at advertising ROI with a microscope, and he certainly didn’t need a global pandemic to remind him how to deploy his precious advertising dollars at Booking.

3. A token is not a fixed unit of cost – Anjali Shrivastava

We only consider token count as the static linear meter because we inherited the logic from inference APIs. But, a token does not represent a fixed unit of work.

This is obvious to anyone who works in inference, but if you’re used to calculating compute budgets based on linear API rates, it takes a second to sink in.

The intuition is grounded in the autoregressive nature of the transformer: Attention is quadratic with respect to current context size…

…In layman’s terms, the language model is looking at every previous token in the context window before generating a new token, which means inference APIs are linearly pricing fresh tokens whose compute cost scales quadratically.

The scaling law for compute is likely not purely quadratic, given optimizations like caching and compacting context. But no matter what, the underlying compute cost per token grows with context length. The Nth token in a conversation is an order of magnitude more expensive than the first.
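A toy calculation makes the gap concrete. The unit costs here are made up, and the compute model deliberately ignores caching and other optimizations the author mentions, treating each new token as attending over the full context:

```python
def api_price(n_tokens, price_per_token=1.0):
    # What the meter charges: every token costs the same flat rate.
    return n_tokens * price_per_token

def attention_compute(n_tokens):
    # Rough compute model: generating token i attends over the i tokens
    # already in context, so total work ~ sum(i) ~ n^2 / 2.
    return sum(i for i in range(1, n_tokens + 1))

for n in (100, 1_000, 10_000):
    # Compute consumed per billed token grows roughly like n / 2,
    # so long conversations are dramatically underpriced per token.
    print(n, attention_compute(n) / api_price(n))
```

Under this model, a 10,000-token context consumes about 100× more compute per billed token than a 100-token context, which is the wedge between linear pricing and quadratic cost the article describes.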

There are signs that per-token pricing breaks down at scale: both Anthropic and Google charge different rates based on prompt length…

…Traditional SaaS has variable costs too (like hosting, customer support and third-party service costs). But these costs follow the law of large numbers, and are normally distributed at scale. You can set a single subscription price that covers this average cost, plus a comfortable margin to absorb tail risk.

In the case of AI software, it is likely that these variable costs are fat tailed. The law of large numbers assumes finite mean and i.i.d. samples, but AI software has at least one dimension with infinite first moment and non-stationary tails. The sample mean keeps wandering instead of converging…
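A quick simulation illustrates the contrast (the distributions and parameters are my choices, not the author’s): a Pareto distribution with tail index 1 has an infinite mean, so its running average keeps wandering, while a thin-tailed Gaussian’s running average converges.

```python
import random

random.seed(0)

def running_mean(draws):
    # Cumulative average after each draw.
    total, means = 0.0, []
    for i, x in enumerate(draws, 1):
        total += x
        means.append(total / i)
    return means

n = 100_000
# Thin-tailed costs: Gaussian around 10. The sample mean converges to 10.
thin = running_mean(random.gauss(10, 2) for _ in range(n))
# Fat-tailed costs: Pareto with alpha = 1 (infinite first moment).
# The sample mean jumps every time a rare, huge draw lands.
fat = running_mean(random.paretovariate(1.0) for _ in range(n))

print(round(thin[-1], 3))                        # close to 10
print(round(fat[n // 2], 3), round(fat[-1], 3))  # no stable level to price against
```

For the thin-tailed stream, any per-seat price a little above 10 is safe; for the fat-tailed stream there is no stable average cost to price against, which is the pricing problem the article is describing.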

…Margin collapse is the first and most obvious symptom of the problem. Cursor’s repricing exposed poor margins, and we also learned that Replit’s margins are volatile. And there is ample evidence that Anthropic is losing money on its subscriptions.

Each layer of the aggregate cost curve is highly variable, and the more you scale, the higher the probability that these tail risks can compound…

…Subscriptions misprice intelligence, and much of the industry recognizes this, but now we can rigorously explain why.

Traditional SaaS pricing mirrors the physics of stable software, but AI introduces high variance that breaks each of these laws…

…High variance in costs necessarily constrains demand; today, the constraints are reactive.

To safely cushion from unbounded costs, a business model must price in the variance or be well above the true cost on average. Ideally by anchoring price to value delivered instead of token cost; but value delivered also happens to be highly variable and subjective. At the same time, there’s structure to value: reliability, relevance, actionability.

The key insight is that margin squeeze and resource misallocation are two sides of the same problem. Solving one side of the equation should solve the other. If you can measure the value delivered, you can price that instead of raw compute. And if you can price outcomes in terms of value delivered, you can budget the exact amount of compute and data that maximizes profit on each task.

So the layer that owns the meter also decides how much compute and data to deploy and keeps the spread between cost and price. Today that meter sits inside the model; tomorrow it could sit inside an orchestrator that plans the whole workflow.

4. Why You Should Wait Out AI’s Super-Spending False Start – Merryn Somerset Webb

The second part, the data on which all LLMs are trained, is not. Its supply is limited. Up to ChatGPT4, the internet provided enough data for each new iteration to be better. But that version was completed a few years ago, trained on the lot. There is little more for new models to train on.

The data on the internet might have expanded over the last few years, but not in a particularly helpful way. Much of it has been produced by other LLMs: train your new model on that and you might end up degrading it. Why? Because LLMs are horribly prone to errors (confabulations or hallucinations), which means they can’t give us what we most need from them: accuracy.

An LLM is not a continuous learning machine. Its knowledge stops with its training. It also isn’t deterministic (like, say, a calculator), says AI expert Janusz Marecki (who I interviewed for a podcast this week). It knows nothing with certainty. It simply “rolls the dice” on what the next word in a series should be, giving you its best guess. The answer you get is an approximation, not a series of facts. Worse, the more complicated the task in hand, the more the errors compound. Possibly even worse, the LLM can’t tell you how likely it is that there are errors. How would it know?

These problems aren’t going to go away. They are irredeemable systemic flaws in the product.

5. Switzerland – Europe’s overlooked activist opportunity – Swen Lorenz

Switzerland is famously conservative and generally averse to outsiders telling it what to do.

This is also reflected in its corporate landscape.

Even though the country is broadly open to foreign investment, there have long been numerous mechanisms allowing companies to keep outside influence under tight control.

Some Swiss companies require shareholders to be registered by name, with board approval needed for new registrations. This has led to cases where outsiders were refused registration – and “outsiders” can even include Swiss citizens from a different region.

Other companies cap voting rights per shareholder or maintain super-voting shares that remain tightly held by local incumbents…

…The 2023 reform of Swiss corporate law wasn’t widely noticed, not least because attention was focused on events in Ukraine and the aftermath of the pandemic.

Until then, a shareholder needed to represent 10% of share capital to add an agenda item for a vote at the annual general meeting.

For publicly listed companies, this threshold has now been reduced to just 0.5% – a far more attainable level.

Similarly, a shareholder with 5% can now requisition a shareholders’ meeting, compared to 10% previously.

Just as importantly, the broader acceptance of active shareholders has evolved…

…Finanz und Wirtschaft, Switzerland’s leading German-language business daily, carries significant influence among corporate executives. In an article published on 18 September 2025, the paper noted how “activist investors are transforming from bogeyman to catalyst”…

…Patrick Fournier is an active investor based in Zug. We met several years ago at his family home to discuss our shared interest in frontier markets.

Today, his focus has shifted closer to home.

He allowed me to share the following:

“We have progressively sold all our portfolio of foreign shares and are now focusing on Swiss small & mid cap. We see huge value opportunities on this segment. We intend to become a little ‘activist’ as it is now possible with only 0.5% of capital in a listed company (far lower than the previous 10%) to add some proposition at the agenda of the annual general meeting of shareholders. This will wake up the Board of several companies, including regarding the dividend (payout) policy. As a result, we are in front of a ‘rerating’ (multiple expansion) of this segment.”…

…BVZ held its annual general meeting on 8 April 2026, and the results were telling.

Some 287 shareholders attended, representing 110,328 out of 197,278 shares outstanding (with one shareholder alone holding 56,000 shares). Alarick’s proposal to increase the dividend from CHF 18 to CHF 50 received 14.5% support and was rejected by 83.8%. As a result, the board’s proposal to raise the dividend from CHF 16 to CHF 18 was approved. With earnings per share of CHF 151, this implies a payout ratio below 20%. The proposal to initiate a share buyback programme received 16.67% support and was rejected by 82%.

What may sound like a defeat is, in fact, the equivalent of an earthquake. In Switzerland’s highly consensus-driven corporate culture, such a level of shareholder dissent represents a clear wake-up call for management.

The market agreed. On the day of the meeting, the share price closed at an all-time high of CHF 1,550, up 67% over the past 12 months.

As the recent share price performance suggests, even raising one’s voice in a constructive manner can create value for shareholders in Swiss companies.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet. Holdings are subject to change at any time.

What The CEO of The USA’s Largest Bank Thinks About The World Today

Jamie Dimon’s latest excellent letter.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Its CEO, Jamie Dimon, is known for writing lengthy annual shareholder letters in which he pontificates about the state of the world and the banking industry. 

His latest letter was released earlier this week. I read it in earnest, taking extensive notes that I thought would be useful to share. So here they are! (The italicised passages between the two horizontal lines below are direct quotes from Dimon’s letter.)


1. Asset prices look fully valued

And we also continue to buy back enough stock so as not to increase total excess capital, though we have a number of options on how to deploy our capital and are clear-eyed that many asset prices, including bank stocks, are fully valued.

2. Current banking regulations have had some good effects, but have also made the banking system weaker by creating risks, including the creation of conditions that led to the Silicon Valley Bank crisis in 2023, and the risk of moral hazard in bank runs

A properly regulated banking system helps reduce risk to the financial system, protect customers, and maximize productive use of capital and lending. The Dodd-Frank Wall Street Reform and Consumer Protection Act and some of the rules that followed that legislation accomplished some good things. At the same time, they also created a fragmented, slow-moving system with expensive, overlapping and excessive rules and regulations — some of which made the financial system weaker and reduced productive lending…

…Here are some of the negative consequences partially due to poor bank regulations.

  • Because capital requirements on banks are much higher than the market gives to private entities, insurance companies or even foreign banks, huge arbitrage is created. This is often a sign of potential risk.
  • Regulators wrongly incorporated an accounting concept called “held to maturity” (HTM) into the capital rules, thereby giving Treasury and mortgage securities better capital treatment because the holder has promised not to sell them. This had many negative consequences — it allowed banks to not recognize mark-to-market losses on those securities in their regulatory capital, and in some cases, it falsely increased returns on those securities (because the amount of regulatory capital needed to be held against them was significantly smaller). This inadvertently encouraged banks to take on more interest rate risk, which was the ultimate trigger for the failure of Silicon Valley Bank (SVB) and First Republic Bank (FRB).
  • The Fed’s Comprehensive Capital Analysis and Review (CCAR) stress test, as currently constructed, produces results that are far worse, in our strongly held opinion, than what our actual results would be under those severely adverse conditions. The process is flawed, including reliance on inaccurate models and assumptions and the fact that it tests only one type of crisis, so other scenarios are overlooked (e.g., rapid rises in interest rates, as in the case of SVB and FRB). Testing should use accurate numbers and assumptions — then the results are what they are — rather than being driven by predetermined “what-ifs.” More transparency and sound methodology would lead to continuous improvement, not gaming the system. Essentially, we do not use CCAR to manage risk — we look at far more scenarios and need to be prepared for all of them. We also look at these risks every week, not just once a year…

…One of the huge risks for a bank has always been a “run on the bank,” which occurs when people think that their uninsured deposits are at risk. The FDIC only covers insured deposits, and the run risk is driven by uninsured deposits, particularly nonoperating uninsured deposits. In recent bank failures, regulators have had to invoke the systemic risk exception (SRE) to protect uninsured deposits at the point of failure. That is a problem — no one should want this as an emergency mechanism. It creates moral hazard, and the process to invoke the SRE is chaotic and involves multiple agencies, including approval by the Treasury Secretary in consultation with the President. Bank runs can happen quickly, and relying on that type of action to avoid contagion is simply not a good idea.

3. Real banking risks are always about credit, liquidity, interest rate, and operations

The real risks almost always end up being credit, liquidity, interest rate or operational risk.

4. Ideas to reduce actual risks in the banking industry through regulations include placing limits on the amount of HTM (held-to-maturity) assets banks can hold, requiring banks to have more liquidity to pay off uninsured deposits, and putting limits on the percentage loss on uninsured deposits in the event of a bank failure

Here are some ideas that I believe would not only significantly reduce the chances that the SRE would need to be invoked but would also make the system safer and avoid moral hazard.

  • I would limit the amount of HTM securities in a way that links to the total long-term debt that the bank must have available to absorb losses upon its failure. And while this is a judgment call, banks need to realize that when available-for-sale and HTM security losses start to exceed 50% of tangible equity, investors will get worried…
  • …Prior to failure — between the Fed window and the rather quick sale or financing of securities or other assets — banks should be in a position where they have enough liquidity to pay off more than 50% of uninsured, nonoperating deposits. Regulators floated a similar idea in 2024, and I agree with them. This plan, plus the fact that equity and long-term debt will absorb losses before uninsured deposits are at risk, would give customers far greater peace of mind.
  • We should also consider simply setting, upfront, a statutory cap on the percentage loss on uninsured deposits in the event of failure — say, at 5%. This would reduce moral hazard and create an additional buffer for the FDIC to achieve a smooth resolution without using the SRE. With this plan, a small portion of the uninsured deposits would be immediately available to cover losses and communicated to depositors in “peacetime” while the bulk of uninsured deposits would be protected in a resolution. Although some might argue that a mechanism like this might increase the risk of a bank run, I think if the percentage is well-chosen, it might actually be stabilizing by eliminating the uninsured depositor’s nightmare scenario of losing all their money. In the end, all debates about the best way to proceed revolve around how much shareholders, creditors and uninsured depositors of the failing bank should pay and how much healthy banks should pay. As I already said, it has never been the taxpayer. And perhaps capping the maximum loss on uninsured deposits upfront would put an end to ad hoc involvement by the government once and for all.

5. AI is transformational, will have tremendous positive impacts on society, and is not a speculative bubble, but it will also create serious new risks, including job displacement, and will have second- and third-order effects

The importance of AI is real — and while I hesitate to use the word transformational — it is. The pace of adoption will likely be far faster than prior technological transformations, like electricity or the internet. Those took decades to roll out, but this implementation looks likely to accelerate over the next few years. Our Chief Operating Officer describes our efforts in more detail, but I want to make some key points here.

  • We will not put our heads in the sand. We will deploy AI, as we deploy all technology, to do a better job for our customers (and employees).
  • AI will affect virtually every function, application and process in the company. And in the long run, it will have a huge positive impact on productivity. I do not think it is an exaggeration to say that AI will cure some cancers, create new composites and reduce accidental deaths, among other positive outcomes. It will eventually reduce the workweek in the developed world. And people will live longer and safer.
  • We do not yet know exactly how AI will unfold. The landscape will change rapidly, with shifting assumptions about power consumption, costs, chip technologies and the speed at which data centers are deployed. There will be a wide variety of AI models — open and closed, large and small — and no single tool will dominate. Overall, the investment in AI is not a speculative bubble; rather, it will deliver significant benefits. However, at this time, we cannot predict the ultimate winners and losers in AI-related industries.
  • AI is a genuine technological shift that will impact many sectors, including physical industries and scientific research. AI is only beginning to be applied broadly in science, and its influence will continue to expand.
  • AI will also introduce serious new risks — from deepfakes and misinformation to cybersecurity vulnerabilities. These risks are real, but they are manageable if companies, regulators and governments prepare. The worst mistakes we can make are predictable: overreact at the first serious incident and regulate out important innovation or underreact and fail to learn from what went wrong. The right approach requires rigorous preparation in advance, an honest assessment when things go wrong — and they will — and discipline to fix what’s broken without destroying what works.
  • AI will definitely eliminate some jobs, while it enhances others. Our firm will have definitive plans on how we can support and redeploy our affected workforce.
  • AI will create many jobs — some we can see today in cybersecurity and AI itself, and some we can’t see. But we do know that there is a huge workforce shortage for many well-paying white- and blue-collar jobs.
  • There is a possibility that AI deployment will move faster than workforce adaptation to new job creation. In prior technological transformations, labor had time to adjust and retrain. We do believe that business and government can do many things to properly incent retraining, income assistance, reskilling, early retirement and relocation for those whose job might be adversely impacted by AI (I talk about some of these ideas in Section IV around work skills training and the Earned Income Tax Credit).

One last but important point: We have focused on some of the “known and predictable” and some of the “known unknown” events. But huge technological shifts like AI always have second- and third-order effects as well that can deeply impact society. Some of these are, for example, cars bringing about the development of suburbs and shopping malls; agriculture enabling cities; and the original internet (invented back in 1969) leading to mobile phones, apps and social media. We should be monitoring for this kind of transformation, too.

6. Small teams are required to execute with speed

It’s essential to organize in small teams for super speed.

The real competitive battles are fought at the detailed segment level: It’s not just investment banking or the investment banking healthcare sector; it’s having the right team to win in healthcare pharma or medical devices. It’s not just credit card or even affluent brands; it’s the Chase Sapphire® card. It’s not small business clients in branches; it’s restaurateurs or law firms. It’s not digital payments; it’s 24/7 digital payments with automatic currency conversions. It’s hundreds of small teams (including technology, AI, marketing, subject matter experts and others) attacking specific problems. The teams needed to tackle these challenges should be small and authorized with the decision-making ability to move and act like Navy SEALs or the Army’s Delta Force. Finally, they need to be dedicated to the task at hand. Very often when a management team wants to accomplish something new, like create a digital account opening process that cuts across virtually every area, everyone on the team says, “We’ll get it done,” meaning they will add it to the long list of tasks already on their plate. But when efforts are 1% of a lot of people’s jobs, it will never get done. You need a team 100% dedicated to the mission — and everyone else supports them.

7. The global and US economy is very different today compared to 20 years ago in terms of (1) the importance of energy, (2) the size of the financial markets, (3) the composition of the players in the financial markets, (4) the size of investment portfolios, (5) the composition of holders of US Treasuries, and (6) central bank activity

It’s helpful to recognize that the world’s economy is far larger and more diversified and far less reliant on energy as an input versus 20 years ago. Global energy consumption to the global gross domestic product (GDP) is only about 40% of what it was around 45 years ago, say in the early 1980s, and the United States, instead of being a major importer on a net basis, is now a major exporter…

…If you look at the tables below, there are a few items that are truly different now from what they were in 2010, and these may well lead to different and unexpected outcomes. To name a few: The global debt and equity markets are far bigger than before (as are global deficits). Many nonbank financial institutions and investors are dramatically bigger than they were in the past (think hedge funds, private equity funds, sovereign wealth funds, among others). Global foreign portfolio investments are far bigger than before, and a large stock of U.S. Treasuries owned by foreigners is not held by central banks (central banks are less likely to make dramatic changes in their holdings of U.S. Treasuries). In addition, global QE is far bigger than it ever was before. A change in sentiment could easily affect the global flow of investments into securities, including U.S. Treasuries. You can also see that brokerage inventories are far smaller as a percentage of investments than ever before and, as a result, market makers are less able to intermediate in extremely volatile markets.

8. The US remains the world’s best investment destination; the US must continue to be the premier military force globally, maintain its economic position, and manage its foreign economic affairs, in order to remain strong

It is also good to remember that the United States remains the world’s best investment destination, particularly when things are going badly…

…There are three critical issues that will ultimately determine the health and safety of the United States and possibly determine the future direction and strength of the free and democratic world. JPMorganChase and its employees — like all other businesses and individuals — will be deeply affected over time by how the United States succeeds in these areas:

  1. The United States must maintain the premier military force in the world.
  2. The United States must maintain its preeminent economic position in the world, which also requires reigniting the American Dream.
  3. The United States must manage its foreign economic affairs to strengthen the U.S. economy and that of our critical allies so that the first two points remain true.

9. Inflation is a risk to the US and global economy in 2026; other large risks to watch include (1) Russia’s ongoing war with Ukraine, and the US and Israel’s ongoing war with Iran, (2) high sovereign deficits and debt, (3) high asset prices and low credit spreads, (4) new trade arrangements, (5) the relationship between the US and China, (6) private credit, (7) lengthy holding periods of private equity investments, and (8) cybersecurity; losses on leveraged lending could be higher than most expect when a credit cycle happens

The skunk at the party — and it could happen in 2026 — would be inflation slowly going up, as opposed to slowly going down. This alone could cause interest rates to rise and asset prices to drop. Interest rates are like gravity to almost all asset prices. And falling asset prices at one point can change sentiment rapidly and cause a flight to cash…

…I think some of the larger risks are much like tectonic plates, always moving and periodically causing earthquakes and volcanoes when they crash into each other. Some of the larger risks we should keep our eyes on are:

  • First and foremost, geopolitics. Russia’s war in Ukraine and its ongoing sabotage in Europe and now the war in Iran and its potential effects on energy prices can cause events that are unpredictable. We all hope these wars get properly resolved. But war is the realm of uncertainty, as each side in a war determines what it wants to do (as is often said, “the enemy gets a vote”), and these conflicts involve many countries. Not only do they have a major impact on the nations at war, but they also have an impact on countries and economies across the globe that are not directly involved in war. Nations that are heavily dependent upon imported energy are already seeing the effects. And it’s not just energy, it’s commodity products that are byproducts of oil and gas, like fertilizer and helium. And given our complex global supply chains, countries are experiencing disruptions in shipbuilding, food and farming, among others. The outcome of current geopolitical events may very well be the defining factor in how the future global economic order unfolds — then again, it may not.
  • High global sovereign deficits and debt. Global deficits are significantly elevated, particularly during what has been a relatively healthy global economy and, until recently, a time of peace — the deficit globally is at an extremely high 5%, while global sovereign debt is at all-time highs. The current forecast from the Congressional Budget Office has our debt-to-GDP ratio going from 100% today to 120% in 2036. High government debt is somewhat offset by low consumer debt, which was nearly 100% of GDP in 2007 and is now below 70%. Similarly, corporate debt is at a fairly normal healthy level of 45%. High and increasing government debt will eventually have to be dealt with — the right way would be to deal with it now before it becomes a problem; the wrong way would be to let it become a crisis, which, in my opinion, is probably the likely outcome. Importantly, almost 60% of government spending is for entitlements and is not discretionary. This makes the job that much harder. A crucial note on the importance of growth: If interest rates went down 100 basis points and GDP grew at 3%, the debt-to-GDP ratio could actually start to go down instead of going up.
  • High asset prices and very low credit spreads. In and of itself, this is not a bad thing. Household net worth as a percentage of GDP is now 560%. The high during the housing peak in 2006 was 460%. But this also means that anything less than positive outcomes could have a dramatic impact on global markets. Rapidly decreasing asset prices can sometimes create a self-reinforcing loop. It’s always good to remember that prices are set by the marginal buyers and sellers — which, on the average day, is only a small fraction of asset owners. And it’s also good to remember that foreigners own almost $30 trillion of U.S. equities and bonds. While U.S. investments and the U.S. dollar are generally havens of security in a troubled world, that didn’t stop recessions and bad markets in prior times.
  • Trade 2.0. The U.S. tariffs themselves had only minor effects on inflation or growth, and were only one straw on the camel’s back. But the trade battles are clearly not over, and it should be expected that many nations are analyzing how and with whom they should create trade arrangements. This is causing a realignment of economic relations in the world. While some of this is necessary for national security and resiliency, which are paramount, it is hard to figure out what the long-term effects will be.
  • U.S. and China relations. This relationship is critical to the whole world and is also impacted by the events mentioned above. The United States and China clearly have different systems, values, goals and objectives, and while both sides are currently engaging, we have to expect that there will be some bumps in the road — maybe even some large ones. We should all hope that ongoing proper engagement continues to lead to what may be a competitive but peaceful future.
  • Private credit and credit in general. The leveraged private credit market totals $1.8 trillion. As a comparison, the U.S. high yield bond market totals $1.5 trillion, and the bank syndicated leveraged loan market totals $1.7 trillion. Taking a wider view, the total market size of investment grade bonds is $13 trillion. And the total market value of all residential mortgage securities and loans is also $13 trillion. In the great scheme of things, private credit probably does not present a systemic risk.

    I do believe that when we have a credit cycle, which will happen one day, losses on all leveraged lending in general will be higher than expected, relative to the environment. This is because credit standards have been modestly weakening pretty much across the board; i.e., more aggressive and positive assumptions about future performance (called add-backs), weaker covenants, more use of PIK (payment-in-kind; not paying interest in cash but accruing it), more aggressive private ratings (particularly in insurance companies) and more arbitrage (not always a great sign). Also, by and large, private credit does not tend to have great transparency or rigorous valuation “marks” of their loans — this increases the chance that people will sell if they think the environment will get worse — even if actual realized losses barely change. Additionally, actual losses right now are already a little higher than they should be, relative to the environment. Finally, if rates or credit spreads ever go up, the companies that borrowed will have to borrow at even higher rates, putting them under even greater stress. However this plays out, it should be expected that at some point insurance regulators will insist on more rigorous ratings or markdowns, which will likely lead to demands for more capital.

    It has always been true that not everyone providing credit is necessarily good at it. There are many players who are late to this game, and it should be expected that some credit providers will do a far worse job than others. We have not had a credit recession in a long time, and it seems that some people assume it will never happen.

    Additionally, anything that gets sold to retail investors as opposed to institutional investors requires greater transparency, higher standards and fewer potential conflicts. If anything ever goes wrong, you should assume that retail investors, even though they were told about some of the risks, will seek remedy in the courts. Also, some of these loans go into various funds run by the asset management company. Generally, each of these funds has its own objectives and its own fiduciary responsibility to make sure that the loans are suitable for that specific fund. Those who do not do this properly are likely to get into trouble.
  • Private markets. With stock markets at all-time highs in recent months, it is a little surprising that private equity firms, which own close to 13,000 companies, have not taken greater advantage of healthy markets to take their companies public. Private equity investments are now held for an average of seven years — this is virtually double what it used to be. And some are sold, not to another company or taken public, and put in a new fund called a continuation fund. We have generally had nothing but a bull market since the great financial crisis — it’s hard to imagine what will happen if and when we have an extended bear market.
  • Cyber risk. I have to mention this because it remains one of our biggest risks, and this is probably true for many other major industries and corporations. AI will almost surely make this risk worse. We invest significantly to protect ourselves and stay vigilant.
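The remark in the sovereign-debt bullet above — that a 100-basis-point drop in rates plus 3% growth could flip the debt-to-GDP trajectory — follows from the standard debt-dynamics identity. A minimal sketch, with the caveat that the interest rate, growth rate, and primary-deficit inputs below are my illustrative assumptions, not figures from the letter:

```python
# Standard debt-dynamics identity (a sketch; all parameters are illustrative):
#   d_{t+1} = d_t * (1 + r) / (1 + g) + pd
# where d = debt/GDP, r = average interest rate on the debt,
# g = GDP growth rate, pd = primary deficit as a share of GDP.

def next_debt_ratio(d, r, g, pd):
    return d * (1 + r) / (1 + g) + pd

d0 = 1.00   # debt at 100% of GDP, as in the letter
pd = 0.005  # assumed 0.5% primary deficit (illustrative, chosen for contrast)

# Baseline: assumed 3.3% average rate on the debt, 2% growth -> ratio drifts up.
base = next_debt_ratio(d0, r=0.033, g=0.02, pd=pd)

# Letter's scenario: rates down 100bp, growth at 3% -> ratio starts falling.
scenario = next_debt_ratio(d0, r=0.023, g=0.03, pd=pd)

print(f"baseline: {base:.4f}, scenario: {scenario:.4f}")
```

The sign of `r - g` relative to the primary deficit is what decides the direction; with a larger primary deficit than the 0.5% assumed here, the scenario ratio would still rise.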

10. There are a number of things that will have a positive impact on the US economy in 2026, and they are (1) the One Big Beautiful Bill, (2) purchases of securities by the Federal Reserve, (3) less restrictive regulations, and (4) AI-related capital spending

While there are many larger risks, as discussed in the next section, that may or may not impact the economy in 2026, we do know several things that will have a positive impact on the economy in the remainder of this year. They are:

  • Increasing fiscal stimulus from the One Big Beautiful Bill. Our economists believe this will inject another $300 billion (effectively 1% of GDP) into the economy. This has to be very modestly inflationary this year.
  • Benefits from the Fed’s purchase of $40 billion of additional securities each month, which is supposed to be reduced to $20 billion–$25 billion this April. At a minimum, this supports asset prices and helps ensure there is no liquidity squeeze in the financial system.
  • Positive effects of comprehensive deregulatory policies. This was badly needed and long overdue. Change is clearly evident in bank regulations that will free up capital and liquidity, which can be lent out (and we already see this happening), and in deregulation across many other industries, from energy to home building. It is fair to say that actions taken have clearly increased confidence and animal spirits. This should add to productivity and be modestly deflationary this year.
  • Huge increase in AI-driven capital spending and construction by the five hyperscalers. In 2025, this number was $450 billion, and in 2026, it will be approximately $725 billion. While AI will clearly drive productivity, which is generally good for inflation in the long run, all of this spending is probably inflationary in the short run.

Some of the items above have mild inflationary effects, while others probably have some deflationary effects.

11. The US has come together before to overcome incredible challenges

We have met big challenges before. At one point in 1940, only one nation, the United Kingdom, stood against the Nazi war machine, which had already conquered most of Western Europe. The United States was unprepared for what was going to happen but rose to the challenge. You may find it uplifting to read the book Freedom’s Forge, which shows how the United States came together to build the arsenal of freedom and to keep the world safe for democracy.

12. The US has become too dependent on unreliable sources for its national security needs

The United States has also allowed itself to become too dependent on unreliable sources for items that are essential to our national security, such as critical minerals, semiconductors and advanced manufacturing output, among others. We have maintained insufficient productive capabilities to be ready to quickly increase production if necessary. And our military needs to be able to rapidly develop new and often cheaper weapons, like drones.

13. The US could have grown even faster over the past 20 years than what it actually did

Over the last 20 years or so, U.S. GDP has averaged about 2% annually — I believe we could have easily achieved at least 3% growth. The reason we were able to grow 2% is that America’s businesses and entrepreneurial spirit allowed us to overcome a lot of the roadblocks mentioned later in this section. That 1% difference would have had an enormous impact, providing Americans with an extra $20,000 GDP per person annually, giving us resources to take care of nearly all our problems and jump-starting deficit reduction. Growth is part of the solution to almost all of our problems…
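A quick back-of-envelope check of the compounding claim above. The $48,000 starting figure is my rough assumption for US GDP per person circa 2005, not a number from the letter; the resulting gap lands in the same ballpark as (though somewhat below) the letter's $20,000 figure:

```python
# Sketch: what one extra percentage point of growth compounds to over 20 years.
# The $48,000 starting GDP per person is an illustrative assumption.
start = 48_000
years = 20

path_2pct = start * 1.02 ** years  # actual ~2% growth path
path_3pct = start * 1.03 ** years  # hypothetical 3% growth path
gap = path_3pct - path_2pct

print(f"2% path: ${path_2pct:,.0f}  3% path: ${path_3pct:,.0f}  gap: ${gap:,.0f}")
```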

…I am going to mention a few damaging policies, not in detail because I’ve written about them in the past, but if they aren’t corrected, real progress may be impossible.

  • Fraud, waste and abuse…
  • …Inefficiencies within the federal government (and within state and local governments, too)…
  • …Mortgage and regulatory policies and local housing requirements….
  • …Red and “blue” tape, permitting reforms and a little litigation reform… 
  • …Policy uncertainty…
  • …Unreliable R&D policies…
  • …Failure to recognize that capital formation drives growth.

14. Supportive policies for capital formation in Sweden and Australia have led to great results

In Sweden, an investment savings account is available that simplifies the investing process with favorable tax treatment. Account holders can deposit and withdraw funds at any time, and there is no capital gains tax — just an annual tax of 1% on the balance. This has dramatically increased investment by retail investors into the Swedish stock market. It may surprise some of our readers that Sweden’s policies have created a growing and innovative stock market and that Sweden has more unicorns and billionaires per person than America does. Another example is Australia, which has a wonderful retirement policy based on superannuation, a savings account funded by both employer and employee contributions.
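To see why the Swedish structure described above can be attractive, here is a sketch comparing the two regimes over a 20-year holding period. The 1% annual balance tax is the figure from the passage; the 7% annual return and 30% exit capital-gains rate are my illustrative assumptions:

```python
# Sketch: 1% annual tax on the balance vs. 30% capital-gains tax at exit.
# Return and capital-gains rate are illustrative assumptions.
YEARS, RETURN = 20, 0.07
start = 100_000

# Swedish-style account: compounds each year, then ~1% of the balance
# is taxed away; no further tax on exit.
isk = start
for _ in range(YEARS):
    isk *= (1 + RETURN) * (1 - 0.01)

# Conventional regime: untaxed compounding, then 30% tax on the gain at exit.
gross = start * (1 + RETURN) ** YEARS
cgt = gross - 0.30 * (gross - start)

print(f"balance-tax account: ${isk:,.0f}  capital-gains account: ${cgt:,.0f}")
```

Under these assumptions the flat balance tax comes out ahead; with a low enough return, the comparison reverses, since 1% of the balance is owed even in flat or down years.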

15. The private sector should be the one allocating capital, not the government

Industrial policy mechanisms, when used, should be as targeted and as simple as possible. They come in many guises: grants, cheap loans, equity investing, purchase agreements and others. The cleanest of these is tax credits in various forms. Whatever the policy, two rules should not be violated: (1) there should be no social engineering — this is not a jobs program (the Jones Act meant to preserve jobs in the Merchant Marine has basically destroyed our Merchant Marine and merchant ship building business) and (2) for the most part, the market should allocate capital, not the government. Industrial policy can easily devolve into a buffet where corporate America gorges at the expense of the taxpayer. While there are certain circumstances that require the government to allocate capital (think infrastructure and national security), generally the government is simply not good at allocating capital in a free market. America does best not with central planning but with consistent and clear policies that are conducive to growth.

16. Europe is currently on a very bad path of decline and fragmentation; Europe’s defense industrial base is not in good shape

I believe we are staring one in the face: the slow but constant decline and fragmentation of Europe. Europe is entering a decisive decade, and it is unable to act. The EU was an extraordinary accomplishment — nations coming together and using political and peaceful means to settle differences. And this after a millennium of terrible wars. It worked, but it only went halfway. Europe never finished the economic union (see the Draghi report), which meant that European countries constantly underperformed economically. This has led to their GDP relative to the United States going from 90% in the year 2000 to approximately 70% today. This fragmentation remains a structural drag on competitiveness. As former European Central Bank President Mario Draghi has noted, internal EU market barriers function like “hard tariffs” of approximately 45% for manufacturing and 110% for services. Those barriers reflect not a failure of ambition but rather a failure of integration. This has led to a lack of scale for their major businesses and a lack of mobility for both capital and people.

EU nations also created whole new layers of bureaucracy that reduced innovation, growth and investment among other things. This will continue unless European leaders dramatically change course. If they don’t, they will eventually be unable to afford their social safety nets, restrengthen their nations’ militaries and grow their economies. The EU is currently home to world-class companies, deep pools of savings and a talented workforce. But without new EU direction, their major global companies will weaken, faced with very strong American and Chinese competition. The ultimate loser in all this will be Europe and all its citizens — and it will hurt the United States as well.

Europe and America are each other’s largest trade partners at $2 trillion a year…

…Yet Europe’s defense industrial base is still not fit for purpose. This is as much an economic and industrial challenge as a military one. The continent needs enduring production capacity, coordinated procurement and dual-use manufacturing that serves both commercial and defense sectors.

17. Strong leadership by the US is still required for global prosperity

Strong American leadership is required – there is no real alternative.

Some political leaders have said that there is a “rupture” between America and the Western world — that the red lines have been crossed and there is no return to the prior system. I completely disagree. There is no practical replacement to the prior system. It has not ruptured, but it needs reform. The middle-sized nations do not have real alternatives in terms of building a unified military or a unified economy that can compete effectively with the United States and China. If these middle nations did, the result would look a lot like what Europe is today: dysfunctional. The only practical alternative is to fix the current situation.

The United States and Europe have an extraordinary number of commonalities, including values deeply held. For more than 75 years since the end of World War II, the United States and Europe have worked together to resolve most major global economic or military challenges and in fighting terrorism and nuclear proliferation. We need this cooperation for the next 75 years.

I do not want to contemplate the opposite. Without American leadership, there would be a huge vacuum. If not us, who? We are the only country that has the capability to do it. Fragmented relationships with and among our extensive allies could lead to an “every nation for themselves” mentality. America would become more isolated, the U.S. dollar would no longer be the world’s reserve currency and autocratic nations would rejoice. Need I say more?


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 05 April 2026)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We regularly share a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 05 April 2026:

1. Energy’s Moment – Abdullah Al-Rezwan

I had this naive assumption that since the US has now become a net oil exporter and China remains heavily dependent on imported oil, any oil shock would hurt China far more than it would the US.

So, it was surprising for me when I noticed that China was actually ahead of the US in terms of “total insulation factor” when it comes to global oil & gas shocks. The “total insulation factor” indicates the share of a country’s useful final energy that is less exposed to global oil and gas shocks. JPM calculated it by adding together a country’s reliance on four specific energy sources: domestically produced gas, domestically produced coal, nuclear power, renewables (such as biofuel, hydro, wind, solar, and biomass). China has a total insulation factor of 76%, while the US has a total insulation factor of 70%. China scores higher primarily because of its massive reliance on domestic coal (54% of useful final energy), which accounts for a larger share of its energy mix than the US’s primary domestic buffer: natural gas (44.5% of useful final energy).

Even though China is the world’s largest oil importer, oil imports make up just 13% of China’s primary energy consumption. Even when you combine all oil consumption with imported gas, the total accounts for only 20% of China’s primary energy.
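JPM's "total insulation factor" is just the sum of four energy-share components, which is easy to see in a minimal sketch. Note the hedging here: only China's 54% coal share, the US's 44.5% gas share, and the 76%/70% totals come from the excerpt; the other component values below are illustrative placeholders, not JPM's actual data.

```python
# "Total insulation factor": share of a country's useful final energy that is
# less exposed to global oil & gas shocks, per the JPM methodology described
# above -- the sum of four insulated energy sources.

def insulation_factor(domestic_gas, domestic_coal, nuclear, renewables):
    """Sum the four insulated shares (each as % of useful final energy)."""
    return domestic_gas + domestic_coal + nuclear + renewables

# China: the 54% coal share and the 76% total are from the excerpt; the
# gas/nuclear/renewables split is an illustrative assumption chosen so the
# components reach the reported total.
china = insulation_factor(domestic_gas=4.0, domestic_coal=54.0,
                          nuclear=3.0, renewables=15.0)
print(china)  # 76.0
```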

2. A Sinister Raise, a Bitter Press Release, and Five Other Weird SEC Filings – Andrew Walker

EMPD is a digital treasury company focused on Bitcoin. Like most digital treasury companies, they’ve traded for a discount to NAV basically this entire year; for most of the past month, they’ve been trading around 80-90% of NAV (you can see a real time NAV calc here).

Towards the end of March, the company announced a $25m equity raise. The raise was priced at a premium to both NAV (it priced at 103% of NAV) and the market price (again, the company trades for <90% of NAV); on top of that premium raise, the company noted they’d continued to buy back stock at a discount to NAV. Read that sentence again: a company trading at a discount is buying back shares and somehow raising capital at a premium to NAV? Nirvana for shareholders, right!?!?!

Au contraire, mon frère!

EMPD didn’t just issue stock in the equity raise; for every share they issued, they gave the buyer a four-year warrant struck at $6.27/share (~20% above NAV). Those warrants have enormous value; BTC currently trades with ~50 vol. EMPD is a levered BTC treasury company, so it should have even more volatility than BTC (EMPD’s option chain is extremely thin but points to volatility well over 100). ChatGPT tells me that a four-year warrant that’s 40% out of the money with 100 vol is worth ~65% of the spot price…. for EMPD, that means each warrant was worth ~$2.90/share. So, yes, the headline number EMPD raised at was $5.39/share, but if you adjust for the value of the warrant, EMPD raised money at an effective price of ~$2.50/share while the stock was trading at ~$4.50/share. An absolutely awful trade.
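The warrant math above can be sanity-checked with a plain Black-Scholes call price. This is our sketch, not the author's model: it ignores warrant dilution, assumes a zero risk-free rate and no dividends, and normalizes spot to 1 with the strike 40% out of the money.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, vol, years, rate=0.0):
    """Black-Scholes European call price (no dividends)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * years) / (vol * sqrt(years))
    d2 = d1 - vol * sqrt(years)
    return spot * norm_cdf(d1) - strike * exp(-rate * years) * norm_cdf(d2)

# A four-year call, 40% out of the money, at 100 vol, as a fraction of spot:
value = bs_call(spot=1.0, strike=1.4, vol=1.0, years=4.0)
print(round(value, 2))  # 0.63 -- in the same ballpark as the ~65% quoted
```

With a modestly positive risk-free rate the value creeps closer to the ~65% figure in the excerpt, so the back-of-envelope warrant value (and hence the ~$2.50/share effective raise price) checks out.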

Why would EMPD raise money like this? Well, I’m not in the board room, so I can’t tell you with absolute certainty…. but I’d suggest it’s likely a board entrenchment maneuver. EMPD is currently in a rarely seen double proxy fight where two separate shareholder groups are trying to replace the board with a more shareholder-friendly group. EMPD’s press release announcing the raise notes the raise was bought by a “current institutional investor” in EMPD; my guess is EMPD went to a big shareholder and said “hey, agree to keep the current board in place, and we’ll give you a big slug of stock to vote and toss in a ton of warrants to make the whole thing worth your while.” …

…Today, BNTX has just shy of $20B in net cash, and while the COVID franchise is obviously dwindling the company has a ton of promising other drugs / readouts coming over the next few years….

Perhaps those readouts work, perhaps they don’t. I have no idea! But it’s a pretty promising set up…. which is why it’s so wild that BNTX announced that their co-Founders / top executives were leaving BNTX to start a “next-generation mRNA innovations” company. What’s even crazier is that BNTX will be contributing their mRNA assets to the new company!

Why is this so crazy? It’s absolutely ripe for conflicts of interest! BNTX could have spun out the mRNA assets to all their shareholders and put their founders / exec team in control of the new company. That would have been a fair and equitable way to do a start up. Instead, it seems BNTX will contribute the mRNA assets to the new company in exchange for a piece of that company. How are the mRNA assets going to get valued in that transaction? Given the founders / CEOs are going to the new company, it’s not hard to see how they might want to give the new company a boost by paying BNTX far under market value for the mRNA assets.

3. 2023 – Dean W. Ball

Intelligence is a tremendously useful capability, but it is not the bottleneck on all human progress, and, crucially, an extreme amount of intelligence does not equate to omniscience. Intelligence is not knowledge. Aristotle was surely more intelligent than I am, but he was not more knowledgeable, including even about many of the topics to which he devoted his treatises. This is why I am confident I would score better on a standardized test in biology or physics than Aristotle, despite him being one of the West’s originators of those fields of inquiry.

In a similar vein, imagine a newborn baby that was guaranteed to grow into an adult with an astoundingly high IQ (say, an IQ of 300, or 500, or 1000), but raised by Aristotle in Ancient Greece. Do you expect that the baby would mature into an adult that invents all modern science within the span of a few years or decades? Eliezer Yudkowsky does. Indeed, he describes contemporary humans trying to grapple with superintelligent AI as equivalent to “the 11th century trying to fight the 21st century.” I, on the other hand, strongly doubt that our imaginary high-IQ baby would invent all modern science from first principles. First principles do not have unbounded explanatory power.

In the end, most interesting things about the universe cannot be inferred from first principles. Imagine, for example, that you came upon a dry planet with mountain ranges but no bodies of water. But imagine that you knew, magically, that the planet would soon gain an atmosphere and thus precipitation, seasons, and the like. Suppose you have a superintelligent AI with you, and you show it the map of the planet as it is, and ask it to predict where all the planet’s rivers, lakes, and oceans will lie 50 years hence, after the planet gains regular precipitation. You don’t ask it to predict “generally speaking, where the bodies of water might end up,” but instead to predict exactly where they will be.

I would submit that there is no computational process which can arrive at the end of this natural process faster than nature itself. In other words, there is no pattern or abstraction you can create that allows you to speed ahead to the end of the process, and thus there is no amount of intelligence that gets you to the correct solution faster than nature on its own. You just have to wait the 50 years to find out. This is what the scientist Stephen Wolfram describes as “computational irreducibility.” Understanding this notion deeply is key, I think, to understanding the limits of intelligence. It should therefore come as no surprise that the best debate I’ve ever heard about AI existential risk was between Wolfram and Eliezer Yudkowsky.
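Wolfram's standard illustration of computational irreducibility is the Rule 30 cellular automaton (a sketch of ours, not something from the article): its center column looks random, and no known shortcut predicts the cell at step n without actually running all n steps.

```python
# Rule 30 cellular automaton: a minimal illustration of computational
# irreducibility -- to learn the state at step n, you (as far as anyone
# knows) have to simulate all n steps; no closed-form shortcut is known.

def rule30_step(cells):
    """One step of Rule 30; the row grows by one cell on each side."""
    padded = (0, 0) + cells + (0, 0)
    # Rule 30: new cell = left XOR (center OR right)
    return tuple(padded[i - 1] ^ (padded[i] | padded[i + 1])
                 for i in range(1, len(padded) - 1))

def center_column(steps):
    """Center-cell value at each of the first `steps` rows, from a single 1."""
    cells = (1,)
    out = []
    for _ in range(steps):
        out.append(cells[len(cells) // 2])
        cells = rule30_step(cells)
    return out

print(center_column(5))  # [1, 1, 0, 1, 1]
```

The rule itself is trivial, yet the only way to know the center column far out is to compute every intermediate row: a toy version of the dry-planet thought experiment above.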

Computational irreducibility comes into play anytime you are interacting with a complex system (though this is not to say that computational irreducibility is intrinsic to all interactions with a complex system). Every natural ecosystem, cell, animal, and economy is a complex system. While we have all manner of methods to predict what will happen when a complex system is perturbed (we call these things “physics,” “biology,” “chemistry,” “economics,” and the like), none of those methods is perfect, and often they are far from it.

The way we build better models of the world does not usually resemble “thinking about the problem really hard.” Generally it involves testing ideas and seeing if they work in the real world. In science these are generally called “experiments,” and in business sometimes we call these “startups.” Both take time and often money (sometimes considerable amounts of both); in the limit, neither of these things can be abstracted away with intelligence, no matter how much of it you have on tap. This is the central reason that I have written so much about, and even written into public policy, automated scientific labs that could run thousands of experiments in parallel; AI will increase the number of good predictions, but these are worth little without the ability to verify those predictions with experiments at massive scale.

There is one further observation that follows from the disentanglement of knowledge and intelligence. This is that knowledge itself is distributed throughout the world in highly uneven and imperfect ways. Anyone who thinks that “all the world’s knowledge” is on the internet is deeply mistaken. There is information that exists within a firm like Taiwan Semiconductor Manufacturing Company that is, first of all, not only unavailable on the internet but literally against Taiwanese law to make public. Even more importantly, though, there is knowledge within that firm that cannot be written down and is only held collectively. No single employee knows it all; it is the network—the meta-organism of TSMC itself—that holds this knowledge. It cannot be replicated so easily. This is all merely a restatement of the knowledge problem most memorably elucidated by the economist Friedrich Hayek.

The implicit, and sometimes even explicit, argument of “the doomers” is that intelligence is the sole bottleneck on capability (because any other bottlenecks can be resolved with more intelligence), and that everything else follows instantly once that bottleneck is removed. I believe this is just flatly untrue, and thus I doubt many “AI doom” scenarios. Intelligence is neither omniscience nor omnipotence.

What all of this means is that I am doubtful about the ability of an AI system—no matter how smart—to eradicate or enslave humanity in the ways imagined by the doomers. Note that this is not a claim about alignment or any other technical safeguard: even if a “misaligned” AI system wanted to take over the world and had no developer- or government-imposed, AI-specific safeguards to hinder it, I contend it would still fail. “Taking over the world” involves too many steps that require capital, interfacing with hard-to-predict complex systems (yes, hard to predict even for a superintelligence), ascertaining esoteric and deliberately hidden knowledge (knowledge that cannot be deduced from first principles), and running into too many other systems and procedures with in-built human oversight. It is not any one of these things, but the combination of them, that gives me high confidence that AI existential risk is highly unlikely and thus not worth extreme policy mitigations such as bans on AI development enforced by threats to bomb civilian infrastructure like data centers. “If anyone builds it, everyone dies” is false.

4. Beware of Simple Narratives – Alfred Lin

Consider a few narratives that shaped and misshaped technology investing:

  • Winner takes all. In some markets, such as search and social networking, this proved largely correct, but enterprise software proved stubbornly multi-vendor. E-commerce never consolidated the way the narrative predicted. Even in cloud infrastructure, the oligopoly of AWS, Azure, and GCP defied the single-winner thesis. The narrative was a useful heuristic. Founders and investors who treated it as a law made expensive mistakes.
  • First mover advantage. Google was not the first search engine. Facebook was not the first social network. The iPhone was not the first smartphone. The company that finds product-market fit in the right window wins. But “timing and execution matter more than sequence” is a harder story to tell than “be first.”
  • AI will replace [x]. Today’s dominant narrative is directionally correct but operationally misleading. The simple version, that AI replaces humans in a neat, linear substitution, misses the more investable reality. Augmentation, new workflows, and entirely new categories of work tend to emerge alongside displacement. The companies building for the nuanced version of this future look very different from those building for the simple version…

…In 1997, I declared that Amazon would kill Walmart. Today, Walmart is 30 times larger than it was 30 years ago. With each quarter of declining mall traffic and each confirmed brick-and-mortar bankruptcy, the thesis seemed to hold true. This was confirmation bias at work. The world was messier than the story. E-commerce companies also failed. Customer acquisition costs online kept rising. Certain categories had persistent try-before-you-buy dynamics. Physical presence created brand equity that digital alone could not. Those who treated the simple narrative as a settled truth missed the omnichannel reality that ultimately prevailed.

5. Javier Blas on Why Oil Could Go Much, Much Higher (Transcript here) – Tracy Alloway, Joe Weisenthal, and Javier Blas

Javier: You are absolutely right that what is really cushioning the market right now is a number of buffers that we are going through. One is regular inventories that every country and every refinery needs for normal functioning. Then there are also the strategic inventories that some countries own, particularly industrialized countries like the United States, Europe, Japan, and also China. Those have been mobilized and in most places have been released. And also we entered the crisis with a market that was over-supplied. There was even floating storage – that is when an oil tanker has been loaded, it’s on the high seas but it cannot find a buyer and just basically sits on the high seas looking for someone who will take the oil. We had quite a lot of that going into the crisis. So there was quite an element of buffer through the system, and probably a larger buffer than in normal circumstances because the market was over-supplied. That is helping to cushion or to mitigate the crisis.

Where we are seeing some actions by government is where countries are closer to the crisis, which is the Strait of Hormuz. So the closer that you are to that location, the more action you need to take, because you typically depend more on that flow of oil coming from the Middle East and also because you are impacted earlier. If you are moving oil from say Saudi Arabia into India, that’s only a few days, at most a week, of sailing time. If you are moving that to say the Philippines, that’s about 15 days. It’s longer if you are moving that oil into Europe, probably around three weeks. And it’s even longer if you are moving that oil into say the United States, where Saudi oil takes about 40 days. All of that means that the crisis is felt in some places quicker than in other places.

There is also the question of how the global oil market works. And to put it in quite simple terms, I’m afraid that I have to go with colonial vocabulary. The oil market is divided into two large chunks: east of Suez and west of Suez. It is as if the British Empire were still around and everything was east or west of the Suez Canal. Countries that are east of Suez, mostly Asia, rely a lot on Middle East oil these days and therefore they are impacted earlier on by the crisis. West of Suez – Western Europe and the whole American continent – is a bit detached from that market and therefore the crisis will hit them much later…

…Javier: But if I may suggest, forget about the price of a barrel of oil. No one cares about the price of oil unless you are someone producing oil in Texas or Saudi Arabia, or you are someone who owns a refinery. Those are the people that care about the price of a barrel of crude. The rest of us, you and I, we care about the price of a refined product because that’s what we consume. We consume gasoline, we consume diesel, or we consume other refined products that they embed into a service that we are buying. Think about an airfare ticket, where inside that ticket there’s a big proportion of it that is jet fuel, or you are buying a cup made of plastic. You are buying effectively some kind of transformed naphtha and obviously the transformation and the retail margin and so on, but what matters really is the price of refined products, and there actually we are beginning to see, particularly in the Southeast Asian markets, some very extreme prices.

If you look at the price of crude or Brent or WTI or Oman, things look relatively contained. We are trading around $110 a barrel, that is well below the all-time high. If you look at the cost of diesel in Singapore, which is a benchmark for the Southeast Asian market, the price there is approaching $200 a barrel, which is something that we have never seen. The refined product is where really we are seeing the real tension.

Tracy: This is exactly what I wanted to ask you. If you look at the benchmark prices for crude oil, we’ve seen higher prices before, and relatively recently in 2022. But if you look at the refined products, we’re getting to places that we haven’t seen. What explains that disconnect? Back in 2022, why didn’t we see the higher cost of crude feed into refined products the way that we seem to be seeing now?

Javier: For two reasons. One is because we have lost not only a lot of crude oil production, but we have lost a significant chunk of refined-product production. The Middle East also has a lot of refineries which are export refineries. They are just devoted to the export market, and the global trade of refined products is a lot smaller than the global trade of crude oil. So even a small reduction in supply could have a much larger impact. You think about the global market for crude oil, which is 100 million barrels, around 60 million are traded globally. But if you look at the market for say jet fuel, that market is a lot smaller, and we have lost a significant proportion of the refineries serving that international market for jet fuel, and therefore prices are reacting much more strongly than we saw in previous crises.

There’s also the way that the world of refining works. Some refineries are slowing down intake of crude oil because there is not enough crude oil in the market, but we have not really seen yet the consumers reacting the same way. So what is happening is the refining world is acting as a buffer between crude oil that is not there, and consumers that have not yet realized that the crude oil is not there. The refined market is trying to basically bring those two together. The only way it can do that is by extreme pricing, indicating to the consumers, “I don’t have enough crude to make these refined products, so please can you stop demanding the refined products?” The please is basically $200-a-barrel diesel…

…Tracy: What’s going on with US natural gas? If you look there – we’re talking about muted market moves in the oil market, even though those have risen – if you look at nat gas, nat gas has actually come down.

Javier: Nat gas in the United States is trading almost at a six-month low, which, considering what is happening in the global energy market, is almost incredible. The reason there is US shale. And the reason is that you cannot export gas easily. For exporting gas, you first need to cool it down and liquefy it. That basically means having an enormous fridge that cools gas from room temperature to -160 celsius; then it liquefies and then you can put it on a tanker and send it to the rest of the market. Because we have limited liquefaction capacity, and it cannot be increased quickly, that creates a bottleneck. That means that the US and Canadian gas is effectively trapped inside North America, and that’s keeping prices completely detached from the global market. That is a huge difference from previous episodes of high energy prices. Even in 2022, the price of US natural gas went from around $3.50-$4 to almost $10 per million British thermal units (MMBtu). This time it’s staying at actually below $3 per MMBtu.

That is incredible because it means that for the heavy US industry – electricity generators, chemical companies, fertilizer companies – there is no crisis while everyone else in the world is suffering. The US is completely insulated…

…Javier: 2022 was a huge shock to the global food market because it affected a bread basket region of the world. If you look at Russia and Ukraine, at the time combined, they accounted for around a quarter of global exports of wheat and barley, around 15% of global exports of corn, and an even higher percentage for some vegetable oils like rapeseed and sunflower. In the Russian invasion of Ukraine, the battleground was some of the richest, most fertile farmland on the planet. The battleground of the crisis in the Middle East is deserts and a piece of sea that we call the Strait of Hormuz. It doesn’t have the same impact in terms of global supply.

It does have an impact on fertilizer prices. So did the 2022 war between Russia and Ukraine, which is still ongoing. But fertilizer prices require time to have an impact on food production. Also, while yes the numbers are very scary – you look at the global fertilizer market, just focusing on urea, and say, “Oh boy, it’s going up a lot, we are approaching the 2022 record high” – that is a problem in many markets, but it’s not a food problem. It will be a fiscal problem, and the reason is that urea fertilizer in particular is massively subsidized in the developing world, particularly in places like India and Pakistan. So the problem there is going to be for the Indian government – can it afford to spend billions of dollars extra subsidizing fertilizer? It is less likely to be a food crisis in India, because the fertilizer, I think, is going to be there. If you are the finance minister in India, you have a big problem there. That’s how I’m seeing the problem.

Also, the global food market is in a better position than at almost any time in the last two or three decades. Inventories of wheat are very high. Inventories of rice in particular are at an all-time high. You mentioned rice – while we are worried about fertilizer prices, etc., etc., if you look at the most important benchmark for rice prices in Asia, it is about to hit a 19-year low…

…Javier: Oh boy, as if we didn’t have enough with the Middle East, here is Ukraine. You cannot blame Ukraine; it is fighting for survival. They are hitting Russia as hard as they can, wherever they can. And that means hitting their oil terminals. In the past, they were hitting the terminals in the south of the country. That’s the Black Sea. But they have found a corridor to send long-distance drones into the north, into the Baltic. I think the Russians were caught completely off guard. They didn’t think that Ukraine would be able to hit the terminals in the far north of Russian territory. So they were not very well protected – or, you could say, the Ukrainians were extremely good at it. But the terminals have been damaged significantly. We don’t know for sure the extent of the damage, but looking at the satellite pictures, it looks bad enough. So we may also be losing potentially 1 million barrels a day of Russian oil. Again, you cannot blame Ukraine, but it’s not really the time when you want to be losing more oil…

…Tracy: Okay, one thing that people have talked about for, I’m pretty sure, the duration of all of our careers is attempts to move away from pricing oil in dollars. If you think about the current situation, there’s something very perverse about seeing the dollar go up because there’s a scramble for barrels of oil because of an action taken by the United States. From your contacts in the oil market, is anyone talking about alternative currency pricing for barrels at the moment? Is this something that is going to get renewed traction?

Javier: No, I don’t hear anyone. Certainly Iran may be happy to take other currencies. It has been relatively happy to take Chinese yuan, and also other currencies which have convertibility problems. Everyone else will still want the dollar. The way that it was put to me by a leading producing country in the Middle East – I was talking to the head of the central bank; I’m not going to name the country. They said to me, “If I switch from the dollar to say the yuan, I move from a relatively high interest rate to a low interest rate. I move from full convertibility to a lot of problems to convert. And I move from maximum liquidity to no liquidity whatsoever.” And then this central bank governor is like, “Why I would like to do that? Why I would like to really take a step back on my currency?” I think that the yuan is not there yet for oil producers. Everyone that is using currencies other than the dollar to price or invoice their oil, they are doing it because they are under American sanctions. They’re not doing it because they want to do it. They’re doing it because they have no other option than to do it. Just because they are on the naughty corner of the US Treasury.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of GCP), Amazon (parent of AWS), Microsoft (parent of Azure), and TSMC. Holdings are subject to change at any time.