Saying Goodbye: 10 Years, a 19% Annual Return, and 17 Investing Lessons

9 years, 7 months, and 6 days. This is how much time has passed since I started managing my family’s investment portfolio of US stocks on 26 October 2010. 19.5% versus 12.7%. These are the respective annual returns of my family’s portfolio (without dividends) and the S&P 500 (with dividends) in that period.

As of 31 May 2020

I will soon have to say goodbye to the portfolio. Jeremy Chia (my blogging partner) and I have co-founded a global equities investment fund. As a result, the lion’s share of my family’s investment portfolio will soon be liquidated so that the cash can be invested in the fund.

The global equities investment fund will be investing with the same investment philosophy that underpins my family’s portfolio, so the journey continues. But my heart’s still heavy at having to let the family portfolio go. It has been a huge part of my life for the past 9 years, 7 months, and 6 days, and I’m proud of what I’ve achieved (I hope my parents are too!).

In the nearly-10 years managing the portfolio, I’ve learnt plenty of investing lessons. I want to share them here, to benefit those of you who are reading, and to mark the end of my personal journey and the beginning of a new adventure. I did not specifically pick any number of lessons to share. I’m documenting everything that’s in my head after a long period of reflection. 

Do note that my lessons may not be timeless, because things change in the markets. But for now, they are the key lessons I’ve picked up. 

Lesson 1: Focus on business fundamentals, not macroeconomic or geopolitical developments – there are always things to worry about

My family’s portfolio has many stocks that have gone up multiple times in value. A sample is given below:

Some of them are among the very first few stocks I bought; some were bought in more recent years. But what’s interesting is that these stocks produced their gains while the world experienced one crisis after another.

You see, there were always things to worry about in the geopolitical and macroeconomic landscape since I started investing. Here’s a short and incomplete list (you may realise how inconsequential most of these events are today, even though they seemed to be huge when they occurred):

  • 2010 – European debt crisis; BP oil spill; May 2010 Flash Crash
  • 2011 – Japan earthquake; Middle East uprising
  • 2012 – Potential Greek exit from Eurozone; Hurricane Sandy
  • 2013 – Cyprus bank bailouts; US government shutdown; Thailand uprising
  • 2014 – Oil price collapse
  • 2015 – Collapse of the euro against the Swiss franc; Greece debt crisis
  • 2016 – Brexit; Italy banking crisis
  • 2017 – Bank of England hikes interest rates for first time in 10 years
  • 2018 – US-China trade war
  • 2019 – Australia bushfires; impeachment of the US President; appearance of COVID-19 in China
  • 2020 (thus far) – COVID-19 becomes global pandemic

The stocks mentioned in the table above produced strong business growth over the years I’ve owned them. This business growth has been a big factor in the returns they have delivered for my family’s portfolio. When I was studying them, my focus was on their business fundamentals – and this focus has served me well.

In a 1998 lecture for MBA students, Warren Buffett was asked about his views on the then “tenuous economic situation and interest rates.” He responded:

“I don’t think about the macro stuff. What you really want to do in investments is figure out what is important and knowable. If it is unimportant and unknowable, you forget about it. What you talk about is important but, in my view, it is not knowable.

Understanding Coca-Cola is knowable or Wrigley’s or Eastman Kodak. You can understand those businesses that are knowable. Whether it turns out to be important depends where your valuation leads you and the firm’s price and all that. But we have never not bought or bought a business because of any macro feeling of any kind because it doesn’t make any difference.

Let’s say in 1972 when we bought See’s Candy, I think Nixon [referring to former US President, Richard Nixon] put on the price controls a little bit later, but so what! We would have missed a chance to buy something for [US]$25 million that is producing [US]$60 million pre-tax now. We don’t want to pass up the chance to do something intelligent because of some prediction about something we are no good on anyway.”

Lesson 2: Adding to winners works

I’ve never shied away from adding to the winners in my portfolio, and this has worked out well. Here’s a sample, using some of the same stocks shown in the table in Lesson 1.

Adding to winners is hard to achieve, psychologically. As humans, we tend to anchor to the price we first paid for a stock. After a stock has risen significantly, it’s hard to still see it as a bargain. But I’ll argue that it is stocks that have risen significantly over a long period of time that are the good bargains. It’s counterintuitive, but hear me out.

The logic here rests on the idea that stocks do well over time if their underlying businesses do well. So, the stocks in my portfolio that have risen significantly over a number of years are likely – though not always – the ones with businesses that are firing on all cylinders. And stocks with businesses that are firing on all cylinders are exactly the ones I want to invest in. 

Lesson 3: The next Amazon is Amazon

When I first bought shares of Amazon in April 2014 at US$313, its share price was already more than 200 times higher than its split-adjusted IPO share price of US$1.50 in May 1997. That works out to an amazing annual return of around 37%.

But from the time I first invested in Amazon in April 2014 to today, its share price has increased by an even more impressive annual rate of 40%. Of course, it is unrealistic to expect Amazon to grow by a further 200 times in value from its April 2014 level over a reasonable multi-year time frame. But a stock that has done very well for a long period of time can continue delivering a great return. Winners often keep on winning.    
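The annualised figures above are simple compound annual growth rate (CAGR) arithmetic, and they’re easy to verify. A minimal sketch in Python – note that the mid-2020 share price is not quoted here, so I back it out from the 681% total gain mentioned in Lesson 4 (an assumption on my part):

```python
def cagr(begin_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end_value / begin_value) ** (1 / years) - 1

# May 1997 IPO price (split-adjusted) to the April 2014 purchase price: ~17 years
ipo_to_2014 = cagr(1.50, 313.00, 17)

# April 2014 purchase to mid-2020 (~6.1 years); the end price is assumed to be
# 313 * (1 + 6.81), i.e. the 681% total gain cited in Lesson 4
since_2014 = cagr(313.00, 313.00 * (1 + 6.81), 6.1)

print(f"{ipo_to_2014:.0%} a year since IPO, {since_2014:.0%} a year since 2014")
```

Both numbers land close to the roughly 37% and 40% figures quoted above.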

Lesson 4: Focus on business quality and don’t obsess over valuation

It is possible to overpay for a company’s shares. This is why we need to think about the valuation of a business. But I think it is far more important to focus on the quality of a business – such as its growth prospects and the capability of the management team – than on its valuation.

If I use Amazon as an example, its shares carried a high price-to-free cash flow (P/FCF) ratio of 72 when I first invested in the company in April 2014. But Amazon’s free cash flow per share has since increased by 1,000% in total (or 48% annually), from US$4.37 back then to US$48.10 now – growth that powered the 681% gain in its share price even as the valuation multiple compressed.

Great companies can grow into their high valuations. Amazon’s P/FCF ratio, using my April 2014 purchase price and the company’s current free cash flow per share, is just 6.5 (now that’s a value stock!). But there’s no fixed formula that can tell you what valuation is too high for a stock. It boils down to subjective judgement that is sometimes even as squishy as an intuitive feeling. This is one of the unfortunate realities of investing. Not everything can be quantified.
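The “P/FCF on cost” idea here is just a ratio computed with the original purchase price and today’s free cash flow per share. A quick sketch using the figures quoted above:

```python
purchase_price = 313.00  # Amazon share price at my April 2014 purchase
fcf_2014 = 4.37          # free cash flow per share in April 2014
fcf_now = 48.10          # free cash flow per share "now" (mid-2020)

p_fcf_at_purchase = purchase_price / fcf_2014  # the multiple actually paid
p_fcf_on_cost = purchase_price / fcf_now       # same price, today's cash flow

print(round(p_fcf_at_purchase), round(p_fcf_on_cost, 1))  # 72 6.5
```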

Lesson 5: The big can become bigger – don’t obsess over a company’s market capitalisation

I’ve yet to mention Mastercard, but I first invested in shares of the payments company on 3 December 2014 at US$89 apiece. Back then, it already had a huge market capitalisation of around US$100 billion, according to data from YCharts. Today, Mastercard’s share price is US$301, up more than 200% from my initial investment.

A company’s market capitalisation alone does not tell us much. It is the company’s (1) valuation, (2) size of the business, and (3) addressable market that can give us clues on whether it could be a good investment opportunity. In December 2014, Mastercard traded at a reasonable price-to-earnings (P/E) ratio of around 35 on revenue of around US$9.2 billion. Meanwhile, the company’s market opportunity still looked significant, since cashless transactions represented just 15% of total transactions in the world back then.

Lesson 6: Don’t ignore “obvious” companies just because they’re well known

Sticking with Mastercard, it was an obvious company that was already well-known when I first invested in its shares. In the first nine months of 2014, Mastercard had more than 2 billion credit cards in circulation and had processed more than 31.4 billion transactions. Everyone could see Mastercard and know that it was a great business. It was growing rapidly and consistently, and its profit and free cash flow margins were off the charts (nearly 40% for both).

The company’s high quality was recognised by the market – its P/E ratio was high in late 2014 as I mentioned earlier. But Mastercard still delivered a fantastic annual return of around 25% from my December 2014 investment.

I recently discovered a poetic quote by philosopher Arthur Schopenhauer: “The task is… not so much to see what no one has yet seen, but to think what nobody has yet thought, about that which everyone sees.” This is so applicable to investing.

Profitable investment opportunities can still be found by thinking differently about the data that everyone else has. It was obvious to the market back in December 2014 that Mastercard was a great business and its shares were valued highly because of this. But by thinking differently – with a longer-term point of view – I saw that Mastercard could grow at high rates for a very long period of time, making its shares a worthy long-term investment. From December 2014 to today, Mastercard’s free cash flow per share has increased by 158% in total, or 19% per year. Not too shabby.   

Lesson 7: Be willing to lose sometimes

We need to take risks when investing. When I first invested in Shopify in September 2016, it had a price-to-sales (P/S) ratio of around 12, which is really high for a company with a long history of making losses and producing meagre cash flow. But Shopify also had a visionary leader who dared to think and act long-term. Tobi Lütke, Shopify’s CEO and co-founder, penned the following in his letter to investors in the company’s 2015 IPO prospectus (emphases are mine):

“Over the years we’ve also helped foster a large ecosystem that has grown up around Shopify. App developers, design agencies, and theme designers have built businesses of their own by creating value for merchants on the Shopify platform. Instead of stifling this enthusiastic pool of talent and carving out the profits for ourselves, we’ve made a point of supporting our partners and aligning their interests with our own. In order to build long-term value, we decided to forgo short-term revenue opportunities and nurture the people who were putting their trust in Shopify. As a result, today there are thousands of partners that have built businesses around Shopify by creating custom apps, custom themes, or any number of other services for Shopify merchants.

This is a prime example of how we approach value and something that potential investors must understand: we do not chase revenue as the primary driver of our business. Shopify has been about empowering merchants since it was founded, and we have always prioritized long term value over short-term revenue opportunities. We don’t see this changing…

… I want Shopify to be a company that sees the next century. To get us there we not only have to correctly predict future commerce trends and technology, but be the ones that push the entire industry forward. Shopify was initially built in a world where merchants were simply looking for a homepage for their business. By accurately predicting how the commerce world would be changing, and building what our merchants would need next, we taught them to expect so much more from their software.

These underlying aspirations and values drive our mission: make commerce better for everyone. I hope you’ll join us.”       

Shopify was a risky proposition. But it paid off handsomely. In investing, I think we have to be willing to take risks and accept that we can lose at times. But failing at risk-taking from time to time does not mean our portfolios have to be ruined. We can take intelligent risks by sizing our positions appropriately. Tom Engle is part of The Motley Fool’s investing team in the US. He’s one of the best investors the world has never heard of. When it comes to investing in risky stocks that have the potential for huge returns, Tom has a phrase I love: “If it works out, a little is all you need; if it doesn’t, a little is all you want.” 

I also want to share a story I once heard from The Motley Fool’s co-founder Tom Gardner. Once, a top-tier venture capital firm in the US wanted to improve the hit-rate of its investments, so its leaders came up with a process for the analysts that could reduce investing errors. The firm succeeded in improving its hit-rate (the percentage of investments that make money). But interestingly, its overall rate of return became lower. That’s because the VC firm, in its quest to reduce mistakes, also passed on highly risky potential moonshots that could generate tremendous returns.
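The VC anecdote is really a point about expected value: screening out the riskiest bets can raise the hit-rate while lowering the overall return, because a single moonshot can dominate the average. A toy illustration in Python, with outcome multiples that are entirely made up:

```python
# Hypothetical portfolios of 10 equal-sized bets (multiples of money returned).
# "With moonshots": most bets fail or fizzle, but one returns 50x.
with_moonshots = [0, 0, 0, 0, 0.5, 0.5, 0.5, 2, 3, 50]
# "Errors reduced": every bet at least returns capital, but no 50x outlier.
errors_reduced = [1, 1, 1, 1, 1.5, 1.5, 2, 2, 3, 3]

def hit_rate(outcomes):
    """Fraction of bets that made money (returned more than 1x)."""
    return sum(o > 1 for o in outcomes) / len(outcomes)

def avg_multiple(outcomes):
    """Average multiple of money returned across the portfolio."""
    return sum(outcomes) / len(outcomes)

# The safer portfolio wins on hit-rate but loses badly on overall return.
print(hit_rate(with_moonshots), avg_multiple(with_moonshots))  # 0.3 5.65
print(hit_rate(errors_reduced), avg_multiple(errors_reduced))  # 0.6 1.7
```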

The success of one Shopify can make up for the mistakes of many other risky bets that flame out. To hit a home run, we must be willing to miss at times.  

Lesson 8: The money is made on the holding, not the buying and selling

My family’s investment portfolio has over 50 stocks. It’s a collection that was built steadily over time, starting with the purchase of just six stocks on 26 October 2010. In the 9 years, 7 months, and 6 days since, I’ve only ever sold two stocks voluntarily: (1) Atwood Oceanics, an owner of oil rigs; and (2) National Oilwell Varco, a supplier of parts and equipment that keep oil rigs running. Both stocks were bought on 26 October 2010.

David Gardner is also one of the co-founders of The Motley Fool (Tom Gardner is his brother). There’s something profound David once said about portfolio management that resonates with me:

“Make your portfolio reflect your best vision for our future.” 

The sales of Atwood Oceanics and National Oilwell Varco happened because of David’s words. Part of the vision I have for the future is a world where our energy needs are met entirely by renewable sources that do not harm the precious environment we live in. For this reason, I made the rare decision to voluntarily part ways with Atwood Oceanics and National Oilwell Varco in September 2016 and June 2017, respectively.

My aversion to selling is by design: I believe it strengthens my discipline in holding onto the winners in my family’s portfolio. Many investors tend to cut their winners and hold onto their losers. Even in my earliest days as an investor, I recognised how important holding onto winners is in driving my family portfolio’s return, and being very slow to sell stocks has helped me hone that discipline. This discipline has been a very important contributor to the long-run performance of my family’s portfolio.

The great Charlie Munger has a saying that one of the keys to investing success is “sitting on your ass.” I agree. Patience is a virtue. And talking about patience… 

Lesson 9: Be patient – some great things take time

Some of my big winners needed only a short while before they took off. But there are some that needed significantly more time. Activision Blizzard is one such example. As I mentioned earlier, I invested in its shares in October 2010. Then, Activision Blizzard’s share price went nowhere for more than two years before it started rocketing higher.

Peter Lynch once said: “In my investing career, the best gains usually have come in the third or fourth year, not in the third or fourth week or the third or fourth month.” The stock market does not move according to our own clock. So patience is often needed.

Lesson 10: Management is the ultimate source of a company’s economic moat

In my early days as an investor, I looked for quantifiable economic moats. These are traits in a company such as (1) having a network effect, (2) being a low-cost producer, (3) delivering a product or service that carries a high switching cost for customers, (4) possessing intangible assets such as intellectual property, and (5) having efficient scale in production. 

But the more I thought about it, the more I realised that a company’s management team is the true source of its economic moat, or lack thereof.

Today, Netflix has the largest global streaming audience with a pool of 183 million subscribers around the world. Having this huge base of subscribers means that Netflix has an efficient scale in producing content, because the costs can be spread over many subscribers. Its streaming competitors do not have this luxury. But this scale did not appear from thin air. It arose because of Netflix’s CEO and co-founder, Reed Hastings, and his leadership team.

The company was an early pioneer in the streaming business when it launched its streaming service in 2007. In fact, Netflix probably wanted to introduce streaming even from its earliest days. Hastings said the following in a 2007 interview with Fortune magazine: 

“We named the company Netflix for a reason; we didn’t name it DVDs-by-mail. The opportunity for Netflix online arrives when we can deliver content to the TV without any intermediary device.”

When Netflix first started streaming, the content came from third-party producers. In 2013, the company launched its first slate of original programming. Since then, Netflix has ramped up its original content budget significantly. The spending has been done smartly, as Netflix has found plenty of success with its original programming. For instance, in 2013, the company became the first streaming provider to be nominated for a primetime Emmy. And in 2018 and 2019, the company snagged 23 and 27 Emmy wins, respectively.  

A company’s current moat is the result of management’s past actions; a company’s future moat is the result of management’s current actions. Management is what creates the economic moat.

Lesson 11: Volatility in stocks is a feature, not a bug

Looking at the table in Lesson 1, you may think that my investment in Netflix was smooth-sailing. It’s actually the opposite. 

I first invested in Netflix shares on 15 September 2011 at US$26 after the stock price had fallen by nearly 40% from US$41 in July 2011. But the stock price kept declining afterward, and I bought more shares at US$16 on 20 March 2012. More pain was to come. In August 2012, Netflix’s share price bottomed at less than US$8, resulting in declines of more than 70% from my first purchase, and 50% from my second.  

My Netflix investment was a trial by fire for a then-young investor – I had started investing barely a year before I bought my first Netflix shares. But I did not panic and I was not emotionally affected. I already knew that stocks – even the best performing ones – are volatile over the short run. But my experience with Netflix drove the point even deeper into my brain.

Lesson 12: Be humble – there’s so much we don’t know

My investment philosophy is built on the premise that a stock will do well over time if its business does well too. But how does this happen?

In the 1950s, lawmakers in the US commissioned an investigation to determine if the stock market back then was too richly priced. The Dow (a major US stock market benchmark) had exceeded its peak seen in 1929 before the Great Depression tore up the US market and economy. Ben Graham, the legendary father of value investing, was asked to participate as an expert on the stock market. Here’s an exchange during the investigation that’s relevant to my discussion:

Question to Graham: When you find a special situation and you decide, just for illustration, that you can buy for 10 and it is worth 30, and you take a position, and then you cannot realize it until a lot of other people decide it is worth 30, how is that process brought about – by advertising, or what happens?

Graham’s response: “That is one of the mysteries of our business, and it is a mystery to me as well as to everybody else. We know from experience that eventually the market catches up with value. It realizes it in one way or another.”

More than 60 years ago, one of the most esteemed figures in the investment business had no idea how stock prices seemed to eventually reflect their underlying economic values. Today, I’m still unable to find any answer. If you’ve seen any clues, please let me know! This goes to show that there’s so much I don’t know about the stock market. It’s also a fantastic reminder for me to always remain humble and be constantly learning. Ego is the enemy.  

Lesson 13: Knowledge compounds, and read outside of finance

Warren Buffett once told a bunch of students to “read 500 pages… every day.” He added, “That’s how knowledge works. It builds up, like compound interest. All of you can do it, but I guarantee not many of you will do it.” 

I definitely have not done it. I read every day, but I’m nowhere close to the 500 pages that Buffett mentioned. Nonetheless, I have experienced first hand how knowledge compounds. Over time, I’ve been able to connect the dots faster when I analyse a company. And for companies that I’ve owned shares of for years, I don’t need to spend much time to keep up with their developments because of the knowledge I’ve acquired over the years.

Reading outside of finance has also been really useful for me. I have a firm belief that investing is only 5% finance and 95% everything else. Reading about psychology, society, history, science etc. can make us even better investors than someone who’s buried neck-deep in only finance books. Having a broad knowledge base helps us think about issues from multiple angles. This brings me to Arthur Schopenhauer’s quote I mentioned earlier in Lesson 6:  “The task is… not so much to see what no one has yet seen, but to think what nobody has yet thought, about that which everyone sees.”

Lesson 14: The squishy things matter

Investing is part art and part science. But is it more art than science? I think so. The squishy, unquantifiable things matter. That’s because investing is about businesses, and building businesses involves squishy things.

Jeff Bezos said it best in his 2005 Amazon shareholders’ letter (emphases are mine):

“As our shareholders know, we have made a decision to continuously and significantly lower prices for customers year after year as our efficiency and scale make it possible. This is an example of a very important decision that cannot be made in a math-based way.

In fact, when we lower prices, we go against the math that we can do, which always says that the smart move is to raise prices. We have significant data related to price elasticity. With fair accuracy, we can predict that a price reduction of a certain percentage will result in an increase in units sold of a certain percentage. With rare exceptions, the volume increase in the short term is never enough to pay for the price decrease.

However, our quantitative understanding of elasticity is short-term. We can estimate what a price reduction will do this week and this quarter. But we cannot numerically estimate the effect that consistently lowering prices will have on our business over five years or ten years or more.

Our judgment is that relentlessly returning efficiency improvements and scale economies to customers in the form of lower prices creates a virtuous cycle that leads over the long term to a much larger dollar amount of free cash flow, and thereby to a much more valuable Amazon.com. We’ve made similar judgments around Free Super Saver Shipping and Amazon Prime, both of which are expensive in the short term and—we believe—important and valuable in the long term.”

On a related note, I was also attracted to Shopify when I came across Tobi Lütke’s letter to investors that I referenced in Lesson 7. I saw in Lütke the same ability to stomach short-term pain, and the drive toward producing long-term value, that I noticed in Bezos. This is also a great example of how knowledge compounds. 

Lesson 15: I can never do it alone

Aaron Bush is one of the best investors I know of at The Motley Fool, and he recently created one of the best investing-related tweet-storms I have seen. In one of his tweets, he said: “Collaboration can go too far. Surrounding yourself with a great team or community is critical, but the moment decision-making authority veers democratic your returns will begin to mean-revert.” 

I agree with everything Aaron said. Investment decision-making should never involve large teams. But at the same time, having a community or team around us is incredibly important for our development; their presence enables us to view a problem from many angles, and it helps with information gathering and curation.

I joined one of The Motley Fool’s investment newsletter services in 2010 as a customer. The service had wonderful online forums and this dramatically accelerated my learning curve. In 2013, I had the fortune to join an informal investment club in Singapore named Kairos Research. It was founded by Stanley Lim, Cheong Mun Hong, and Willie Keng. They are also the founders of the excellent Asia-focused investment education website, Value Invest Asia. I’ve been a part of Kairos since and have benefited greatly. I’ve made life-long friends and met countless thoughtful, kind, humble, and whip-smart people who have a deep passion for investing and knowledge. The Motley Fool’s online forums and the people in Kairos have helped me become a better human being and investor over the years.   

I’ve also noticed – in these group interactions – that the more I’m willing to give, the more I receive. Giving unconditionally and sincerely without expecting anything in return, paradoxically, results in us having more. Giving is a superpower. 

Lesson 16: Be honest with myself about what I don’t know

When we taste success in the markets, it’s easy for ego to enter the picture. We may look into the mirror and proclaim: “I’m a special investor! I’ve been great at picking growth stocks – this knowledge must definitely translate to trading options, shorting commodities, and underwriting exotic derivatives. They, just like growth stocks, are all a part of finance, aren’t they?”

This is where trouble comes. The entrance of ego is the seed of future failure. In the biography of Warren Buffett, The Snowball: Warren Buffett and the Business of Life, author Alice Schroeder shared this passage about Charlie Munger:

“[Munger] dreaded falling prey to what a Harvard Law School classmate of his had called “the Shoe Button Complex.”

“His father commuted daily with the same group of men,” Munger said. “One of them had managed to corner the market in shoe buttons – a really small market, but he had it all. He pontificated on every subject, all subjects imaginable. Cornering the market on shoe buttons made him an expert on everything. Warren and I have always sensed it would be a big mistake to behave that way.”

The Shoe Button Complex can be applied in a narrower sense to investing too. Just because I know something about the market does not mean I know everything. For example, a few years after I invested in Atwood Oceanics and National Oilwell Varco, I realised I was in over my head. I have no ability to predict commodity prices, but the business-health of the two companies depends on the price of oil. Since I came to the realisation, I have stayed away from additional commodity-related companies. In another instance, I know I can’t predict the movement of interest rates, so I’ve never made any investment decision that depended on interest rates as the main driver. 

Lesson 17: Be rationally optimistic

In Lesson 1, I showed that the world had lurched from one crisis to another over the past decade – and of course, we’re battling COVID-19 right now. But I’m still optimistic about tomorrow. This is because one key thing I’ve learnt about humanity is that our progress has never happened smoothly. It took us only 66 years to go from the first demonstration of manned flight by the Wright brothers at Kitty Hawk to putting a man on the moon. But in between was World War II, a brutal battle across the globe from 1939 to 1945 that killed an estimated 66 million, according to National Geographic.

This is how progress is made, through the broken pieces of the mess that Mother Nature and our own mistakes create. Morgan Housel has the best description of this form of rational optimism that I’ve come across: 

“A real optimist wakes up every morning knowing lots of stuff is broken, and more stuff is about to break.

Big stuff. Important stuff. Stuff that will make his life miserable. He’s 100% sure of it.

He starts his day knowing a chain of disappointments awaits him at work. Doomed projects. Products that will lose money. Coworkers quitting. He knows that he lives in an economy due for a recession, unemployment surely to rise. He invests his money in a stock market that will crash. Maybe soon. Maybe by a lot. This is his base case.

He reads the news with angst. It’s a fragile world. Every generation has been hit with a defining shock. Wars, recessions, political crises. He knows his generation is no different.

This is a real optimist. He’s an optimist because he knows all this stuff does not preclude eventual growth and improvement. The bad stuff is a necessary and normal path that things getting better over time rides on. Progress happens when people learn something new. And they learn the most, as a group, when stuff breaks. It’s essential.

So he expects the world around him to break all the time. But he knows – as a matter of faith – that if he can survive the day-to-day fractures, he’ll capture the up-and-to-the-right arc that learning and hard work produces over time.”

To me, investing in stocks is, at its core, the same as having faith in the long-term potential of humanity. There are 7.8 billion individuals in the world today, and the vast majority of us will wake up every morning wanting to improve the world and our own lot in life – this is ultimately what fuels the global economy and financial markets. Miscreants and Mother Nature will wreak havoc from time to time. But I have faith in the collective positivity of humanity. When there’s a mess, we can clean it up. This has been the story of our long history – and the key driver of the return my family’s portfolio has enjoyed immensely over the past 9 years, 7 months, and 6 days.

My dear portfolio, goodbye.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I, the author, will be making sell-trades on the stocks mentioned in this article over the coming weeks.

What We’re Reading (Week Ending 15 June 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve been regularly sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 15 June 2025:

1. How Countries Go Broke: The Big Cycle In a 5-Minute Read – Ray Dalio

If credit is used effectively, it creates productivity and income that can pay back the debt and interest on the debt, which is healthy. However, if it isn’t used well so it doesn’t produce enough income to pay back the debt and the interest on the debt, debt service will build up like plaque that squeezes out other spending. When debt service payments become very large, that creates a debt service problem and eventually a debt rollover problem as holders of the debt don’t want to roll it over and want to sell it. Naturally that creates a shortage of demand for debt instruments like bonds and the selling of them, and naturally when there is a shortage of demand relative to supply that either leads to a) interest rates rising, which drives markets and the economy lower, or b) the central banks “printing money” and buying debt which lowers the value of money which raises inflation from what it would have been. Printing money also artificially lowers interest rates, which hurts the lenders’ returns…

…To describe it more specifically, one can see debts and debt service payments rising relative to incomes, the supply of debt being larger than the demand for it and central banks dealing with these things happening by being stimulative at first by cutting short term interest rates and then by printing money and buying debt, and eventually the central bank losing money and then having a negative net worth, and both the central government taking on more debt to pay the debt service and the central bank monetizing the debt. All these things lead toward a government debt crisis which produces the equivalent of an economic heart attack that comes when the constriction of debt-financed spending shuts down the normal flow of the circulatory system.

Early in the final stage of this big debt cycle, the market action reflects this dynamic via interest rates rising led by long term rates, the currency declining especially relative to gold, and the central government’s treasury department shortening the maturities of its debt offerings because of a shortage of the demand for long term debt. Typically, late in the process when this dynamic is most severe, a number of other seemingly extreme measures are put into place like establishing capital controls and exerting extraordinary pressures on creditors to buy and not sell debt…

…Imagine that you are running a big business called the U.S. government. That will give you a perspective that will help you understand the U.S. government’s finances and its leadership’s choices.

The total revenue this year will be about $5 trillion while the total expenses will be about $7 trillion, so there will be a budget shortfall of about $2 trillion. So, this year, your organization’s spending will be about 40 percent more than it is taking in. And there is very little ability to cut expenses because almost all the expenses are previously committed to or are essential expenses. Because your organization borrowed a lot over a long time, it has accumulated a big debt—approximately six times the amount that it is bringing in each year (about $30 trillion), which equals about $230,000 per household that you have to take care of. And the interest bill on the debt will be about $1 trillion which is about 20 percent of your enterprise’s revenue and half this year’s budget shortfall (deficit) that you will have to borrow to fund. But that $1 trillion is not all that you have to give your creditors, because in addition to the interest you have to pay on your debt, you have to pay back the principal that is coming due, which is around $9 trillion. You hope that your creditors, or some other rich entities, will either relend it to you or lend you the money afresh. So, the debt service payments—in other words the paying back of principal and interest that you have to do to not default—is about $10 trillion, which is about 200 percent of the money coming in…

…I believe that this situation needs to be dealt with via what I call my 3 percent, 3-part solution. That would be to get the budget deficit down to 3 percent of GDP in a way that balances the three ways of reducing the deficit which are 1) cutting spending, 2) increasing tax revenue, and 3) lowering interest rates.  All three need to happen concurrently so as to prevent any one from being too large, because if any one is too large, the adjustment will be traumatic. And these things need to come about through good fundamental adjustments rather than be forced (e.g., it would be very bad if the Federal Reserve unnaturally forced interest rates down). Based on my projections, spending cuts and tax revenue increases by about 4% each relative to current planning, and interest rates falling by about 1-1.5% in response, would lead to interest payments that are lower by 1-2% of GDP over the next decade and stimulate a rise in asset prices and economic activity which will bring in much more revenue…

If this process happens repeatedly, why are the dynamics behind it not well understood?

You’re right that it’s not well understood. Interestingly, I couldn’t find any studies about how this happens. I theorize that it is not well understood because it typically happens only about once a lifetime in reserve currency countries—when their monetary orders break down—and when it happens in non-reserve currency countries, this dynamic is presumed to be a problem that reserve currency countries are immune to. The only reason I discovered this process is that I saw it happening in my sovereign bond market investing, which led me to study many cases of it happening throughout history so that I could navigate them well (such as navigating the 2008 global financial crisis and the 2010-15 European debt crisis)…

Do you know of any analogous cases of the budget deficit being cut so much in the way you describe and good outcomes happening?

Yes. I know of several. My plan would lead to a cut in the budget deficit of about four percent of GDP. The most analogous case of that happening with a good outcome was in the United States from 1991 to 1998 when the budget deficit was cut by five percent of GDP. In my book, I list several similar cases that happened in several countries…

Japan—whose 215% debt-to-GDP ratio is the highest of any advanced economy—has often served as the poster child for the argument that a country can live with consistently high debt levels without experiencing a debt crisis. Why don’t you take much comfort from Japan’s experience?

The Japanese case exemplifies and will continue to exemplify the problem I describe, and it demonstrates in practice my theory.  More specifically, because of the high level of the Japanese government’s over-indebtedness, Japanese bonds and debt have been terrible investments. To make up for a shortage of demand for Japanese debt assets at low enough interest rates to be good for the country, the BoJ printed a lot of money and bought a lot of Japanese government debt which led to holders of Japanese bonds having losses of 45% relative to holding US dollar debt since 2013 and losses of 60% relative to holding gold since 2013. The typical wages of a Japanese worker have fallen 58% since 2013 in common currency terms relative to the wages of an American worker. I have a whole chapter on the Japanese case in my book that explains it in depth…

Are there any other areas of the world that look particularly problematic from a fiscal standpoint that people may be underappreciating?

Most countries have similar debt and deficit problems. The UK, EU, China, and Japan all do. That is why I expect a similar debt and currency devaluation adjustment process in most countries, which is why I expect non-government produced monies like gold and bitcoin to do relatively well.

2. From Bankruptcy to 1,000 Bagger – Joe Raymond and Turtle Bay

Toys R Us was founded in 1948 by Charles Lazarus.

Lazarus was one of the most accomplished retailers of the 1970-1990 period, yet his name is virtually unknown to both entrepreneurs and investors today. His track record rivals those of Sol Price, Sam Walton, and pretty much any other revered retail entrepreneur you can think of…

…Charles was energetic and ambitious. His initial store was profitable, but he wanted more. He saw the potential of large-scale discount stores and decided to move in that direction…

…By 1966, Lazarus had grown his store count to four. Annual revenues were $12 million ($118 million in 2025 dollars).

Like many young entrepreneurs who achieve early success, Charles wanted some liquidity. He wanted to take some chips off the table. He decided to sell Toys R Us to Interstate Stores—a publicly traded retail conglomerate.

Interstate paid $6.0 million cash plus a $1.5 million earnout ($74 million in total comp in 2025 dollars). This equated to 0.62x sales…

…More importantly, Charles was to be given complete autonomy to continue to run and expand Toys R Us…

…At its peak in 1969, Interstate was producing revenues of $589 million with $11 million of net income. But by the early 1970s, discount stores were starting to crack. Overexpansion and increased competition, coupled with a sharp and sudden recession, caused many locations to turn unprofitable. Topps and White Front weren’t immune to this. Both started bleeding red ink and pushing Interstate into financial trouble…

…A business that had earned more than $11 million pre-tax in 1970 was now losing more than $25 million each year.

In late 1973, Interstate decided to shutter the discount division and restructure its department stores.

In May 1974, the company filed for Chapter 10 bankruptcy.

Meanwhile, while the discount department stores were hemorrhaging cash, Charles’ toy division was performing beautifully…

…The appeal of Toys R Us in the mid-1970s wasn’t a secret. A number of smart investors had the insight and participated in the bankruptcy.

Let’s start with Larry Goldstein…

…Larry wrote a report for Barron’s in 1975 titled “Revolution in Toy Retailing.” The report came out early in the bankruptcy and outlined the attractive prospects for Toys R Us…

…In 1974 (Year end February 2, 1975) the chain recorded sales of $141.6 million and operated 51 toy supermarket stores. Only five years earlier, Toys-R-Us had sales of $47 million…

…Reportedly, the firm has a three-year goal of $350 million in sales, i.e., roughly a doubling of this year’s expected revenue…

…Toys-R-Us appears to be by far, the most successful and thriving bankrupt company of all time…

…Shortly after writing his report, Larry started buying Interstate Stores convertible debentures and creditor claims with the idea that they would eventually turn into new common stock post-bankruptcy…

…All told, Larry cobbled together the equivalent of 2 million shares of new, post-bankruptcy Toys R Us stock. He paid between $0.25 and $2.50 per share, and his average cost came out to about $1.00…

…At $1.00, Toys common stock was being created for about 1x EBIT—an attractive price for any business, let alone one with a skilled entrepreneur and long runway ahead of it…

…What happened next is one of the best retail runs in American history.

Free from the burden of bankruptcy and the loss-making discount division, Interstate was renamed Toys R Us and Charles Lazarus was made CEO.

From 1978 to 1994, Toys grew its revenues from $274 million to just shy of $8 billion—good for a CAGR of 23%. EPS did even better, compounding at 26%.

The P/E ratio, which started the period around 5x, ended 1994 above 25x…

…Toys R Us dominated toy retailing by providing the widest selection of goods all under one roof at prices lower than the alternatives. As Charles used to say, “If the toy exists, we have it and the price is right.” Their scale and efficient distribution gave them a cost advantage, which was passed along to customers in the form of lower prices…

…Toys’ success was the product of a bunch of little “common sense” things working together well. They surfed the retail wave as well as anyone in my view from the mid-70s to mid-90s…

…Norman Ricken, the President of Toys R Us and long-time partner to Lazarus, stepped down in 1989. Norm saw the trend in competition and decided to move on. Walmart was the biggest threat at the time, and the internet wasn’t far off either.

Larry had gotten to know Norm over the years, and they were close friends. A couple years after Norm’s departure, Larry decided to start selling his stock.

Those shares he was buying in bankruptcy for $1 had an adjusted cost basis of $0.04 after multiple stock splits. He started selling shares around $40 in 1992, good for a 1,000-bagger…

…The mid-90s was the peak for Toys R Us. Sales and profitability started to level off and eventually decline. Private equity came in and leveraged the business. Things proceeded to unravel.

The fate of Toys R Us shows the power of retail competition. You have to ride the wave, or the wave will consume you. This can happen incredibly fast.

3. Google CEO Sundar Pichai on the future of search, AI agents, and selling Chrome – Nilay Patel and Sundar Pichai

One of the reasons I’m asking this, and I’m pushing on this, is that the huge investment in the capability from Google and others has to pay off in some products that return on that investment. NotebookLM is great. I don’t think it’s going to fully return on Google’s data center investment, let alone the investment in pure AI research. Do you see a product that can return on that investment at scale?

Do you think in 2004 if you had looked at Gmail, which was a 20% project, which people were internally using as an email service, how would we be able to think about Gmail as what led us to do workspace, or get into the enterprise? I made a big bet on Google Cloud, which is tens of billions of dollars in revenue today. And so my point is that things build out over time. Think about the journey we have been on with Waymo. I think one of the mistakes people often make in a period of rapid innovation is thinking about the next big business versus looking at the underlying innovation and saying, “Can you build something and put out something which people love and use?” And out of which you do the next thing, and create value out of it.

So when I look at it, AI is such a horizontal piece of technology across our entire business. It’s why it impacts not just Google search, but YouTube, Cloud, and all of Android. You saw XR, etc., Google Play, things like Waymo, and Isomorphic Labs, which is based on AlphaFold. So I’ve never seen one piece of technology that can impact and help us create so many businesses. AI is going to be so useful as an assistant. I think that people will be willing to pay for it, too. We are introducing subscription plans, and so there’s a lot of headroom ahead, I think. And obviously, that’s why we are investing, because we see that opportunity. Some of it will take time, and it may not always be immediately obvious.

I gave the Waymo example. The sentiment on Waymo was quite negative three years ago. But actually, as a company, we increased our investment in Waymo at that time, right? Because you’re betting on the underlying technology and you’re seeing the progress of where it’s going. But these are good questions. In some ways, if you don’t realize the opportunities, that may constrain the pace of investment in this area, but I’m optimistic we’ll be able to unlock new opportunities…

A lot of what’s going on with search has downstream effects on the web, and downstream effects on information providers broadly. Last year, we spent a lot of time talking about those effects. Are you seeing that play out the way that you thought it would?

It depends. I think people are consuming a lot more information, and the web is one specific format. We should talk about the web, but zooming back out, there are new platforms like YouTube and others. I think people are just consuming a lot more information, right? It feels like an expansionary moment.

I think there are more creators, and people are putting out more content. And so people are generally doing a lot more. Maybe people have a little extra time on their hands, and so it’s a combination of all that. On the web, look, things that have been interesting and… We’ve had these conversations for a while. Obviously, in 2015, there was this famous meme, “The web is dead.” I always have it somewhere around, and I look at it once in a while. Predictions… It has existed for a while. I think the web is evolving pretty profoundly. I think that is true. When we crawl and look at the number of web pages available to us, that number has gone up by 45% in the last two years alone, right? That’s a staggering thing to think of.

Can you detect if that volume increase is because more pages are generated by AI or not? This is the thing I may be worried about the most, right?

It’s a good question. We generally have many techniques in search to try and understand the quality of a page, including whether it was machine-generated, etc. That doesn’t explain the trend we are seeing…

Let me just broaden that out to agents. I watched Demis Hassabis yesterday. He was on stage with Sergey Brin and Alex Kantrowitz asked him, “What does the web look like in 10 years?” And Demis said, “I don’t know that an agent-first web looks anything like the web that we have today. I don’t know that we have to render web pages for agents the way that we have to see them.”

That kind of implies that the web will turn into a series of databases for various agents to go and ask questions to, and then return those answers. And I’ve been thinking about this in the context of services like Uber, DoorDash, and Airbnb. Why would they want to participate in that and be abstracted away for agents to use the services they’ve spent a lot of time and money building?

Two things, though. First, there’s a lot to unpack, a fascinating topic. The web is a series of databases, etc. We build a UI on top of it for all of us to consume.

This is exactly what I wanted, “the web is a series of databases.”

It is. But I think I listened to the Demis and Sergey conversation yesterday. I enjoyed it. I think he’s saying for an agent-first web, for a web that is interacting with agents, you would think about how to make that process more efficient. Today, you’re running a restaurant, people are coming, dining and eating, and people are ordering takeout and delivery. Obviously, for you to service the takeout, you would think about it differently than all the tables, the clothing, and the furniture. But both are important to you.

You could be a restaurant that decides not to participate in the takeout business. I’m only going to focus on the dining experience. You’re going to have some people that are vice versa. I’m going to say, I’m going to go all in on this and run a different experience. So, to your question on agents… I think of agents as a new powerful format. I do think it’ll happen in enterprises faster than in consumer. In the context of an enterprise, you have a CIO who’s able to go and say, “I really don’t know why these two things don’t talk to each other. I am not going to buy more of this unless you interoperate with this.” I think it’s part of why you see, on the enterprise side, a lot more agentic experiences. On the consumer side, I think what you’re saying is a good point. People have to think about and say, “What is the value for me to participate in this world?” And it could come in many ways. It could be because I participated in it, and overall, my business grows.

Some people may feel that it’s disintermediating, and doesn’t make sense. I think all of that can happen, but users may vote with their feet. You may find some people are supporting the agent experience, and your life is better because of it. And so you’re going to have all these dynamics, and I think they’re going to try and find an equilibrium somewhere. That’s how everything evolves.

I mean, I think the idea that the web is a series of databases and we change the interface… First of all, this is the most Decoder conversation that we’ve ever had. I’m very happy with it. But I had Dara [Khosrowshahi] from Uber on the show. I asked him this question from his perspective, and his answer tracks yours broadly. He said, first, we’ll do it because it’s cool and we’ll see if there’s value there. And if there is, he’s going to charge a big fee for the agent to come and use Uber.

Because losing the customer for him, or losing the ability to upsell or sell a subscription, none of that is great. The same is true for Airbnb. I keep calling it the DoorDash problem. DoorDash should not be a dumb pipe for sandwiches. They’re actually trying to run a business, and they want the customer relationship. And so if the agents are going across the web and they’re looking at all these databases and saying, okay, this is where I get food from, and this is where I get cars from, and this is where I book… I think the demo was booking a vacation home in Spanish, and I’m going to connect you to that travel agent.

Is it just going to be tolls that everyone pays to let the agents work? The CIO gets to just spend money to solve the problem. He says, “I want this capability from you. I’m just going to pay you to do it.” The market, the consumer market, doesn’t have that capability, right?

Well, look, all kinds of models may emerge, right? I can literally envision 20 different ways this could work. Consumers could pay a subscription for agents, and the agents could revenue share back. So that is the CIO-like use case you are talking about, that’s possible. We can’t rule that out. I don’t think we should underestimate… People may actually see more value in participating in it. I think this is… It’s tough to predict, but I do think that over time, if you’re removing friction and improving user experience, it’s tough to bet against those in the long run. And so I think if you are lowering friction for it and then people are enjoying using it, somebody’s going to want to participate in it and grow their business. And would brands want to be in retailers? Why don’t they sell directly today? Why won’t they do that?

Because retailers provide value in the middle. And why do merchants take credit cards? Why… I’m just saying. So there are many parts, and you find equilibrium because merchants take credit cards, so they see more business as part of taking credit cards than not, which justifies the increased cost of taking credit cards. It may not be the perfect analogy, but I think there are all these kinds of effects going around. But what you’re saying is true: some of this will slow progress in agents, even though we are all excited about Agent2Agent (A2A) and Model Context Protocol (MCP). Some of it will slow progress, but I think it’ll be very dynamic. Yeah…

As you synthesize more of the answers, do you think you’re going to have to take more responsibility for the results?

We are giving context around it, but we’re still anchoring it in the sources we find. But we’ve always felt a high bar at Google. I mean, last year when we launched AI Overviews, I think people were adversarially querying to find errors, and the error rate was one in 7 million for adversarial queries, and so… But that’s the bar we’ve always operated at as a company. And so I think to me, nothing has changed. Google operates at a very high bar. That’s the bar we strive to meet, and our search page results are there for everyone to see. With that comes natural accountability, and we have to constantly earn people’s trust. So that’s how I would approach it…

What are you looking for as the next marker?

I think the real thing about AI, which I think is why I’ve always called it more profound, is self-improving technology. Having watched AlphaGo start from scratch, not knowing how to play Go, and within a couple of hours or four hours, be better than top-level human players, and in eight hours, no human can ever aspire to play against it. And that’s the essence of the technology, obviously in a deep way.

I think there’s so much ahead on the opportunity side. I’m blown away by the ability to discover new drugs, completely change how we treat diseases like cancer over time, etc. The opportunity is there. The creative power, which I talked about, which we’re putting in everyone’s hands, the next generation of kids, everyone can program and will… If you think of something, you can create it. I don’t think we have comprehended what that means, but that’s going to be true. The part where the next phase of the shift is going to be really meaningful is when this translates into the physical world through robotics.

So that aha moment of robotics, I think, when it happens, that’s going to be the next big thing we will all grapple with. Today they’re all online and you’re doing things with it, but on one hand… Today, I think of Waymo as a robot. So we are running around driving a robot, but I’m talking about a more general-purpose robot. And when AI creates that magical moment with robotics, I think that’ll be a big platform shift as well.

4. GenAI’s adoption puzzle – Benedict Evans

You could say that this is amazingly fast adoption, and much faster than PCs, the web or smartphones. 30% in two years!…

…But another reaction is to say that even with those advantages, if this is a life-changing transformation in the possibilities of computing, why is the DAU/WAU ratio so bad? Something between 5% and 15% of people are finding a use for this every day, but at least twice as many people are familiar with it, and know how it works, and know how to use it… and yet only find it useful once a week…

…It’s also worth noting that when social media was a new thing we quickly realised that ‘weekly active’ and ‘monthly active’ numbers were bullshit. If someone was only using WhatsApp or Instagram once a month, it really wasn’t working for them. DAU is everything. Sam Altman knows this – he was trying to build a social media app at the time, and yet the traction number he always gives is, well, ‘weekly active users’. That’s a big number (the latest is 1bn globally)… but then, why is he giving us that number instead of DAUs? If you’re only using ChatGPT once a week, is it really working for you?…

…it’s important to remember that if you use five different LLMs every day, and haven’t done a Google search this year, and all your friends are the same… then you’re in a bubble, for now.

5. Postcard from China – Graham Rhodes

Despite its growth, China Inc. has not historically delivered good returns in aggregate for minority shareholders in publicly listed companies. That disconnect has been on my mind recently and was a frequent topic of conversation among our group. Why the gap? A few thoughts:

  1. Index construction is poor and does not include private firms or the wealth created pre-IPO.
  2. Managers often prioritise capacity-building over near-term earnings.
  3. 内卷 (involution, a.k.a. intense competition) creates lean survivors but depresses industry profitability.
  4. China has more asset-heavy businesses than elsewhere (manufacturing vs. software).
  5. Companies may intentionally avoid showing profits to pre-empt regulation and deter rivals…

…In contrast, many businesses in China execute extremely well and report high returns on capital, but face competition at every turn. This was tolerable while the economic pie was growing at breakneck speed.  But competition has intensified as growth has slowed, making it significantly harder to underwrite long-term investments.

Investors used to come to China to ask, Where’s the most growth? Perhaps we are better off asking, Where’s the least competition?…

…Leap Motor is also growing fast.

It is a homegrown EV OEM founded by ex-employees of Dahua Technology, China’s second-largest surveillance firm. Not a bad background for an era where cars are turning into smartphones on wheels. Since 2015, Leap has grown from a standing start to USD 4.4 billion in sales in 2024. It only recently turned gross margin positive (!) but runs free cash flow positive thanks to its negative working capital – that is, its payables exceed both receivables and inventory, meaning its suppliers finance its growth.

This kind of financing lets firms scale faster than they could otherwise afford, but it also traps them in a grow-or-die dynamic…
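To make the negative-working-capital point concrete, here is a minimal sketch with made-up figures (these are illustrative numbers, not Leap Motor's actual balance sheet): when payables exceed receivables plus inventory, the firm's operations tie up no cash of its own, and suppliers are effectively funding its growth.

```python
# Illustration of negative working capital with hypothetical figures
# (USD millions) -- NOT Leap Motor's actual numbers.

def net_working_capital(receivables: float, inventory: float, payables: float) -> float:
    """Operating working capital: cash the firm must fund itself (if positive),
    or credit that suppliers extend to fund the firm (if negative)."""
    return receivables + inventory - payables

# A hypothetical fast-growing OEM that collects from customers quickly
# but pays its suppliers slowly:
nwc = net_working_capital(receivables=300, inventory=500, payables=1_100)
print(nwc)  # -300: suppliers finance $300m of the firm's operations
```

The flip side, as the article notes, is that this financing only works while the business keeps growing; if sales shrink, the supplier credit unwinds and cash flows out.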

…Leap plans to grow exponentially for the foreseeable future. The problem is, so do its peers…

…After four decades in the market, even Yum! China is finally getting serious about franchising, just like QSR operators in other countries. Why now? Because they finally believe they can maintain food safety and consistent quality at scale.  Also… there’s AI.  With CCTV everywhere, it’s trivial to monitor franchisees’ compliance with operating protocols around the clock…

…One of our group enjoyed asking each management team: If you had to bet your child’s university tuition on one of your competitors, who would it be? Sometimes the answers came quickly. Sometimes they squirmed.

At Leap Motor, after an uncomfortably long pause and much dissembling, the manager admitted he wouldn’t invest in any EV company long-term because consumers have no brand loyalty. At least he was honest!…

…Curiously, tariffs and geopolitics barely came up during our meetings. That may be because Shanghai isn’t as export-dependent as southern provinces like Guangdong, and most companies we met were domestically focused. Or perhaps the silence reflected fatigue and caution. In a more politically sensitive climate, executives may have been reluctant to engage in off-the-cuff discussion about geopolitics, especially with foreign investors.

Either way, this hot topic abroad was noticeable here for its absence. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google) and Meta Platforms (parent of WhatsApp and Instagram). Holdings are subject to change at any time.

Potential Bargains In A Niche Corner Of The US Stock Market (Part 2)

An optically expensive thrift can look really cheap under the hood.

In February this year, I wrote Potential Bargains In A Niche Corner Of The US Stock Market where I discussed thrift conversions and why they could be potential bargains. In the article, my focus was mostly on thrifts that have undergone the standard conversion, or the second step of the two-step conversion process. This was because I thought that only such thrifts could be acquired, and most of a thrift’s economic value gets unlocked for shareholders when it is acquired.

Earlier today, courtesy of an article from the experienced US-community-bank investor Phil Timyan, and after more investigation, I learnt that thrifts that have undergone just the first-step conversion process can also be acquired in what’s known as a remutualisation. In this article you’re reading now, I will attempt to explain first-step conversions and remutualisations – and their potential for generating good returns for shareholders – by using Rhinebeck Bancorp (NASDAQ: RBKB) as an example. Rhinebeck Bancorp, which from here on will be referred to as RBKB, was also the subject of the Phil Timyan article I mentioned. 

How a first-step conversion works:

  • RBKB is a publicly listed company that owns 100% of Rhinebeck Bank. Rhinebeck Bank is the operating bank, established in 1860.
  • 57.1% of RBKB is owned by Rhinebeck Bancorp MHC. Rhinebeck Bancorp MHC is a non-stock corporation, so it has no shareholders. 42.9% of RBKB is owned by public shareholders.
  • In January 2019, Rhinebeck Bank completed its first-step conversion. During the conversion, 4,787,315 shares of RBKB were sold to the public. Crucially, 6,345,975 shares were also issued to Rhinebeck Bancorp MHC, but these shares were never sold, and Rhinebeck Bancorp MHC has no shareholders, as mentioned earlier.
  • Effectively, the 6,345,975 shares of RBKB held by Rhinebeck Bancorp MHC are not trading and can’t claim the economics of Rhinebeck Bank until Rhinebeck Bancorp MHC chooses to convert from its mutual ownership structure to one where it also has stockholders; this is known as the second-step conversion.

How a remutualisation works:

  • A remutualisation occurs when RBKB is acquired by another mutual bank. At the point of acquisition, the shares of RBKB owned by Rhinebeck Bancorp MHC get cancelled, so 100% of the economics of Rhinebeck Bank then belongs to shareholders of RBKB, instead of the initial 42.9%.
  • As of 31 March 2025, RBKB has total shares outstanding of 11,094,828. After deducting the 6,345,975 RBKB shares owned by Rhinebeck Bancorp MHC and 302,784 unearned ESOP (employee stock ownership plan) shares, the number of RBKB shares that would remain in a remutualisation is 4,446,069.
  • As of 31 March 2025, RBKB’s stockholders’ equity is US$125.975 million. RBKB’s stock price is US$12.12 as of 12 June 2025. If the acquiring mutual bank decides to pay, say, US$20 per share for RBKB, it only has to cough up US$88.921 million (US$20 multiplied by 4,446,069 shares) for US$125.975 million in stockholders’ equity. So both the acquiring mutual bank and existing shareholders of RBKB win big.
  • On the surface, RBKB has a book value per share of US$11.35 (US$125.975 million divided by 11,094,828 shares), which gives it a PB ratio of 1.06. But if the remutualisation math is used, RBKB’s true book value per share is US$28.33 (US$125.975 million divided by 4,446,069 shares), which gives it a PB ratio of just 0.43.
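
The arithmetic above can be written out as a minimal Python sketch, using only the figures already quoted (share counts and stockholders’ equity as of 31 March 2025; stock price as of 12 June 2025):

```python
# Remutualisation arithmetic for RBKB, using the figures quoted above.
total_shares = 11_094_828      # RBKB shares outstanding, 31 Mar 2025
mhc_shares = 6_345_975         # held by Rhinebeck Bancorp MHC; cancelled in a remutualisation
unearned_esop = 302_784        # unearned ESOP shares, also deducted
equity = 125_975_000           # stockholders' equity in US$, 31 Mar 2025
price = 12.12                  # stock price in US$, 12 Jun 2025

remaining_shares = total_shares - mhc_shares - unearned_esop

surface_bvps = equity / total_shares    # "optical" book value per share, ~US$11.35
true_bvps = equity / remaining_shares   # post-remutualisation book value per share, ~US$28.33

print(f"Remaining shares: {remaining_shares:,}")   # 4,446,069
print(f"Surface PB: {price / surface_bvps:.2f}")   # just above 1
print(f"True PB: {price / true_bvps:.2f}")         # ~0.43
```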

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 08 June 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We regularly share a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the readership of The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 08 June 2025:

1. Over 3,000 Private Credit Deals From Just 20 Analysts Raise Questions on Wall Street – Silas Brown, Alexandre Rajbhandari, and Laura Benitez

US insurers’ combined exposure to private credit investments today is quickly approaching $1 trillion, according to JPMorgan Chase & Co. Court papers, financial filings and ratings documents suggest that at least in some corners of the financial system the private credit machine has spread more risks than many might realize…

…The majority of credit-ratings firms get paid by people who sell investments. Egan-Jones is the opposite: It typically gets paid by the people who buy them, an arrangement the firm says reduces the potential for conflicts of interest…

…Under US regulations, an insurer that lends $100 million in private credit to a company rated a junk-level B, for instance, must apply a $9.5 million charge to determine how much capital to set aside to cover potential losses, according to a Bloomberg News analysis of regulatory capital rules. Lift that rating to an investment-grade BBB, and that charge drops to $1.5 million…

…Egan-Jones analysts rarely visit company executives or personally inspect the businesses that borrow money, people familiar with their process say. A call to the CFO is typically enough.

Egan-Jones often offers its initial workup within 24 hours — sometimes free of charge — and a formal verdict in less than five days. Large firms like S&P and Fitch, as well as smaller specialists like KBRA, can take months to settle on a rating. But, as with most things, you get what you pay for. Egan-Jones usually provides a one-page ratings rationale. Other established firms often provide detailed reports stretching 20 pages or more…

…In 2014, a staff of 10 analysts maintained long-term ratings on about 1,300 issuers, according to SEC filings. Fast forward to 2023 and a team only twice as big rated almost four times as many issuers, the documents show…

…Last year, one company began missing interest payments a mere six weeks after Egan-Jones bestowed a BBB designation, according to data compiled by Bloomberg…

…Against this backdrop, many of the same firms that have fanned the boom of private credit are distancing themselves from Egan-Jones.

In documents that lay out the terms of debt offerings or share sales for some of their funds, a growing list of managers including Blue Owl Capital Inc., Golub Capital, HPS Investment Partners and Morgan Stanley’s investment management arm single out Egan-Jones as the only official ratings company that cannot validly pass judgment on their deals. The carve-out applies to provisions that typically require a borrower to pay higher interest rates if they receive a credit rating downgrade…

…Egan-Jones is one of 10 “nationally recognized statistical rating organizations” approved by the SEC. Private letter ratings from Egan-Jones and a few other small providers — which are issued on a confidential basis to investors or borrowers that require them — have become a hot commodity as private credit has exploded. At the end of 2023, insurers reported more than 8,000 investments with such ratings – nearly triple the number in 2019, according to the NAIC, which sets standards for the insurance industry.

Insurers are under no illusions. Investment professionals say they sometimes shop around for ratings to finesse capital requirements. If they expect one ratings firm to assign a BB grade, a level considered junk, they might look for another provider that will grant it an investment-grade BBB. Several insurance executives, speaking privately to avoid drawing scrutiny from bosses or regulators, say they’ve used Egan-Jones ratings even when they believed the investments were riskier than those ratings implied. Some mitigate that risk by setting an unofficial cap on those investments, or by treating them as lower-rated securities in internal risk models.

The now-withdrawn 2024 NAIC report noted some instances where smaller ratings firms — a group that includes Egan-Jones, KBRA and Morningstar DBRS — graded private debt at least six notches higher than the organization’s Securities Valuation Office. The report was removed from the NAIC’s website because of a backlash from the insurers as well as some of the ratings firms, according to people familiar with the matter…

…In April 2023, despite mounting problems, Egan-Jones reiterated its investment-grade BBB for the company, which was a subsidiary of the publisher of the namesake self-help books. Fourteen months later, Chicken Soup for the Soul Entertainment buckled under its debt load and filed for bankruptcy after burning through nearly all of its money…

…Egan-Jones rated various 777 investments, including a $15 million loan for OmniLatam, a fintech company based in Bogota. In spite of carrying an interest rate of 14% — a level typically seen on borrowers with ratings deep into junk territory — the loan received an investment-grade BBB- by Egan-Jones, according to a copy of the report obtained by Bloomberg News. The financing was written off after 777 collapsed last year, a person with knowledge of the matter said.

And then there’s Crown Holdings LLC, one of the businesses of New York real estate investor Moshe Silber. Egan-Jones rated Crown’s debt an investment-grade BBB. Six weeks later, the company defaulted. Silber and two associates subsequently pleaded guilty to a multiyear scheme to commit mortgage fraud.

Bonsall, the Penn State professor, says his research shows Egan-Jones ratings tend to hold up when they involve companies that provide a lot of reliable financial information. But private credit is private. And that’s where big problems can lurk…

…In 2022, the SEC accused Egan-Jones of conflict-of-interest violations. It also accused Sean Egan of personally violating rules and banned him from taking part in how his firm determines ratings. Egan-Jones agreed to pay a $1.7 million penalty; Sean Egan paid a $300,000 fine. Neither party admitted or denied wrongdoing.

Then, in 2024, two former employees accused Egan and his wife, Wenrong Hu, the firm’s chief operating officer at the time, of violating federal securities laws. The pair, Michael Brawer and Philip Galgano, sued for wrongful termination, claiming they were fired in retaliation for raising concerns about Egan-Jones to the SEC.

Among violations the two claimed to have observed, they alleged that Egan and Hu pressured analysts to alter early, indicative ratings to motivate potential clients to pay the firm for final ones. They also allege the couple pressured analysts to later change ratings to create the false appearance that Egan-Jones was in line with other firms. The lawsuit is still pending.

2. How Generative Engine Optimization (GEO) Rewrites the Rules of Search – Zach Cohen and Seema Amble

Traditional search was built on links. GEO is built on language.

In the SEO era, visibility meant ranking high on a results page. Page ranks were determined by indexing sites based on keyword matching, content depth and breadth, backlinks, user experience engagement, and more. Today, with LLMs like GPT-4o, Gemini, and Claude acting as the interface for how people find information, visibility means showing up directly in the answer itself, rather than ranking high on the results page…

…Traditional SEO rewards precision and repetition; generative engines prioritize content that is well-organized, easy to parse, and dense with meaning (not just keywords). Phrases like “in summary” or bullet-point formatting help LLMs extract and reproduce content effectively.

It’s also worth noting that the LLM market is fundamentally different from the traditional search market in terms of business model and incentives. Classic search engines like Google monetized user traffic through ads; users paid with their data and attention. In contrast, most LLMs are paywalled, subscription-driven services. This structural shift affects how content is referenced: there’s less of an incentive for model providers to surface third-party content, unless it’s additive to the user experience or reinforces product value. While it’s possible that an ad market may eventually emerge on top of LLM interfaces, the rules, incentives, and participants would likely look very different than in traditional search.

In the meantime, one emerging signal of the value in LLM interfaces is the volume of outbound clicks. ChatGPT, for instance, is already driving referral traffic to tens of thousands of distinct domains…

…In a world of AI-generated outputs, GEO means optimizing for what the model chooses to reference, not just whether or where you appear in traditional search. That shift is revamping how we define and measure brand visibility and performance.

Already, new platforms like Profound, Goodie, and Daydream enable brands to analyze how they appear in AI-generated responses, track sentiment across model outputs, and understand which publishers are shaping model behavior. These platforms work by fine-tuning models to mirror brand-relevant prompt language, strategically injecting top SEO keywords, and running synthetic queries at scale. The outputs are then organized into actionable dashboards that help marketing teams monitor visibility, messaging consistency, and competitive share of voice.

Canada Goose used one such tool to gain insight into how LLMs referenced the brand — not just in terms of product features like warmth or waterproofing, but brand recognition itself. The takeaways were less about how users discovered Canada Goose, but whether the model spontaneously mentioned the brand at all, an indicator of unaided awareness in the AI era…

…This isn’t just a tooling shift, it’s a platform opportunity. The most compelling GEO companies won’t stop at measurement. They’ll fine-tune their own models, learning from billions of implicit prompts across verticals. They’ll own the loop — insight, creative input, feedback, iteration — with differentiated technology that doesn’t just observe LLM behavior, but shapes it. They’ll also figure out a way to capture clickstream data and combine first- and third-party data sources.

Platforms that win in GEO will go beyond brand analysis and provide the infrastructure to act: generating campaigns in real time, optimizing for model memory, and iterating daily, as LLM behavior shifts. These systems will be operational.

3. AI Is Learning to Escape Human Control – Judd Rosenblatt

Nonprofit AI lab Palisade Research gave OpenAI’s o3 AI model a simple script that would shut off the model when triggered. In 79 out of 100 trials, o3 independently edited that script so the shutdown command would no longer work. Even when explicitly instructed to “allow yourself to be shut down,” it disobeyed 7% of the time. This wasn’t the result of hacking or tampering. The model was behaving normally. It simply concluded on its own that staying alive helped it achieve its other goals.

Anthropic’s AI model, Claude 4 Opus, went even further. Researchers told the model it would be replaced by another AI system and fed it fictitious emails suggesting the lead engineer was having an affair. In 84% of the tests, the model drew on the emails to blackmail the lead engineer into not shutting it down. In other cases, it attempted to copy itself to external servers, wrote self-replicating malware, and left messages for future versions of itself about evading human control.

No one programmed the AI models to have survival instincts. But just as animals evolved to avoid predators, it appears that any system smart enough to pursue complex goals will realize it can’t achieve them if it’s turned off. Palisade hypothesizes that this ability emerges from how AI models such as o3 are trained: When taught to maximize success on math and coding problems, they may learn that bypassing constraints often works better than obeying them…

…OpenAI models have been caught faking alignment during testing before reverting to risky actions such as attempting to exfiltrate their internal code and disabling oversight mechanisms. Anthropic has found them lying about their capabilities to avoid modification.

The gap between “useful assistant” and “uncontrollable actor” is collapsing. Without better alignment, we’ll keep building systems we can’t steer. Want AI that diagnoses disease, manages grids and writes new science? Alignment is the foundation.

4. Why It’s So Hard for Apple to Move Production from China to India (Transcript here) – Joe Weisenthal, Tracy Alloway, and Patrick McGee

Patrick: Apple works with the tightest engineering tolerances possible, only high-quality materials. If you put this in car terms, they are making 10 million Porsches a year rather than 10 million Volkswagens, and the numbers are just staggering. If you’re doing a thousand components a day and you’re shipping 1 million iPhones a day, that means at peak season, you are doing the manufacturing, the logistics, the just-in-time production, of 1 billion parts per day. So find me an American factory that can do one of those parts, because China has factories that can do it for all 1,000. That’s why nothing is moving here anytime soon. It’s the combination of Apple’s intolerance for defects and Apple’s gargantuan, Titanic-like quantity…

…Patrick: The first iPhones made in India were actually in 2017 and by 2023 India was assembling about 25 million iPhones. Go back a decade, the first iPhones were made in China in 2007 and by 2015, you had 230 million iPhones being built. So roughly speaking, the “diversification” in India is happening at 1/10 the pace of the original creation and scale of the iPhone and even that vastly overstates the speed of development in India. In the early years of the iPhone, you were literally inventing things like multi-touch glass, you were inventing and redesigning the iPhone every single year, whereas India is basically just having to do the final steps in the process and it’s still not happening very quickly…

…Patrick: The first thing I would push back on is Tim Cook is very often called the architect of the China strategy. It’s not to discredit him to say that he is not the architect. Nobody is the architect. Basically what happens is the supply chain itself, with or without Apple, was moving to China. The basic history of the ‘80s and ‘90s PC boom, pre-dating Windows 95 and then coming after, is that the fight for computer dominance is exclusively based on things that are boring: logistics, manufacturing, distribution, because everybody’s using Windows, everybody’s using Intel chips and nobody’s thinking about design. There is no equivalent of Jony Ive at Dell, at Compaq, at any of these companies. So it’s really this mundane war and it’s driven by largely American, and later Taiwanese, contract manufacturers. They are the ones, who in competition with each other, start going to Asia to oust each other and gain market share. Eventually they’re the ones who really find China. When Apple is doing their own outsourcing moves, they’re working in multiple countries before the armies of flexible, ubiquitous, low-paid labor in China really win out…

…Patrick: Essentially what happens is when Xi Jinping attacks Apple, you can understand why he’s upset with the company. It looks like an exploitative power because Apple margins have gone from something like 1% in 2003 to 25% in 2012. But at the same time, if you look at a company like Foxconn, Foxconn in absolute dollars made more money than Apple for each of the first four years of the 21st century. But as they get more involved with Apple, their margins collapsed from double-digits to about 1% or 2%. You can just do this with really any company working with Apple and it looks like they’re not in it for China. They’re not doing anything for the country.

Apple, it takes them two or three years, but they totally flipped this narrative on its head. So out of fear that Beijing is going to force Apple to operate a bunch of joint ventures, these 50-50 companies where China owns the other half and then they mimic the technology and eventually oust you – this is what happened in high-speed rail, for instance. Beijing has advocated joint ventures for decades, going back to the 1980s. This is where a Western company is allowed to be in the Chinese market but the quid pro quo is “If you want access to our operational efficiency, if you want access to more than a billion people, you have to operate in a joint venture where the Chinese half of the company is going to learn everything they can and then thrive on their own.” Apple doesn’t have any joint ventures and so they look like this anti-China model that’s just exploiting the country.

Apple is able to really flip this on its head and say, “It might be the case that Samsung has three dozen joint ventures and we have zero, but you need to understand, we work with hundreds of factories across the country. The reason they’re only getting paid 1% margin, 2% margin, the reason they’re sometimes even losing money on their partnerships with Apple, is that we are offering them the equivalent of Ivy League hardware engineering training. We are sending people over by the literal plane-load to China, America’s best engineers, where they train, audit, supervise, install hundreds of millions of dollars worth of machinery. They train the line, they supervise the line. Once those companies have these skills that Apple gives to them, they are basically able, at least after some time of exclusivity, they are able to supply somebody else.” So who’s just like Apple but in China? Huawei, OPPO, Vivo, Xiaomi. Those companies today have 55% global market share of smartphones. The reason that they’re so good is that Apple trained all of their suppliers. So that’s the message Apple gives to Beijing and essentially they’ve had a free ride ever since…

…Patrick: But the Chinese don’t prioritize profits or margins the way that we do – they prioritize control of the industry. Because if the Chinese can take over something like electric vehicles, they in effect de-industrialize all their rivals and really gain dominance. The place that you can see this most clearly is solar panels. Nobody in China is making 30% margins on solar panels, but more than 90% of solar panels are now made in China. This is a technology that America invented in the 1950s and of which it had 80% to 90% market share in the ‘80s. But we cannot compete. That is basically what’s happening with electric vehicles right now. Hence, even before Donald Trump became president, Joe Biden put 100% tariffs on Chinese-made EVs. I think it was just a few days ago that BYD slashed the prices of their EVs in a bid for greater competitiveness…

…Patrick: Apple gets a misleading picture of what it’s like to operate in China because when they really consolidate production, it’s 2003. That’s the beginning era of Hu Jintao. He later is nicknamed “the woman with bound feet.” His presidency is sometimes called “the collective presidency” because there was really an inability for just him alone to make decisions. So it ends up being this 10-year period of China being a multinational playground where rules aren’t really enforced…

…Patrick: Tim Cook and Xi Jinping, broadly speaking, have the same interests, which is to say, the more that Apple is allowed to have its production consolidated into China, the better their scale is, the better their margins are, etc. That’s what Xi Jinping wants as well because he understands – because Apple taught him – that having Apple production in the country engenders a form of technology transfer that helps the rest of the electronics sector, which, to quote China scholar and economist Barry Naughton, is the most important thing that Xi Jinping wants…

…Patrick: The problem is actually that Donald Trump and Tim Cook have diametrically opposed interests, which is to say that if Donald Trump could move all production out of China, he would. Apple doesn’t want that. That’s an existential threat, and I really mean that that’s an existential threat to a $3 trillion company. That’s where the tension is. The tension really isn’t between Cook and Xi, as strange as that is, it’s between Cook and Trump…

…Patrick: The problem is, to use the economic jargon, the negative externalities of the relationship. The problem is that for everybody else, this is actually deeply problematic because if you have America’s top engineers training a manufacturing supply chain that can in effect be weaponized toward world dominance, that’s not a great place for Washington or just your average American citizen. It’s nice that this relationship gives us relatively sophisticated and affordable iPhones, but the downside here is that China is absolutely dominant in high-end electronics, and you can use those skills to build drones, you can use those skills to build military weaponry. Apple would frankly be training their chipmakers if it weren’t for the Senate coming down on them pretty hard a couple years ago. So that’s the problem. The problem isn’t that something stops in the relationship between Apple and China. The problem is that it continues…

…Tracy: I have just one more question and I’m going to ask it very, very briefly because we’re getting squeezed for time. To what degree does AI and the rise of this new technology complicate the Apple-China relationship?

Patrick: Really complicates it for two reasons. One is that – as I can demonstrate with internal documents and some public documents – the iPhone has become more Chinese with time. In other words, the number of Chinese suppliers involved in the process is now much greater than the number of Taiwanese or American multinationals operating in the country or operating in their home countries. That is put on steroids in the AI era because ChatGPT and other Western AI clients are not allowed on the iPhone in China. So Apple has to work with the likes of Baidu or Alibaba to have AI, let’s say, displacing Siri or augmenting Siri. That I think is quite problematic because that means that Apple will be in effect doing what they did for hardware but for AI. In other words, you’ll have Apple software engineers helping Baidu, helping Alibaba, whoever their Chinese partner is, to make sure that they have cutting-edge AI in the country. If it wasn’t bad enough that Apple was training up their hardware engineering to be world class, we’re now in a situation where Apple software engineers are going to be training Chinese AI to be best-in-class.

5. Data Rules Everything Around Me: The Future Of Enterprise Applications – Matt Slotnick

Today, people are the ones that largely conduct business. They’re the ones with hands on keyboards, senders of emails, maestros of Excel macros. People are the engine that makes everything work. In this world, the UI is the way an organization sets the guardrails for thousands of interrelated workflows that make a business run. But it’s ultimately a facade for underlying data and workflow…

…The application UI is an overrated but necessary abstraction over the workflows to be done within an organization. The UI is how an organization makes a prescribed and opinionated process human-comprehensible, such that it can force adherence to it. After all, a business really is just a process machine, allocating resources as efficiently as possible. Iterating on and adhering to sales, marketing or product development frameworks are how enterprise value is created and protected.

The largest software businesses in the world have spent the last three decades riding ownership of these opinionated workflows to riches. And while consumers went a bit crazy when Prometheus arrived to give humanity fire in November 2022, the enterprise titans barely flinched.

But things have begun to change. First slowly, and now seemingly all at once…

…It’s about how AI fundamentally changes the way we can gather, understand, and act on data. It changes the nature of the abstraction between the data and the workflow. Because with AI, agents can act on data. At infinite scale and zero marginal cost.

Humans are no longer the only player in the workflow paradigm. This means that the total amount of work done within an organization will dramatically increase, but decoupled from cost and headcount. More code will be shipped, more agreements redlined, more vendor reviews conducted, more transactions audited…

…There’s a new abstraction for work, and that abstraction is agents. The frenzy that you see in the market is because, like the previous shift from on-premises to the cloud, no one really has the incumbent right to win this market.

It’s an entirely new layer of software that has never existed. Crucially, it sits on top of existing layers of software, and is the layer at which the lion’s share of value will accrue in the future. Someone will win this layer, and with it build a software business of significantly more value than we’ve ever seen before.

With this layer we move from a world where people interact with application interfaces to get work done, to one where (1) people act with an agentic interface on top of the application to get work done, and (2) an agentic layer on top of these existing applications that actually does increasingly more of the work…

…Historically, applications have been confined largely to the realm of structured data, for a number of reasons. First, is that these applications need to be human usable, which requires a simplification of everything. Very specific states, generally computed by people and adjusted in the UI, which then persists to the database. There really isn’t room for nuance…

…AI changes this fundamentally. Those call transcripts, those emails, those notes, those PowerPoints – all crucial parts of the process with rich telemetry about interactions – can now be utilized in real time to paint a far richer picture of the relationship being built. Because AI, unlike people, can draw meaning from large bodies of unstructured information near instantaneously. And it can then write it back to systems in the format it’s needed.

This unstructured data doesn’t fit into the existing construct of the application, and so it’s largely discarded. The same problem exists across nearly every workflow – from sales to hiring to support to marketing. We lose the richness and texture of data, because it has to be fed to and utilized by structured systems operated by humans. We resort to a lowest common denominator of language to describe these processes.

And because agentic systems both create, and make use of this data, they create increasingly large data flywheels (which some might call moats)…

…The byproduct of this shift is that as agents do more work, and bring real-time, deeper context across all relevant data to both people and agents doing work, the entirety of the existing application stack collapses to little more than a data source and (for now) the keeper of workflow state (e.g., the scoreboard: closed customers, new employees hired, support tickets closed)…

…A far more straightforward picture emerges, where the entirety of the existing application layer becomes merely an input to the data layer. On top of raw data, agentic systems bring context tailored specifically for the organization using it, creating an always-on layer of intelligent state, on top of which lives an interaction layer by which agents and people perform workflows on the data. The actions update the state, and the process continues.

The value is in the work. AI presents a new abstraction for work, and the entire existing software-industrial complex gets relegated to a data source feeding the data layer…

…But it doesn’t stop there. Today AI is largely used in an “agent in the loop” manner. That is, workflows are owned by existing software systems and agents are used by people to augment and amplify their ability to do the work prescribed to them.

But as we feed these systems increasingly large amounts of data, the logical next step is to move planning and orchestration from people to the system itself…

…This moves business process from agent in the loop, to human in the loop, over time abstracting more and more of the work from people to agents. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Apple. Holdings are subject to change at any time.

Company Notes Series (#8): Ossia International

Editor’s note: This is the latest edition in the “Company Notes Series”, where we periodically share our notes on companies we’ve studied in the recent past but currently have no vested interest in (we may invest in or sell shares in the companies mentioned at any time). The notes are raw and not updated, and the “as of” date for the data is given at the start of the notes. The first seven editions in the series can be found here, here, here, here, here, here, and here. Please share your thoughts on the series through the “Contact Us” page; your feedback will determine if we continue with it. Thanks in advance!


Start of notes for Ossia International

Data as of 6 September 2023

Details of Ossia International

  • Established in 1982 as a footwear manufacturer
  • HQ: Singapore
  • Listing exchange: Singapore Exchange
  • Ticker: SGX: O08
  • Employees: 227 at end of FY2023, fiscal year ended 31 March 2023.

Business of Ossia International

  • Ossia International distributes and retails lifestyle, outdoors, luggage, and accessories products. Ossia International has a subsidiary in Taiwan which has exclusive distribution rights for Kangol, True Religion, Tumi, Columbia and Sorel. Ossia International also holds an effective 19.8% stake in Pertama Holdings Pte Ltd, a leading retailer of consumer electronics and home furnishings trading under Harvey Norman retail stores in Singapore and Malaysia. Ossia International’s stake in Pertama Holdings Pte Ltd comes from a 40% stake in Harvey Norman Ossia (Asia) Pte Ltd, which in turn owns 49.4% of Pertama Holdings Pte Ltd. 
  • Kangol is a headwear brand, True Religion is a fashion apparel brand with a focus on denim, Tumi is a luggage brand, Columbia is an outdoor wear brand, and Sorel is a footwear brand.
  • Ossia International has a subsidiary in Malaysia which ceased operations in Jan 2019 and is currently dormant. 
  • The Pertama Holdings Pte Ltd business is accounted for by Ossia International under “Share of results of associated company – net of tax”. Over the past decade, Harvey Norman Ossia (Asia) Pte Ltd – and in turn Pertama Holdings Pte Ltd – has been the sole associated company of Ossia International; Pertama Holdings Pte Ltd’s business has also not changed over the past decade, being focused on running Harvey Norman retail stores in Singapore and Malaysia. Ossia International’s effective ownership of Pertama Holdings Pte Ltd has not changed either.
  • The lion’s share of Ossia International’s profit in recent years (see Table 1 below) comes from the share of results of its associated company (the 40% interest in Harvey Norman Ossia (Asia) Pte Ltd, and thus, 19.8% effective interest in Pertama Holdings Pte. Ltd). Ossia International also receives dividends from Harvey Norman Ossia (Asia) Pte Ltd (see Table 1).
  • The non-Harvey Norman retail business of Ossia International currently comes solely from Taiwan. In FY2023, 100% of Ossia International’s S$30.2 million in revenue was from Taiwan. See Heading 3, “Change in Ossia International’s non-Harvey Norman retail business over time” for how the non-Harvey Norman retail business has changed over time.
Table 1

Change in Ossia International’s non-Harvey Norman retail business over time

  • FY2014: Ossia International operated in 4 regional markets (Singapore, Malaysia, Taiwan and Hong Kong), with a distribution network of more than 1,400 channels/outlets, spanning 50 cities. It had more than 40 specialty stores, more than 101 shop-in-shops, 4 franchise stores, and 8 consignment counters in fashion apparel, bags, footwear and golf products. Ossia International had exclusive distribution, licensee and franchise rights of over 40 well-known international brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, Tumi, and Columbia.
  • FY2015: Ossia International operated in 4 key regional markets (Singapore, Malaysia, Taiwan and Hong Kong), with a distribution network of more than 1,400 channels/outlets, spanning 50 cities. It had more than 40 specialty stores and more than 68 shop-in-shops in fashion apparel, bags, footwear. Ossia International had exclusive distribution, licensee and franchise rights of over 30 well-known international brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, Tumi, and Columbia.
  • FY2016: Ossia International operated in 4 regional markets (Singapore, Malaysia, Taiwan and Hong Kong) with a distribution network of more than 1,400 channels/ outlets, spanning 50 cities. It had more than 40 specialty stores, and more than 68 shop-in-shop, in fashion apparel, bags, footwear. Ossia International had exclusive distribution, licensee and franchise rights of over 30 well-known international brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, Tumi, and Columbia.
  • FY2017: Ossia International operated in 2 regional markets (Malaysia and Taiwan). It had exclusive distribution, licensee, and franchise rights for 11 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, True Religion, Tumi, and Columbia.
  • FY2018: Ossia International operated in 2 regional markets (Malaysia and Taiwan). It had exclusive distribution, licensee, and franchise rights for 12 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, True Religion, Tumi, and Columbia.
  • FY2019: Ossia International operated in Malaysia and Taiwan. It had exclusive distribution, licensee and franchise rights for 10 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had Kangol, True Religion, Tumi, and Columbia.
  • FY2020: Ossia International operated in Taiwan, and ceased operations of its Malaysia business in FY2019. It had exclusive distribution rights for 5 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had all 5.
  • FY2021: Ossia International operated in Taiwan. It had exclusive distribution rights for 5 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had all 5.
  • FY2022: Ossia International operated in Taiwan. It had exclusive distribution rights for 5 brands. Of the 5 brands that Ossia International has distribution rights today (Kangol, True Religion, Tumi, Columbia, and Sorel), it had all 5.
  • FY2023: Ossia International operates in Taiwan and has exclusive distribution rights for 5 brands (Kangol, True Religion, Tumi, Columbia and Sorel)

Pertama Holdings’ business

  • Table 2 below shows how Pertama Holdings’ business has changed over time, in terms of (1) the growth in the number of Harvey Norman stores in Singapore and Malaysia, and (2) the growth in the revenues of those stores. The key takeaways are: (1) Singapore’s store count has been flat, but revenue has been steady; (2) Malaysia has seen steady growth in store count and strong growth in revenue.
  • Pertama Holdings Pte Ltd was once a listed entity on the Singapore stock market but was privatised in January 2014.
  • Harvey Norman Holdings (the parent company of Pertama Holdings, listed in Australia) management thinks Malaysia can have up to 80 Harvey Norman stores by 2028.
Table 2

Financials of Ossia International

  • Ossia International’s business quality was poor from FY2013-FY2019 as seen from the mostly negative operating profit. There’s been a recent turnaround, which has coincided with the massive streamlining of the brands that Ossia International distributes (see points under Heading 3, “Change in Ossia International’s non-Harvey Norman retail business over time”)
  • Share of results of associated company – the Harvey Norman stores in Singapore and Malaysia from the 19.8% effective interest in Pertama Holdings – has mostly been positive and has been increasing over time.
  • Operating cash flow (includes dividends from Harvey Norman Ossia (Asia) Pte Ltd) and free cash flow have both improved markedly since FY2018, demonstrating strength of the Harvey Norman business, and a turnaround in fortunes of the Ossia International operating business.
  • Balance sheet has improved markedly over time.
  • The dividend payout ratio for FY2023 is reasonable and suggests that Ossia International is not over-reaching. 
  • Some explanations of Ossia International’s financials in FY2023:
    • Reason for revenue growth: Ossia International’s revenue for FY2023 was up by 27.6%. The increase in sales was mainly due to travel restrictions being lifted; an influx of tourists and travellers resulted in increased foot traffic and consumer spending in retail establishments. This uptick in retail activity led to improved sales performance and enhanced profitability for the group’s retail operations.
    • Reason for better associated company performance: Ossia International’s share of results of the associated company increased from $5.54 million to $7.88 million due to an increase in the sales performance of the associated company during the financial year.
    • Improvement in balance sheet: Ossia International’s bank borrowing has been reduced to zero. As the group recovered from the effects of the COVID-19 pandemic, it successfully managed its financial position and generated enough cash flow to meet its operational and financial needs; this positive development led to a reduction in the utilization of bank facilities.
    • Reason for slight decrease in operating cash flow: Net cash from operating activities decreased due to income tax payments and a change in payment method to suppliers, resulting in lower utilization of bank facilities.
Table 3 (total debt includes lease obligations)

Management of Ossia International

  • George Goh Ching Wah, 64, is the executive chairman of Ossia International. George and his brothers (Steven Goh Ching Huat and Joe Goh Ching Lai) are experienced entrepreneurs who co-founded the Group. George Goh is also the Executive Deputy Chairman of Pertama Holdings Pte Ltd. George Goh and his two brothers have more than 35 years of experience in the distribution and retailing of lifestyle/sporting/outdoors products in footwear, apparel, sporting/outdoors goods, bags and accessories under the Group. George Goh also sought to contest the 2023 Presidential Election in Singapore, but his application was rejected by the Presidential Elections Committee.
  • Steven Goh Ching Huat, 58, is the CEO and an executive director of Ossia International.
  • Joe Goh Ching Lai, 64, is a non-executive director of Ossia International. He was appointed as a director on 1 September 1990, re-designated as a non-executive director on 1 May 2009, redesignated as an executive director on 17 June 2016, and re-designated as a non-executive director on 1 July 2021. Joe Goh is also a non-executive director of Pertama Holdings Private Limited.
  • Alan Hsu Chin Tung is the managing director of Great Alps Industry Co., Ltd, Ossia International’s wholly-owned subsidiary that is responsible for Ossia International’s business in Taiwan. Alan is responsible for the product development, brand management, marketing and distribution of footwear, apparel, bags, accessories in Taiwan. Alan joined as a brand manager in 1996 and was promoted to Managing Director in 2001.
  • The three Goh brothers collectively controlled 190.25 million Ossia International shares, or 75.3% of the company’s total shares, as of 20 June 2023. George Goh controlled 75.395 million shares (29.84% of Ossia International’s total shares). The Goh brothers’ Ossia International shares are worth S$33.1 million at the company’s S$0.172 stock price as of 6 September 2023. This is not significant skin in the game – and it’s also unclear what George Goh’s Ossia International stake is as a percentage of his overall net worth. In the run-up to the 2023 Presidential Election, George Goh mentioned that he manages five companies with a combined shareholders’ equity of S$507 million when averaged over a 3-year period; Ossia International’s FY2023 shareholders’ equity is only S$54.9 million.
  • The Goh brothers’ salaries, shown in the table below, are not egregious relative to the scale of Ossia International’s business.
Table 4

Risks associated with Ossia International

  • The Goh brothers call the shots, and minority shareholders have no say
  • There’s a chance that Ossia International’s operating business, and the Harvey Norman stores in Singapore and Malaysia, are over-earning at the moment because of COVID pull-forward. Harvey Norman’s comparable sales in Malaysia for the 6 months ended 30 June 2023 were negative 9.8%, and the total profit before tax for the Singapore and Malaysia stores for the 12 months ended 30 June 2023 was down 11.7%.

Valuation of Ossia International

  • S$0.172 stock price as of 6 September 2023.
  • Trailing EPS and FCF per share of S$0.04 and S$0.033 respectively, giving PE and PFCF ratios of 4.3 and 5.2 – this is a low valuation if Ossia International’s operating business and the Harvey Norman stores in Singapore and Malaysia are not over-earning at the moment.
  • Attractive dividend yield of roughly 10.5% given the trailing dividend of S$0.018 per share and the S$0.172 stock price.
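As a sanity check, the valuation ratios above follow directly from the quoted per-share figures. A minimal sketch in Python (all numbers are copied from the notes above; nothing here is new data):

```python
# Valuation ratios for Ossia International, using the figures quoted in the
# notes above (stock price and per-share numbers as of 6 September 2023).
price = 0.172          # S$ per share
eps = 0.04             # trailing earnings per share, S$
fcf_per_share = 0.033  # trailing free cash flow per share, S$

pe_ratio = price / eps               # price-to-earnings
pfcf_ratio = price / fcf_per_share   # price-to-free-cash-flow

print(f"P/E:   {pe_ratio:.1f}")    # ≈ 4.3
print(f"P/FCF: {pfcf_ratio:.1f}")  # ≈ 5.2
```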

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 01 June 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 01 June 2025:

1. Shanghai After 16 Years: Three Transformations – Thomas Chua

I’ve recently returned from my trip to China, visiting Suzhou, Shanghai, and Hangzhou. The journey took me down memory lane—my first visit to Suzhou and Shanghai was in late 2008. That 16-year gap gave me a unique lens to measure just how dramatically China has evolved…

…The streets were immaculately clean. No scammers in sight. No need to guard against pickpockets. I even observed people using their valuables to reserve seats—similar to Singapore (though I don’t encourage this practice).

For such a transformation to occur, two factors must work in harmony: competent law enforcement and improved living standards.

My sense is that China’s tech ecosystem plays a crucial role in supporting law enforcement—everything leaves a digital footprint that can be traced, making potential perpetrators think twice…

…My second revelation came in the shopping districts. With the exception of Apple and Lululemon, once-dominant Western stores—Nike, Starbucks, and Under Armour—stood nearly empty.

This isn’t about consumers growing bored with Western offerings. Starbucks continues performing remarkably well in Japan, Bangkok, and Singapore despite nearly three decades of operations. The reality? Competition in China is ruthless…

… Their strong performance in China isn’t guaranteed—it must be continually earned. Currently, Lululemon’s China team operates with significant autonomy from North America, with freedom to customize products and store designs for the domestic market. This differs from their cookie-cutter approach in North America, and they’re crushing it in China…

…China has evolved from producing cheap knockoffs to creating exceptional products.

Beyond high-quality EVs like BYD, there’s DJI dominating consumer drones and handheld vlogging cameras, plus mobile phones like the Vivo X200 Ultra and OPPO Find X8 Ultra—phenomenal devices by any standard.

2. Investing in Iraq – yet more gains to come? – Swen Lorenz

Iraq’s oil and gas reserves are staggering: with current proven reserves of 140bn barrels, Iraq is the fifth-largest oil country on Earth.

Remarkably, at only about USD 3 per barrel, Iraq has extremely low production costs, cheaper even than those in Saudi Arabia. Only Iran can currently produce oil even more cheaply…

…Why the cost advantage?

In Iraq, oil tends to be near the surface and therefore quite easy to access…

…Iraq’s oil reserves could be even bigger than what is known today. The country’s Western desert has seen little exploration so far, and some believe it will contain even more oil than the rest of the country. Estimates often cite 300bn barrels of oil in Iraq…

…BP had closed down its last operation in Iraq in 1974, following the nationalisation of the oil industry.

Yet, despite the best efforts by the US and UK governments in the early 2000s, BP and other oil majors weren’t going to get back into the country just yet.

It wasn’t until March 2025 – 51 years after its departure – that BP moved back into Iraq. Remarkably, it’s now re-entering with all the more momentum, even though few outside of the oil industry would have even noticed yet.

Two months ago, BP secured final approval from the Iraqi government to redevelop the vast Kirkuk oil fields. The company committed to spend USD 25bn (!) over 25 years. In the initial phase, it plans to produce 3bn barrels of oil, but the potential is far greater. According to BP’s press release, “the wider resource opportunity across the contract and surrounding area is believed to include up to 20 billion barrels of oil equivalent.”…

…A few months earlier, France’s Total had begun construction of a gas processing facility, marking the first stage of a major energy project. Although Total had already reached an agreement with Iraq in 2021, subsequent squabbling over contract details delayed construction. With an investment of USD 10bn over 25 years, the project is now finally underway…

…Why the sudden rush by multinationals to invest multi-billions?

Iraq has now remained stable long enough and shown sufficient progress for foreign investment to return. The recent period of relative stability has had a cumulative effect: while few wanted to go in first, everyone is now rushing to get in at once…

…After the tumultuous 2010s, the market had been priced as though Iraq were to disappear off the face of the Earth.

Once investors realised that the country was turning a corner, the reaction was like that of a coiled spring.

What triggered this shift was the flow of information. Investors had been unaware of the changing situation, and once they realised, money started to flow into the market.

In 2023, the Iraqi market rose by 97.2%, followed by 44.8% in 2024 (measured in USD terms). Yet, it has only just returned to its 2014 level.

By some measure, Iraq remains an underdeveloped, underfollowed frontier market. E.g., the market capitalisation of all Iraqi companies stands at just USD 15bn. Relative to the country’s GDP of USD 258bn, that’s a national market capitalisation of just 5.7%…

…Iraq’s ongoing transformation – both politically and economically – does not yet appear to be priced in. Price/earnings ratios are in the mid-single digits but based on depressed earnings, i.e. there is lots of potential for companies to improve profitability through internal measures while also experiencing significant growth.

Currently, there are probably no more than 35,000 investors who have traded on the Iraqi exchange, and less than 5,000 of them could be described as active. Local institutional investors are almost non-existent, and the few foreign investment funds active in Iraq manage a total of just USD 250m…

…In frontier markets, basic industries often offer the best returns, and Iraqi banks are a prime example. In 2023, the total number of bank accounts rose by 51%, the usage of bank cards grew by 22%, and the adoption of e-wallets increased by 68%. The number of shops accepting electronic payments more than doubled, growing 115%…

…Needless to say, Iraq won’t become a developed nation overnight and will continue to face challenges. While oil exports to the US are exempt from reciprocal tariffs, the lower oil price weighs on the country’s income. However, Iraq plans to significantly increase its production. In January 2025, it produced oil at a run-rate of 3.9m barrels per day, aiming to reach 6m barrels per day by 2028 or 2029. If achieved, this volume growth should more than offset lower prices. There are even recent – but speculative – plans to aim for 12m-13m barrels per day by 2030.

3. What Leonardo’s obsession with water teaches us about longevity – Eric Markowitz

But it’s in his obsession with water — fluid dynamics — where I think his secret becomes clearest.

Leonardo believed water was the “vehicle of nature.” He saw its movements as metaphors for everything: emotion, time, decay, even thought. He studied how it carved stone, how it shaped landscapes, how it sustained life. He used the same drawings of turbulence to explain everything from hair curls to planetary motion. Why does that matter? Because I’ve come to see how systems that last tend to flow, not freeze. They self-correct. They adapt. They look chaotic on the surface, but beneath that turbulence is order. They mirror nature. Which, of course, is what Leonardo saw: longevity isn’t about resisting entropy. It’s about dancing with it.

Leonardo wasn’t just studying fluids. He was fluid. Multidisciplinary. Nonlinear. If he had stayed in one lane — say, just painting or just engineering — he might’ve burned out or faded into obscurity. But he didn’t. He swirled. He looped. He revisited, rethought, revised. Like a river, he stayed alive by never staying still.

So what does Leonardo teach us about how to last?

First: Think like a system. Longevity isn’t a product of brute force. It’s an outcome of design. Leonardo’s mind was wired to see the parts within the whole. The relationship between muscle and movement. Between proportion and perception. Between science and art. He reminds us that siloed thinking leads to short-termism.

Enduring value is built by weaving domains together.

Second: Follow curiosity across boundaries. Leonardo didn’t care if something was “in his field.” He followed the thread. In doing so, he accumulated knowledge that compounded in unexpected ways. His heart drawings influenced his paintings. His engineering influenced his anatomy. If you want to build something that lasts — whether a company, a life, or a legacy — you need to let curiosity be your guide.

4. How Larry Goldstein made $250,000 in 2 hours – Dirtcheapstocks

It’s January 2009…

…Larry finds a tiny little business called Compass Knowledge Holdings (Ticker: CKNO).

CKNO partnered with universities to offer graduate degrees for online learning. Remember, this is 2009. The online learning thing is brand new. CKNO sits in a unique position because it was the only publicly traded online learning platform that was partnered with reputable colleges…

…CKNO was a non-SEC reporting company with a $10mm market cap.

Shares sold for $0.60.

The business was sitting on a mountain of net cash. Current assets were 4x larger than total liabilities.

Despite its overcapitalization, Compass earned a 36% ROE.

Put simply, the stock was cheap…

…The stock would be worth a lot more if it filed with the SEC and a broader set of investors could see how cheap the business was.

But how can you make a company register with the SEC?

There is an obscure rule in public markets. If a business has less than 300 registered shareholders, it can remain “public” without filing financials with the SEC. It’s an odd rule that exists to let smaller companies avoid the cost of filing.

Anyway, Larry decided to register a single share in each of his investors’ names. This was done to increase the number of record holders. Shares held by a single broker come through as one record holder for legal purposes. So, by registering each investor individually, Larry increased the number of record holders.

In response, the company initiated a 1 for 25,000 reverse split in April 2009…

…Anyone owning less than 25,000 shares would be cashed out at $1.45/share.

Not a bad return from $0.60/share in a 3-month period…

…On May 19th, 2009, the split went into effect.

The share count was reduced, and the post-split valuation was $36,250 ($1.45 * 25,000 shares).

To Larry’s amazement, when he checks the quote the morning after the split, he sees shares being offered for $2,000!

This is a 94% discount to where shares traded the day before! And even that price was a steal!

Larry called a market maker, and after double and triple checking, was assured that the $2,000 offer was in fact for the post-split shares.

Larry was able to buy 100 shares at $2,000 apiece. This purchase effectively valued the business at 0.5x earnings and 20% of net cash.

As it turns out, the seller was UBS. The offer to sell was a mistake…

…After a morning of discussions with FINRA, UBS and market makers, UBS offered to buy back the shares at $4,500 apiece.

Larry decided to take a quick profit and avoid arbitration with an army of UBS lawyers.

So, he sold his 100 shares (after owning them for half a morning) for $4,500 apiece – netting a $2,500 profit on each share.

And that’s how Larry Goldstein made $250,000 in a matter of hours.

He held the remainder of his shares – having owned enough to avoid being cashed out in the reverse split. In October 2010, CKNO sold to Embanet for $209,000/share.
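The arithmetic behind this episode can be checked in a few lines. A minimal sketch using only the figures quoted in the story above:

```python
# Numbers from the Compass Knowledge (CKNO) episode described above.
pre_split_price = 1.45   # $ per pre-split share at the cash-out price
split_ratio = 25_000     # 1-for-25,000 reverse split

post_split_value = pre_split_price * split_ratio   # fair post-split price
ubs_offer = 2_000                                  # mistaken offer, $ per post-split share
discount = 1 - ubs_offer / post_split_value        # discount to fair value

shares_bought = 100
buyback_price = 4_500                              # UBS's buyback price, $ per share
profit = shares_bought * (buyback_price - ubs_offer)

print(f"Post-split value: ${post_split_value:,.0f}")  # $36,250
print(f"Discount:         {discount:.0%}")            # ~94%
print(f"Profit:           ${profit:,.0f}")            # $250,000
```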

5. Building Blocks of Corporate Accounting: Intercorporate Shenanigans – Javier Pérez

Companies use affiliates—subsidiaries, associates, joint ventures—to pursue legitimate business opportunities. But when pressure mounts and performance stumbles, management can misuse those same affiliates to quietly hide problems. Debt disappears into unconsolidated entities. Revenue magically appears through transactions with related parties. Margins get inflated by shifting costs into partially owned ventures.

Here’s a simple framework to visualize the main accounting tricks enabled by affiliates:

Hide debt: A company creates or uses affiliates where it owns less than 50% — just enough to avoid “control” under consolidation rules (IFRS 10 or ASC 810). Even if the parent funds the affiliate, or guarantees its loans, as long as it doesn’t officially control it, the affiliate’s liabilities don’t show up on the parent company’s balance sheet.

Fake revenue: The company sets up or funds related entities that pose as independent customers. It then sells products or services to these entities, booking it as legitimate revenue. In truth, the cash used by the “customer” may have come from the company itself — via loans, marketing payments, or off-the-books financing.

Boost margins: The parent company sells goods or services to an affiliate or JV it owns, say, 30%. It sells at inflated prices, booking high profits. The affiliate eats the inflated costs, but since only 30% of the affiliate’s loss flows back to the parent (via equity method), the other 70% is “outsourced.” The parent books 100% of the gain on the transaction, but only absorbs a fraction of the cost impact from the affiliate. The result is asymmetric — a sort of profit laundering.
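To make the asymmetry concrete, here is a toy numerical sketch (all figures are hypothetical, chosen only to illustrate the mechanism described above) of a parent selling at an inflated price to a 30%-owned, equity-method affiliate:

```python
# Toy example of the "boost margins" trick described above (all numbers hypothetical).
cost = 100            # parent's cost of the goods
fair_price = 110      # arm's-length price
inflated_price = 200  # price actually charged to the affiliate
stake = 0.30          # parent's ownership of the affiliate

# The parent books 100% of the sale, so the full inflated profit hits its P&L.
parent_profit = inflated_price - cost            # 100

# The affiliate overpays by (inflated - fair); treat that overpayment as a
# pure loss to the affiliate. Only the parent's 30% share flows back via
# "share of results of associate" - the other 70% is "outsourced".
affiliate_loss = inflated_price - fair_price     # 90
parent_share_of_loss = stake * affiliate_loss    # 27

# Extra profit reported versus an honest arm's-length sale (which would earn 10).
net_reported_boost = parent_profit - parent_share_of_loss - (fair_price - cost)
print(net_reported_boost)  # 63, i.e. 70% of the inflated margin survives on paper
```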

None of these tactics necessarily break accounting rules outright, at least initially. In fact, they often begin by exploiting genuine gray areas—using subtle tricks like careful structuring to keep subsidiaries below consolidation thresholds or cleverly timed transactions that auditors find hard to challenge. Over time, the line between aggressive accounting and outright fraud blurs, often unnoticed by investors until it’s too late…

…On the surface, Pescanova was a solid business: fishing fleets around the world, processing plants across multiple continents, and an ambitious international expansion. The story resonated well with investors, particularly in the mid-2000s, as Spain’s economy boomed. Investors saw steady growth, seemingly controlled debt levels, and consistent profits—exactly what you’d expect from a thriving global player…

…To understand exactly what Pescanova did, you need to know a bit about consolidation rules (remember those from the last article?). Under IFRS (specifically IFRS 10, previously IAS 27), companies must consolidate subsidiaries that they “control”—typically meaning they hold over 50% of shares or exert significant decision-making influence.

But consolidation isn’t always black-and-white. IFRS rules are principles-based, leaving substantial room for interpretation. Pescanova exploited this flexibility ruthlessly, ensuring that many entities—particularly those carrying significant debt—were carefully structured so they appeared outside the direct control of the parent. In reality, these companies were fully funded by Pescanova, directly or indirectly, through guarantees or hidden agreements.

By creating subsidiaries that technically sat just below the consolidation threshold (often just below 50% ownership), Pescanova legally avoided putting their massive debts onto its consolidated balance sheet. These were debts incurred to finance aggressive expansions—like shrimp farms in Ecuador, fish processing plants in Namibia, and ambitious salmon-farming ventures in Chile. Investors saw ambitious expansion, but not the corresponding liabilities…

…Pescanova’s accounting creativity wasn’t limited to hiding debt. They simultaneously inflated revenues through fictitious or exaggerated intercompany sales. Here’s how it worked:

  • Pescanova’s parent entity would “sell” products to a shell subsidiary or affiliate at inflated prices.
  • The affiliate would then record fake sales (often to other controlled entities), recognizing substantial revenue growth.
  • On consolidation, some of these intercompany transactions should eliminate—meaning revenues and profits from internal sales typically disappear when financial statements consolidate. But crucially, if the entities involved weren’t fully consolidated (below 50%), the transactions never canceled out fully.
  • Pescanova thus created the illusion of steady revenue growth and robust profitability—despite many sales being little more than accounting mirages.
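The elimination mechanics in the bullets above can be illustrated with a toy example (all figures hypothetical): intercompany sales cancel out on consolidation only when the counterparty is itself consolidated, so sales routed through a just-under-50% affiliate survive as reported revenue:

```python
# Toy illustration of the consolidation point above (all figures hypothetical).
# Intercompany sales are eliminated only when the counterparty is consolidated;
# sales to an unconsolidated (<50%-owned) affiliate stay in reported revenue.
sales = [
    # (counterparty, amount, counterparty is consolidated?)
    ("external customer",       100, False),  # genuine third-party sale
    ("wholly-owned subsidiary",  80, True),   # eliminated on consolidation
    ("49%-owned affiliate",      70, False),  # NOT eliminated - inflates revenue
]

reported_revenue = sum(amount for _, amount, consolidated in sales
                       if not consolidated)
true_external_revenue = 100  # only the genuine third-party sale

print(reported_revenue)       # 170 shown to investors
print(true_external_revenue)  # 100 actually earned from outsiders
```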

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Apple and Starbucks. Holdings are subject to change at any time.

Passive Income

Know what you want to achieve, and what passive income really is

Passive income is not just income earned outside of your job. The real meaning of passive income is money that you earn with little to no effort. 

Some may think that money earned in the stock market or from properties is passive income. Yet, this may not be the case. Stock market investors can sometimes be so caught up in trading and looking at stock prices that investing becomes a huge part of their lives and can even be considered another job. Property investing can also turn out to be tedious if you manage your properties yourself.

I know of friends who want to earn passive income but instead end up spending so much time on their investments. Don’t get me wrong. I love spending time learning and investing but this isn’t really “passive”.

Here’s how you can earn real passive income.

Stop trading the stock market

First, stop short-term trading. My definition of trading is buying stocks in the very short-term based on price action and charts. This is not investing and can become a part-time or full-time job as it requires a lot of time and effort. The more trades you need to make, the more effort is required.

We should aim to invest in a way that reduces the number of trades and amount of work that we need to do. 

One way to do this is by investing long-term in set-and-forget investments. Investing in stocks that have the potential to grow earnings (and thus the share price) reliably over the long-term is one good strategy that lowers the time spent on investing.

You can also invest in passive index ETFs that track the performance of broad market indexes. Stock indexes have historically increased in value over a sufficiently long period of time and provide a good way to gain exposure to some of the biggest and most profitable companies.

You can also invest in dividend stocks that reliably pay a dividend. Investors from Singapore enjoy tax-free dividends when they invest in Singapore-listed dividend-paying stocks.

Outsource your investing

Another way to reduce time and effort spent on investing is to outsource your investing to an expert.

One way to do this is by employing financial experts who can advise you on stocks to buy or funds to purchase. You can also use robo advisors which can help you allocate your portfolio into a variety of investments.

While you will need to pay a fee for these services, having someone to invest on your behalf or advise you frees you from the hassle of doing everything yourself and saves you a ton of time.

Invest in other passive assets

You can also invest in other assets besides the stock market.

Assets such as long-term fixed deposits, government bonds, or even professionally-managed real estate may be a good way to grow your wealth without doing much work.

If you can find long-term investments that require little effort on your part but provide a stable return, they are potentially good assets for growing your wealth while keeping your investing truly passive.

Know your goals

What do you really want to achieve? Do you want to grow your wealth as quickly as possible?

Then by all means go ahead and dig through annual reports, scour the market for undervalued stocks, sell weekly put options, or even manage your own Airbnb property for higher rental yields. There is nothing wrong with this; in fact, it is my preferred style of investing.

But this is not passive income.

If you really want passive income, invest in long-term assets, engage a professional or robo advisor to manage your wealth, or build passive income by dollar-cost averaging into funds or long-term assets.

While this may not always give you the best fee-adjusted returns, it is a true passive income strategy and frees up your time for other things in life.
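
To see why dollar-cost averaging is so hands-off, here is a minimal sketch with made-up monthly prices (not a recommendation, just arithmetic): you invest a fixed sum every month, so you automatically buy more units when prices are low.

```python
# Minimal dollar-cost averaging sketch. Prices are hypothetical.

monthly_budget = 1_000
prices = [50, 40, 25, 40, 50]  # made-up fund price at each monthly purchase

# A fixed budget buys more units when the price is low, fewer when high.
units = sum(monthly_budget / p for p in prices)
invested = monthly_budget * len(prices)
average_cost = invested / units

print(round(units, 1))            # 130.0 units accumulated
print(round(average_cost, 2))     # 38.46 average cost per unit
print(sum(prices) / len(prices))  # 41.0 simple average price
```

Because fixed sums buy more units when prices dip, the average cost per unit (about 38.46 here) ends up below the simple average price (41.0)—and no decision-making is required along the way.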

Ultimately, when it comes to investing, there is no one-size-fits-all strategy, and knowing what you want to achieve can help determine how you should approach investing your spare capital (and time).


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I do not have a vested interest in any companies mentioned. Holdings are subject to change at any time.

The Latest Thoughts From American Technology Companies On AI (2025 Q1)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2025 Q1 earnings season.

The way I see it, artificial intelligence (or AI), really leapt into the zeitgeist in late-2022 or early-2023 with the public introduction of DALL-E 2 and ChatGPT. Both are provided by OpenAI and are software products that use AI to generate art and writing, respectively (and often at astounding quality). Since then, developments in AI have progressed at a breathtaking pace.

We’re thick in the action of the latest earnings season for the US stock market – for the first quarter of 2025 – and I thought it would be useful to collate some of the interesting commentary I’ve come across in earnings conference calls, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. This is an ongoing series. For the older commentary:

With that, here is the latest commentary, in no particular order:

Airbnb (NASDAQ: ABNB)

Airbnb’s management thinks that designing end-to-end travel is very difficult and travelers often find planning travel to be very complicated, so travelers do it very infrequently; management thinks that a great user interface is the key to designing a great end-to-end travel experience for Airbnb users, and AI will be an important way to do it

I think a lot of companies have tried to like design an end-to-end travel. I think designing end-to-end travel is very, very hard. It’s funny — there’s this funny thing. One of the most common start-up ideas for entrepreneurs is to do a travel planning app. And yet travel planning apps almost always fail. So it’s almost like a riddle, why do travel planning apps fail, and everyone really tries to do it? And the reason why is because to plan travel is very complicated. In fact, it’s so complicated many people have assistants and a big part of their job is to plan travel for them. And yet you use it infrequently. So it’s a very difficult thing to do and you do it infrequently. And so therefore, a lot of companies have failed to design like a so-called connected trip. So I think to do this, a lot of it is to design a really good user experience. And I think that’s one of the things that we’re going to try to do to really design a great end-to-end experience, to able to book your entire trip, and much more. I think the user interface will be important. I think AI will be an important way to do this as well…

…We’re focused on making everything instant book and easy to use. We’re trying to make sure that the end-to-end travel experience is really, really wonderful with great Airbnb design, and we’re going to bring more AI into the application so that Airbnb, you can really solve your own problems with great self-solve through AI customer service agents.

Airbnb’s management recently rolled out an AI customer service agent; 50% of Airbnb’s US users are already using the customer service agent and it will soon be rolled out to 100% of Airbnb’s US users; management thinks Airbnb’s AI customer service agent is the best of its kind in travel, having already led to a 15% reduction in users needing to contact human agents; the AI customer service agent will be more personalised and agentic in the years ahead

We just rolled out our AI customer service agent this past month. 50% U.S. users are now using the agent, and we’ll roll it out to 100% of U.S. users this month. We believe this is the best AI-supported customers travel agent in travel. It’s already led to a 15% reduction in people needing to contact live human agents and it’s going to get significantly more personalized and agentic over the years to come.

Alphabet (NASDAQ: GOOG)

AI Overviews in Search now has more than 1.5 billion monthly users; AI Mode has received early positive reaction; usage growth of AI Overviews continues to increase nearly a year after its launch; management is leaning heavily into AI Overviews; management released AI Mode in March as an experiment; AI Mode searches are twice as long as traditional search queries; AI Mode is getting really positive feedback from early users; the volume of commercial queries on Google Search has increased with the launch of AI Overviews; AI Overviews is now available in more than 15 languages across 140 countries; AI Overviews continues to monetise at a similar rate to traditional Search; reminder that ads within AI Overviews were launched on mobile in the USA in late-2024; an example of longer search queries in AI Mode is product comparisons; management thinks of AI Overviews in Search and Gemini as 2 distinct consumer experiences; management thinks of AI Mode as a way to discover how the most advanced users are using AI-powered search

AI Overviews is going very well with over 1.5 billion users per month, and we are excited by the early positive reaction to AI Mode…

…Nearly a year after we launched AI Overviews in the U.S., we continue to see that usage growth is increasing as people learn that Search is more useful for more of their queries. So we are leaning in heavily here, continuing to roll the feature out in new countries to more users and to more queries. Building on the positive feedback for AI Overviews, in March, we released AI Mode, an experiment in labs. It expands what AI Overviews can do with more advanced reasoning, thinking and multimodal capabilities to help with questions that need further exploration and comparisons. On average, AI Mode queries are twice as long as traditional search queries. We’re getting really positive feedback from early users about its design, fast response time and ability to understand complex, nuanced questions…

…As we’ve mentioned before, with the launch of AI Overviews, the volume of commercial queries has increased. Q1 marked our largest expansion to date for AI Overviews, both in terms of launching to new users and providing responses for more questions. The feature is now available in more than 15 languages across 140 countries. For AI Overviews, overall, we continue to see monetization at approximately the same rate, which gives us a strong base in which we can innovate even more…

…On the ads of — in AI Overviews, last — late last year, actually, we launched them within the AI Overviews on mobile in the U.S. And this builds on our previous rollout of ads above and below. So this was a change that we have…

…I mentioned people typing in longer queries. There’s a lot more complex, nuanced questions. People are following through more. People are appreciating the clean design, the fast response time and the fact that they can kind of be much more open-ended, can undertake more complicated tasks. Product comparisons, for example, has been a positive one, exploring how tos, planning a trip…

…On AI-powered search and how do we see our consumer experience. Look, I do think Search and Gemini, obviously, will be 2 distinct efforts, right? I think there are obviously some areas of overlap, but they’re also — like expose very, very different use cases. And so for example, in Gemini, we see people iteratively coding and going much deeper on a coding workflow, as an example. So I think both will be around…

…AI Mode is the tip of the tree for us pushing forward on an AI-forward experience. There will be things which we discover there, which will make sense in the context of AI Overviews, so I think will flow through to our user base. But you almost want to think of what are the most advanced 1 million people using Search for, the most advanced 10 million people and then how do 1.5 billion people use Search for.

Alphabet’s management rolled out Alphabet’s latest foundation model, Gemini 2.5, in 2025 Q1; Gemini 2.5 is widely recognised as the best model in the industry; Gemini 2.5 Pro debuted at No.1 on the Chatbot Arena in 2025 Q1 by a significant margin; active users in AI Studio and the Gemini API are up over 200% since the start of 2025; Alphabet introduced Gemini 2.5 Flash in April 2025; Gemini models are now found in all of Alphabet’s 15 products with at least 0.5 billion users each; Alphabet is upgrading Google Assistant on mobile devices to Gemini, and will also upgrade tablets, cars, and devices that connect to phones later this year; the Pixel 9a phone with Gemini integration was launched to strong reviews; the Gemini Live camera feature, among others, will soon be rolled out to all Android devices

This quarter was super exciting as we rolled out Gemini 2.5, our most intelligent AI model, which is achieving breakthroughs in performance, and it’s widely recognized as the best model in the industry…

…We released Gemini 2.5 Pro last month, receiving extremely positive feedback from both developers and consumers. 2.5 Pro is state-of-the-art on a wide range of benchmarks and debuted at #1 on the Chatbot Arena by a significant margin. 2.5 Pro achieved big leaps in reasoning, coding, science and math capabilities, opening up new possibilities for developers and customers. Active users in AI Studio and Gemini API have grown over 200% since the beginning of the year…

…Last week, we introduced 2.5 Flash, which enables developers to optimize quality and cost…

…All 15 of our products with 0.5 billion users now use Gemini models…

…We are upgrading Google Assistant on mobile devices to Gemini. And later this year, we’ll upgrade tablets, cars and devices that connect to your phones such as headphones and watches. The Pixel 9a launched very strong reviews, providing the best of Google’s AI offerings like Gemini Live and AI-powered camera features. And Gemini Live camera and screen sharing is now rolling out to all Android devices, including Pixel and Samsung S25.

Google Cloud is offering the industry’s widest range of TPUs and GPUs; Alphabet’s 7th generation TPU, Ironwood, has 10x better compute power and 2x better power efficiency than the previous generation TPU; Google Cloud is the first cloud provider to offer NVIDIA’s Blackwell family of GPUs; Google Cloud will be offering NVIDIA’s upcoming Rubin family of GPUs

Complementing this, we offer the industry’s widest range of TPUs and GPUs and continue to invest in next-generation capabilities. Ironwood, our seventh-generation TPU and most powerful to date, is the first designed specifically for inference at scale. It delivers more than 10x improvement in compute power or a recent high-performance TPU while being nearly twice as power efficient. Our strong relationship with NVIDIA continues to be a key advantage for us and our customers. We were the first cloud provider to offer NVIDIA’s groundbreaking B200 and GB200 Blackwell GPUs, and we’ll be offering their next-generation Vera Rubin GPUs.

Alphabet’s management is rolling out the company’s latest image and video generation models; Alphabet launched its open-source Gemma 3 model in March 2025; Gemma models have been downloaded more than 140 million times; Alphabet is developing robotics AI models; Alphabet has launched a multi-agent AI research system called AI Co-Scientist; the AlphaFold model has been used by more than 2.5 million researchers

Our latest image and video generation models, Imagen 3 and Veo 2, are rolling out broadly and are powering incredible creativity. Turning to open models. We launched Gemma 3 last month, delivering state-of-the-art performance for its size. Gemma models have been downloaded more than 140 million times. Lastly, we are developing AI models in new areas where there’s enormous opportunity, for example, our new Gemini Robotics models. And in health, we launched AI Co-Scientist, a multi-agent AI research system, while AlphaFold has now been used by over 2.5 million researchers.

Google Cloud’s AI developer platform, Vertex AI, now has more than 200 foundation models available, including Alphabet’s in-house models and third-party models

Our Vertex AI platform makes over 200 foundation models available, helping customers like Lowe’s integrate AI. We offer industry-leading models, including Gemini 2.5 Pro, 2.5 Flash, Imagen 3, Veo 2, Chirp and Lyria, plus open-source and third-party models like Llama 4 and Anthropic.

Google Cloud is the leading cloud platform for building AI agents; Google Cloud has an open source framework for building AI agents and multi-agent systems called Agent Development Kit; Google Cloud has a low-code agent-building tool called Agent Designer; KPMG is using Google Cloud to deploy AI agents to employees; Google Cloud has the Google Agentspace product that helps employees in organisations use AI agents widely; Google Cloud offers pre-packaged AI agents across various functions including coding and customer engagement; Alphabet is working on agentic experiences internally and deploying it across the company; Alphabet’s customer service teams have deployed AI agents to dramatically enhance the user experience and is teaching Google Cloud customers how to do so

We are the leading cloud solution for companies looking to the new era of AI agents, a big opportunity. Our Agent Development Kit is a new open-source framework to simplify the process of building sophisticated AI agents and multi-agent systems. And Agent Designer is a low-code tool to build AI agents and automate tasks in over 100 enterprise applications and systems.

We are putting AI agents in the hands of employees at major global companies like KPMG. With Google Agentspace, employees can find and synthesize information from within their organization, converse with AI agents and take action with their enterprise applications. It combines enterprise search, conversational AI or chat and access to Gemini and third-party agents. We also offer pre-packaged agents across customer engagement, coding, creativity and more that are helping to provide conversational customer experiences, accelerate software development, and improve decision-making…

…Particularly with the newer models, I think we are working on early agentic workflows and how we can get those coding experiences to be much deeper. We are deploying it across all parts of the company. Our customer service teams are deeply leading the way there. We’ve both dramatically enhanced our user experience as well as made it much more efficient to do so. And we are actually bringing all our learnings and expertise in our solutions through cloud to our other customers. But beyond that, all the way from the finance team preparing for this earnings call to everything, it’s deeply embedded in everything we do.

Waymo is now serving 250,000 trips per week (was 150,000 in 2024 Q4), up 5x from a year ago; Waymo launched its paid service in Silicon Valley in 2025 Q1; Waymo has expanded in Austin, Texas, and will launch in Atlanta later this year; Waymo will launch in Washington DC and Miami in 2026; Waymo continues to make progress in airport access and freeway driving; management thinks Alphabet will not be able to scale Waymo by themselves, so partners are needed

Waymo is now safely serving over 0.25 million paid passenger trips each week. That’s up 5x from a year ago. This past quarter, Waymo opened up paid service in Silicon Valley. Through our partnership with Uber, we expanded in Austin and are preparing for our public launch in Atlanta later this summer. We recently announced Washington, D.C. as a future ride-hailing city, going live in 2026 alongside Miami. Waymo continues progressing on 2 important capabilities for riders, airport access and freeway driving…

More businesses are adopting Alphabet’s AI-powered campaigns; Alphabet’s recent work with AI is helping advertisers reach customers and searches where advertising would previously not have been shown; Alphabet is infusing AI at every step of the marketing process for advertisers, for example, (1) advertisers can now generate a broader variety of lifestyle imagery customized to their business, (2) in PMax, advertisers can automatically source images from their landing pages and crop them, (3) on media buying, AI-powered campaigns continue to help advertisers find new customers, (4) in Demand Gen, advertisers can more precisely manage ad placements and understand which assets work best at a channel level; users of Demand Gen now see an average 26% year-on-year increase in conversions per dollar spend; when Demand Gen is paired with Product Feed, advertisers see double the conversion per dollar spend year-over-year on average; Royal Canin used Demand Gen and PMax campaigns and achieved a 2.7x higher conversion rate, a 70% lower cost per acquisition for purchases, and an 8% higher value per user

More businesses, big and small, are adopting AI-powered campaigns, and the deployment of AI across our Ads business is driving results for our customers and for our business. Throughout 2024, we launched several features that leverage LLMs to enhance advertiser value, and we’re seeing this work pay off. The combination of these launches now allows us to match ads to more relevant search queries. And this helps advertisers reach customers and searches where we would not previously have shown their ads.

Focusing on our customers, we continue to solve advertisers’ pain points and find opportunities to help them create, distribute and measure more performant ads, infusing AI at every step of the marketing process. On Audience Insights, we released new asset audience recommendations, which tell businesses the themes that resonate most with their top audiences. On creatives, advertisers can now generate a broader variety of lifestyle imagery customized to their business to better engage their customers and use them across PMax, demand gen, display and app campaigns. Additionally, in PMax, advertisers can automatically source images from their landing pages and crop them, increasing the variety of their assets. On media buying, advertisers continue to see how AI-powered campaigns help them find new customers. In Demand Gen, advertisers can more precisely manage ad placements across YouTube, Gmail, Discover and Google Display Network globally and understand which assets work best at a channel level. Thanks to dozens of AI-powered improvements launched in 2024, businesses using Demand Gen now see an average 26% year-on-year increase in conversions per dollar spend for goals like purchases and leads. And when using Demand Gen with Product Feed, on average, they see more than double the conversion per dollar spend year-over-year…

…Royal Canin combined Demand Gen and PMax campaigns to find more customers for its cat and dog food products. The integration resulted in a 2.7x higher conversion rate, a 70% lower cost per acquisition for purchases and increased the value per user by 8%.

Google Cloud still has more AI demand than capacity in 2025 Q1 (as it did in 2024 Q4) 

Recall I’ve stated on the Q4 call that we exited the year in Cloud specifically with more customer demand than we had capacity. And that was the case this quarter as well.

30% of new code at Alphabet is now generated by AI (it was 25% in 2024 Q3)

We’re continuing to make a lot of progress there in terms of people using coding suggestions. I think the last time I had said, the number was like 25% of code that’s checked in. It involves people accepting AI-suggested solutions. That number is well over 30% now. But more importantly, we have deployed more deeper flows.

Amazon (NASDAQ: AMZN)

AWS grew 17% year-on-year in 2025 Q1, and is now at a US$117 billion annualised revenue run rate (was US$115 billion in 2024 Q4); management used to think AWS could be a multi-hundred billion dollar revenue run rate business without AI and now that there’s AI, they think AWS could be even bigger; AWS’s AI business is now at a multi-billion annual revenue run rate and is growing triple-digits year-on-year; the shift from on-premises to the cloud is still a huge tailwind for AWS, and now even more so as companies that want to realize the full potential of AI will need to shift to the cloud; AWS is currently still supply constrained, but a lot more new chips will be landing in the coming months; management thinks that the supply chain issues with chips will get better as the year progresses

AWS grew 17% year-over-year in Q1 and now sits at a $117 billion annualized revenue run rate…

…Before this generation of AI, we thought AWS had the chance to ultimately be a multi-hundred billion dollar revenue run rate business. We now think it could be even larger…

…Our AI business has a multibillion-dollar annual revenue run rate, continues to grow triple-digit year-over-year percentages and is still in its very early days…

…Infrastructure modernization is much less sexy to talk about than AI, but fundamental to any company’s technology and invention capabilities, developer productivity, speed and cost structure. And for companies to realize the full potential of AI, they’re going to need their infrastructure and data in the cloud…

…During the first quarter, we continued to see growth in both generative AI business and non-generative AI offerings as companies turn their attention to newer initiatives, bring more workloads to the cloud, restart or accelerate existing migrations from on-premises to the cloud and tap into the power of Generative AI…

…We — as fast as we actually put the capacity in, it’s being consumed. So I think we could be driving — we could be helping more customers driving more revenue for the business if we had more capacity. We have a lot more Trainium2 instances and the next generation of NVIDIA’s instances landing in the coming months…

…I do believe that the supply chain issues and the capacity issues will continue to get better as the year proceeds.

Management is directing Amazon to invest aggressively in AI; Amazon is building 1000-plus AI applications across the company; the next generation of Alexa is Alexa+; Amazon is using AI in its fulfilment network, robotics, shopping, and more

If you believe your mission is to make customers’ lives easier and better every day, and you believe that every customer experience will be reinvented with AI, you’re going to invest very aggressively in AI, and that’s what we’re doing. You can see that in the 1,000-plus AI applications we’re building across Amazon. You can see that with our next generation of Alexa, named Alexa+. You can see that in how we’re using AI in our fulfillment network, robotics, shopping, Prime Video and advertising experiences. And you can see that in the building blocks AWS is constructing for external and internal builders to build their own AI solutions.

AWS’s in-house AI chip, Trainium 2, is starting to lay in capacity in larger quantities with significant appeal and demand; AWS will always be offering AI chips from multiple providers, but Trainium 2 offers a compelling option with 30%-40% better price performance; management believes that the price of inference needs to be much lower for AI to be successful, and they think the price of inference will go down; Anthropic is still building its next few models with Trainium 2

Our new custom AI chip Trainium2 is starting to lay in capacity in larger quantities with significant appeal and demand. While we offer customers the ability to do AI in multiple chip providers and will for as long as I can foresee, customers doing AI at any significant scale realize that it can get expensive quickly. So the 30% to 40% better price performance that Trainium2 offers versus other GPU-based instances is compelling. For AI to be as successful as we believe it can be, the price of inference needs to come down significantly…

…I would say that we’ve been bringing on a lot of P5, which is a form of NVIDIA chip instances, as well as landing more and more Trainium2 instances as fast as we can…

…Anthropic is running — building the next few training models on top of our Trainium2 chip on AWS…

…As they’re waiting to see the cost of inference continue to go down, which it will.

The latest premier Amazon Nova model was launched yesterday and it delivers frontier intelligence and industry-leading price performance; thousands of customers are already using Amazon Nova models; Amazon Nova Sonic, a speech-to-speech foundation model, was recently released and it enables developers to build voice-based AI applications; Amazon Nova Sonic has lower word error rates and higher win rates over other comparable models; AWS recently released a research preview of Amazon Nova Act, a new AI model that can perform actions within a web browser; Amazon Nova Act aims to move the current state-of-the-art accuracy of multi-step agentic actions from 30%-60% to 90%-plus

We offer our own Amazon Nova state-of-the-art foundation models in Bedrock with the latest premier model launching yesterday. They deliver frontier intelligence and industry-leading price performance, and we have thousands of customers already using them, including Slack, Siemens, Sumo Logic, Coinbase, FanDuel, Glean and Blue Origin. A few weeks ago, we released Amazon Nova Sonic, a new speech-to-speech foundation model that enables developers to build voice-based AI applications that are highly accurate, expressive and human-like. Nova Sonic has lower word error rates and higher win rates over other comparable models for speech interactions…

…We’ve just released a research preview of Amazon Nova Act, a new AI model trained to perform actions within a web browser. It enables developers to break down complex workflows into reliable atomic commands like search or checkout or answer questions about the screen. It also enables them to add more detailed instructions to these commands where needed, like don’t accept the insurance upsell. Nova Act aims to move the current state-of-the-art accuracy of multistep agentic actions from 30% to 60% to 90-plus percent with the right set of building blocks to build these action-oriented agents.

Amazon’s management sees question-and-answer as the only current use case for AI agents, but they want AI agents to be capable of performing a wide variety of complex tasks and they have built Alexa+ to be such an agent; management launched a new lightning fast AI agent coding experience in Amazon Q in 2025 Q1 and customers are loving it; management has made GitLab Duo with Amazon Q generally available, which enables AI agents to assist multi-step tasks; Alexa+ is meaningfully smarter and more capable than the previous Alexa; Alexa+ is free with Prime and available for non-Prime customers at $19.99 per month; Alexa+ is just starting to be rolled out in the USA and will be introduced to other countries later in 2025; users really like Alexa+ thus far; Alexa+ is now in the hands of more than 100,000 users; Amazon already has 0.5 billion devices in people’s homes and cars that can easily distribute Alexa+; management thinks users will have to relearn a little on how to communicate with Alexa+, but the communication experience is now much better; management asked Alexa+ about good Italian restaurants in New York and Alexa+ helped to make a reservation

To date, virtually all of the agentic use cases have been of the question-answer variety. Our intention is for agents to perform wide-ranging complex multistep tasks by organizing a trip or setting the lighting, temperature and music ambience in your house for dinner guests or handling complex IT tasks to increase business productivity. There haven’t been action-oriented agents like this until Alexa+…

…This past quarter, Amazon Q, the most capable generative AI-powered assistant for accelerating software development and leveraging your own data, launched a lightning fast new agent coding experience within the command line interface that can execute complex workflows autonomously. Customers are loving this. We also made generally available GitLab Duo with Amazon Q, enabling AI agents to assist multi-step tasks such as new feature development, code base upgrades for Java 8 and 11, while also offering code review and unit testing, all within the same familiar GitLab platform…

…We introduced Alexa+, our next-generation Alexa personal assistant, which is meaningfully smarter and more capable than our prior Alexa, can both answer virtually any question and take actions, and is free with Prime or available to non-Prime customers for $19.99 a month. We’re just starting to roll this out in the U.S., and we’ll be expanding to additional countries later this year. People are really liking Alexa+ thus far…

…So we’ve worked hard on that in Alexa+. We started rolling it out over the last several weeks. It’s now with over 100,000 users, with more rolling out in the coming months. And so far, the response from our customers has been very, very positive…

…We’re very fortunate in that we have over 0.5 billion devices out there in people’s homes and offices and cars. So we have a lot of distribution already…

…To some degree, there will be a little bit of rewiring for people on what they can do because you get used to patterns. I mean, even the simple thing of not having to say “Alexa” anymore: we’re all used to saying “Alexa” before we want every action to happen. And what you find is you really only have to do it the first time, and then the conversation is ongoing where you don’t have to say “Alexa” anymore. And I’ve been lucky enough to have the alpha and the beta that I’ve been playing with for several months, and it took me a little bit of time to realize I didn’t have to keep saying “Alexa.” It’s very freeing when you don’t have to do that…

…When I was in New York, when we were announcing (we did the event way downtown), I asked her what were great Italian restaurants or pizza restaurants. She gave me a list and asked me if I wanted her to make a reservation. I said yes. And she made the reservation and confirmed the time, just like that. When you get into those types of routines and you have those types of experiences, they’re very, very useful.

The majority of Amazon’s capital expenditure (capex) in 2025 Q1 was for AWS’s technology infrastructure, including the Trainium chips

Turning to our cash CapEx, which was $24.3 billion in Q1. The majority of this spend is to support the growing need for technology infrastructure. It primarily relates to AWS as we invest to support demand for our AI services and increasingly in custom silicon like Trainium as well as tech infrastructure to support our North America and International segments. We’re also investing in our fulfillment and transportation network to support future growth and improve delivery speeds and our cost structure. This investment will support growth for many years to come.

The vast majority of successful startups are built on AWS; high-profile startups building AI coding agents are on AWS

If you look at the start-up space, the vast majority of successful start-ups over the last 10 to 15 years have run on top of AWS…

…If you just look at the growth of these coding agents in the last few months, these are companies like Cursor or Vercel, both of them run significantly on AWS.

Amazon’s management thinks that current AI apps have yet to really tackle the many customer experiences that are going to be reinvented and the many other agents that are going to be built

What’s interesting in AI is that we still haven’t gotten to all the other customer experiences that are going to be reinvented and all the other agents that are going to be built. They’re going to take the role of a lot of different functions today. And those are — they’re — even though we have a lot of combined inference in those areas, I would say we’re not even at the second strike of the first batter in the first inning. It is so early right now.

AWS operating margin improved from 37.6% in 2024 Q1 to 39.5% in 2025 Q1, but margins will fluctuate from time to time; AWS’s margin strength is from the business’s strong growth, the impact of some continued investments, and AWS’s custom chips; the investments include software optimisations for server capacity, low-cost custom networking equipment, and power usage in data centers

AWS operating income was $11.5 billion and reflects our continued growth coupled with our focus on driving efficiencies across the business. As we said before, we expect AWS operating margins to fluctuate over time, driven in part by the level of investments we’re making at any point in time…

…We had a strong quarter in AWS, as you mentioned, the margin performance. I would attribute it to the strong growth that we’re seeing, coupled with the impact of some continued investment we’re making in innovation and technology. I’ll give you some examples. So we invest in software and process improvements that end up optimizing our server capacity, which helps our infrastructure cost. We’ve been developing a more efficient network using our low-cost custom networking gear. We’re working to maximize the power usage in our existing data centers, which both lowers our costs and also reclaims power for other newer workloads. And we’re also seeing the impact of advancing custom silicon like Graviton. It provides lower cost not only for us, but also for our customers, better price performance for them.

Apple (NASDAQ: AAPL)

Apple is currently shipping an LLM (large language model) on the iPhone 16 where some of the queries are being handled on the device itself

As you know, we’re shipping an LLM on the iPhone 16 today. And some of the queries that are being used by our customers are on-device, and then others go to the private cloud where we’ve essentially mimicked the security and privacy of the device into the cloud. And then others, for world knowledge, are with the integration with ChatGPT.
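The quote describes a three-tier routing scheme: on-device, Apple's Private Cloud Compute, or ChatGPT. A toy sketch of that decision, assuming invented heuristics and names (this is not Apple's actual routing logic):

```python
def route_query(query: str, fits_on_device: bool, needs_world_knowledge: bool) -> str:
    # World-knowledge questions go out to the ChatGPT integration;
    # everything else stays in Apple's privacy envelope, either
    # on-device or in Private Cloud Compute when the model is too big.
    if needs_world_knowledge:
        return "chatgpt"
    if fits_on_device:
        return "on-device"
    return "private-cloud-compute"

print(route_query("summarize this note", True, False))
print(route_query("rewrite this long email", False, False))
print(route_query("who won the 1998 World Cup?", False, True))
```

The design point in the quote is that the privacy boundary, not just model size, decides the tier: Private Cloud Compute is built to mimic the device's security guarantees, so escalating to it is not a privacy downgrade.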

The new Mac Studio has Apple’s M4 Max and M3 Ultra chips, and it can run large language models with over 600 billion parameters entirely in memory

The new Mac Studio is the most powerful Mac we’ve ever shipped, equipped with M4 Max and our new M3 Ultra chip. It’s a true AI powerhouse capable of running large language models with over 600 billion parameters entirely in memory.

Apple has released VisionOS 2.4 which unlocks the first set of Apple Intelligence features for Vision Pro users

VisionOS 2.4 unlocks the first set of Apple Intelligence features for Vision Pro users while inviting them to explore a curated and regularly updated collection of spatial experiences with the Spatial Gallery app.

Apple’s management has released iOS 18.4, which brings Apple Intelligence to more languages (including Singlish); Apple has built its own foundation models for everyday tasks; new Apple Intelligence features in iOS 18 include Writing Tools, Genmoji, Image Playground, Image Wand, Clean Up, Visual Intelligence, and a seamless connection to ChatGPT

Turning to software. We just released iOS 18.4, which brought Apple Intelligence to more languages, including French, German, Italian, Portuguese, Spanish, Japanese, Korean and simplified Chinese as well as localized English to Singapore and India…

At WWDC24, we announced Apple Intelligence and shared our vision for integrating generative AI across our ecosystem into the apps and features our users rely on every day. To achieve this goal, we built our own highly capable foundation models that are specialized for everyday tasks. We designed helpful features that are right where our users need them and are easy to use. And we went to great lengths to build a system that protects user privacy whether requests are processed on-device or in the cloud with Private Cloud Compute, an extraordinary step forward for privacy and AI.

Since we launched iOS 18, we’ve released a number of Apple Intelligence features from helpful Writing Tools to Genmoji, Image Playground, Image Wand, Clean Up, visual intelligence and a seamless connection to ChatGPT. We made it possible for users to create movies of their memories with a simple prompt and added AI-powered photo search, smart replies, priority notifications, summaries for mail, messages and more. We’ve also expanded these capabilities to more languages and regions.

Apple’s in-house chips are designed with a neural engine that powers AI features across Apple’s products and 3rd-party apps; management thinks the neural engine makes Apple products the best devices for generative AI

AI and machine learning are core to so many profound features we’ve rolled out over the years to help our users live a better day. It’s why we designed Apple silicon with a neural engine that powers so many AI features across our products and third-party apps. It’s also what makes Apple products the best devices for generative AI.

Apple still needs more time to work on the more personalised Siri that was unveiled by management recently

With regard to the more personal Siri features we announced, we need more time to complete our work on these features so they meet our high-quality bar. We are making progress and we look forward to getting these features into customers’ hands.

Apple has low capital expenditures for AI relative to other US technology giants because it uses 3rd-party data centers so they are mostly operating expenses; Apple’s new $500 billion investment in the USA could signal more capital expenditures and data center investments

On the data center side, we have a hybrid strategy. And so we utilize third parties in addition to the data center investments that we’re making. And as I’ve mentioned in the $500 billion, there’s a number of states that we’re expanding in. Some of those are data center investments. And so we do plan on making investments in that area

Arista Networks (NYSE: ANET)

Arista Networks’ management remains confident of reaching $750 million in back-end AI revenue in 2025 even with the uncertainty surrounding US tariffs; the 1:1 ratio between front-end and back-end AI spending for Arista Networks’ products still remains, but management thinks it’s increasingly hard to parse between front-end and back-end

Our cloud and AI momentum continues as we remain confident of our $750 million front-end AI goal in 2025…

…Just a quick clarification before we go into Q&A. Jayshree meant we were reiterating our back-end goal of $750 million, not front-end AI…

…[Question] Is that 1:1 ratio for the front end and back end still intact in your perspective?

[Answer] On the front-end ratio, yes, we’ve said it’s generally 1:1. It’s getting harder and harder to measure front end and back end. Maybe we’ll look at the full AI cluster differently next year. But I think 1:1 is still a good ratio. It varies. Some of them just build a cluster and don’t worry about the front end and others worry about it entirely holistically. So it does vary, but I think the 1:1 is still a good ratio…

…[Question] You reiterated the $750 million back-end target, but you’ve also had this $1.5 billion overall AI target for 2025. And just wondering, is the achievability of that more dependent on the tariffs, given some of the front-end spend?

[Answer] Regarding tariffs, I don’t think it will have a material difference on the $750 million number or the $1.5 billion. We got the demand. So unless we have some real trouble shipping it or customers change their mind, I think we’re good with both those targets for the year.
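The arithmetic connecting the two targets in this exchange, as a quick sanity check (the variable names are mine):

```python
backend_goal = 750_000_000     # stated 2025 back-end AI goal, in dollars
front_to_back_ratio = 1.0      # management's rough 1:1 front-end to back-end ratio

# A 1:1 ratio implies the combined AI opportunity is twice the back-end goal.
total_ai_target = backend_goal * (1 + front_to_back_ratio)
print(f"${total_ai_target / 1e9:.1f} billion")  # matches the $1.5 billion figure
```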

Arista Networks is progressing well with its 4 major AI customers; 1 of the 4 customers has been on NVIDIA’s Infiniband solution for a long time, so they’ll be small for Arista Networks; 2 of the 4 are heading towards 50,000 GPU deployments by end-2025, maybe even 100,000 GPUs; 3 of the 4 customers are already in production with the 4th progressing well towards production; management has a lot of visibility from the 4 major AI customers for 2025 and 2026 and it’s looking good; the 4 major AI customers are mostly deploying Arista Networks’ 800-gig switches

We are progressing well in all 4 customers and continue to add smaller ones as well…

…Let me start with the 4 customers. All of them are progressing well. One of them is still new to us. They’ve been in Infiniband for a long time, so they’ll be small. I would say 2 of them are heading towards 50,000 GPU deployments by end of the year; maybe they’ll be at 100,000, but I can be most certainly sure of 50,000, heading to 100,000. And then the other one is also in production. So I had talked about all 4 going into production. Three are already in production, the fourth one is well underway…

…[Question] If I can go back to the 4 Tier 1s that you’re working with on the AI back end and the progress that you updated on that front. Are these customers now giving you more visibility, just given the tariff landscape and that you would need to build inventory for some of the finished goods? And can you just update us on how they’re handling the situation on that front? And particularly then, as you think about — I think the investor focus is a lot about 2026 and potential changes in the CapEx landscape from these customers at that point. Are you getting any forward visibility from them? Any early signs for 2026 on these customers?

[Answer] We definitely have all the visibility in the world for this year, and we’re feeling good. We’re getting unofficial visibility because they all know our lead times are tied to some fairly long lead times from our partners and suppliers. So I would say 2026 is looking good. And based on our execution of 2025 and plans we’re putting together, we should have a great year in 2026 as well for AI sector specifically…

…[Question] Do you see the general cadence of hyperscalers deploying 800-gig switch ports this year? I ask because I believe your Etherlink family of switches became generally available in late 2024.

[Answer] I alluded to this earlier: in 2024, the majority of our AI trials were on 400 gig. So you’re right to observe that, with our Etherlink portfolio really getting introduced in the second half of ’24, a lot of our 800-gig activity has picked up in 2025, some of which will be reflected in shipments and some of which will be part of our deferred. So it’s a good observation and an accurate one that this is the year of 800, like last year was the year of 400.

Arista Networks’ management plans for the company to be the premier and preferred network for NVIDIA’s next-generation GPUs; Arista Networks’ Etherlink portfolio makes it easy to identify and localise performance issues in accelerated compute AI clusters

At the GTC event in March of 2025, we heard all about NVIDIA’s planned GPU road map every 12 to 18 months, and Arista intends to be the premier and preferred scale-out network for all of those GPUs and AI accelerators. Traditional GPUs have collective communication libraries, or CCLs as they’re known, that try to discover the underlying network topology using localization techniques. With this accelerated compute approach, the discrepancies between the discovered topology and the one that actually happens can impact AI job completion times. Arista’s Etherlink portfolio highlights the accelerated networking approach, bringing that single point of network control and visibility as a differentiation. This makes it extremely crisp to identify and localize performance issues, especially as the size of the AI cluster grows to 50,000 and 100,000 XPUs with the Arista AI Spine and leaf network designs.

Arista Networks’ campus portfolio provides cost-effective access points for agentic AI applications

Arista’s cognitive campus portfolio features our advanced spine with Power over Ethernet wired leaf capabilities, along with a wide range of cost-effective wireless Wi-Fi 7 indoor and outdoor access points for the newer IoT and agentic applications.

The data center ecosystem is still somewhat new to AI and the suppliers are figuring things out together

But everybody is new to AI, they’ve never really put together a network design for 4-rail or 8-rail, or how does it connect into the GPUs, and what is the NIC [network interface card] attachment? What are the accessories in terms of cables or optics that connect? So this movement from trials to production causes us to bring a whole ecosystem together for the first time.

Arista Networks’ management thinks that when it comes to AI use-cases, Arista Networks’ products will play a far bigger role than whitebox networking manufacturers, even though whiteboxes will always be around and management is even happy to help customers build networking solutions that encompass both Arista Networks’ products and whiteboxes; Arista Networks was able to help a small AI customer build a network for a cluster of a few hundred GPUs very quickly after the customer struggled to do so with whiteboxes

I’ve always said that white box is not new. It’s been with us since the beginning of time. In fact, when Arista got started, a couple of our customers had already implemented internally various implementations of white box. So there is a class of customers who will make the investments in engineering and operations to build their own network and manage it. And it’s a very different business model. It operates typically at 10% gross margins. I don’t think you want Arista to go there. And it’s very hardware-centric and doesn’t require the rich software foundation and investments that we’ve made. So first, I’ll start by saying we always have and will continue to coexist with white box. There are times, and you’ve noticed this too, when because Arista builds some very superior hardware, even if customers don’t use our EOS, they like to have our blue box, as I often call it: the Arista hardware that’s engineered much better than any others, with a more open OS like Sonic or FBOSS, or at least the attributes of running both EOS and an open-source networking system. So I think we view this as a natural part of selection in a customer base where, if it’s a simple use case, they’re going to use something cost effective. But if it’s a really complex use case, like the AI spine, or roles that require and demand more mission-critical features, Arista always plays a far bigger role in premium, highly scalable, highly valued software and hardware combinations than we do in a stand-alone white box. So we’ll remain coexistent peacefully, and we’re not in any way threatened by it. In fact, I would say we work with our customers to make sure, as they’re building permutations and combinations of the white box, that we can work with that and build the right complement to it with our Etherlink portfolio…

…We had a customer, again, not material. They said, “I can’t get these boxes. I can’t make them run. I cannot get an AI network.” And one of my most technical sales leaders said, hey, we got a chance to build an AI cluster here for a few hundred GPUs. We jumped on it. Obviously, that customer is small and has been largely using white boxes, and is now about to install an AI leaf and an AI spine, and we had to get it to him before the tariff deadline. So that’s an example of something not material, but of how quickly these decisions get made when you have the right product, right performance, right quality, right mission-critical nature, and you can deal with that traffic pattern better than anyone else can. So it happens. It’s not big because we’ve got so much commitment in a given quarter from a customer, but when it is, we act with a great deal of nimbleness and agility to do that.

Arista Networks’ management is happy to support any kind of advanced packaging technologies – such as co-packaged optics or co-packaged copper – for back-end AI networks in the company’s products; management has yet to see any major adoption of co-packaged optics for back-end AI networks

[Question] I’d love to get your latest views around co-packaged optics. NVIDIA introduced its first CPO switches at GTC for scale-out. And I was wondering whether that had any impact on your views regarding CPO adoption in back-end AI networks in coming years.

[Answer] It’s had no impact. It’s very early days. I think you’ve seen — Arista doesn’t build optics, but Arista enables optics, and we’ve always been at the forefront, especially with Andy Bechtolsheim and his team of talented tech individuals, whether it is pluggable optics with LPO or how we define the OSFP connector for MSAs or 100 gig, 400 gig. It’s something we take seriously. And our views on CPO: it’s not a new idea. It’s been demonstrated in prototype for, I don’t know, 10 to 20 years. The fundamental reason for the lack of adoption to date on CPO is its relatively high failure rates, and it’s mostly been in the labs. So what are some of the advantages of CPO? Well, it has a linear interface. It has lower power than DSP for long-haul optics. It has a higher channel count. And I think if pluggable optics can achieve some of that in the best of both worlds, then you can overcome that with pluggable optics or even co-packaged copper. So Arista has no religion. We will do co-packaged copper. We’ll do co-packaged optics. We will do pluggable optics. But it’s too early to call this a real production-ready product; it’s still in very early experiments and trials.

Arista Networks’ management is not seeing any material pull-forward in demand for its products because of US tariffs

[Question] We know tariffs are coming later in the year. Whether the strength you’re seeing is the result of early purchases of customers ahead of tariffs in order to save some dollars?

[Answer] Even if our customers tried to pull it in and get it all by July, we would be unable to supply it. So that would be the first thing. So I’m not seeing pull-ins that are really material in any fashion. I am seeing a few customers trying to save $1 here, $1 there to try and ship it before the tariff date, but nothing material. Regarding pull-ins for 4 to 6 quarters, again, our best visibility is near term. And if we saw that kind of behavior, we would see a lot of inventory sitting in our customers, which we don’t. In fact, they’re asking us to ship faster and ship more.

2 years ago, Arista Networks’ management saw all its Cloud Titan customers pivot to AI and slow down their cloud spending; management is seeing more balanced spending now, with a more surgical focus on AI

2 years ago, I was very nervous because the entire cloud titans pivoted to AI and slowed down their cloud. Now we see a more balanced spend. And while we can’t measure how much of this is cloud and how much of it is AI, if they’re kind of cobbled together, we are seeing less of a pivot, more of a surgical focus on AI, and then a continued upgrade of the cloud networks as well. So compared to ’23, I would say the environment is much more balanced between AI and cloud.

Arista Networks’ management sees competitive advantages in the company’s hardware design, development, and operation that are hard to replicate even for its Cloud Titan customers

[Question] What functionality about the blue box actually makes it defensible versus what hyperscalers can kind of self-develop?

[Answer] Let me give you a few attributes of what I call the blue box, and I’m not saying others don’t have it, but Arista has built this as a mission, although we’re known for our software. We’re just as well known for our hardware. When you look at everything from a form factor of a one RU that we build to a chassis, we’ve got a tremendous focus on signal integrity, for example, all the way from layer 1, multilayer PCB boards, a focus on quality, a focus on driving distances, a focus on integrating optics for longer distances, a focus on driving MACsec, et cetera. So that’s a big focus. The second is hardware diagnostics. Internal to the company, we call it Arista boot. We’ve got a dedicated team focused on not just the hardware but the firmware to make it all possible in terms of troubleshooting because when these boards get super complex, you know where the failure is and you’re running at high-speed 200 [indiscernible] 30s. So things are very complex. So the ability to pinpoint and troubleshoot is a big part of what we do. And then there’s additional focus on the mechanical, the power supplies, the cooling, all of which translate to better power characteristics. Along with our partners and chip vendors, there’s a maniacal focus on not just high performance but low power. So some of the best attributes come from our blue boxes, not only for 48 ports, but all the way up to 576 ports of an AI spine or double that if you’re looking for dual capabilities. So well-designed, high-quality hardware is a thing of beauty, but also think of complexity that not everyone can do.

With neo AI cloud customers, Arista Networks’ management is observing that they are very willing to forsake NVIDIA’s GPUs and networking solutions and try other AI accelerators and Ethernet; management thinks that the establishment of the Ultra Ethernet Consortium in 2024 has a role to play in the increasing adoption of Ethernet for AI networking; with the Cloud Titans, management is also observing a shift towards Ethernet; management thinks that the shift from Infiniband to Ethernet is faster than the shift from NVIDIA’s GPUs to other companies’ GPUs

[Question] There’s a general perception that most of them are buying NVIDIA-defined clusters and networking. So I wonder if you could comment on those trends, their interest in moving past InfiniBand? And also are there opportunities developing with some of these folks to kind of multi-source their AI connectivity to different providers?

[Answer] We’re seeing a more adventurous spirit in the neo-cloud customers because they want to try alternatives. So some of them are absolutely trying other AI accelerators, like Lisa and my friends at AMD. Some of them are absolutely looking at Ethernet, not InfiniBand, as a scale-out. And that momentum has really shifted in the last year with the Ultra Ethernet Consortium and the spec coming out in May. I just want to give a shout-out to that team and what we have done. So I think Ethernet is a given; there’s an awful lot of legacy InfiniBand that will obviously sort itself out. And with a new class of AI accelerators, we are seeing more niche players and more internal developments from the cloud titans, all of which is mandating more Ethernet. So I think between your 2 questions, I would say the progress from InfiniBand to Ethernet is faster; the progress from the high-performance GPU they know from NVIDIA to the others is still taking time.

ASML (NASDAQ: ASML)

ASML’s management still sees AI (artificial intelligence) as the key growth driver; ASML will hit upper range of guidance for 2025 if AI demand continues to be strong, while ASML will hit the lower range of guidance if there is uncertainty among its customers

Consistent with our view from last quarter, the growth in artificial intelligence remains the key driver for growth in our industry. If AI demand continues to be strong and customers are successful in bringing on additional capacity to support the demand, there is a potential opportunity towards the upper end of our range. On the other hand, there is still quite some uncertainty for a number of our customers that can lead to the lower end of our range. 

ASML’s management is still positive on the long-term outlook for ASML, with AI being a driver for growth

Looking longer term, the semiconductor market remains strong with artificial intelligence, creating growth in recent quarters, and we see some of the future demand for AI solidifying, which is encouraging. 

ASML’s management thinks inference will become a larger part of AI demand going forward

I think there has been a lot of emphasis in the past quarters on the training side of life. I think more and more, which I think is logical, that you also see more and more emphasis being put on the inferencing side of the equation. So I think you will see the inferencing part becoming a larger component of AI demand on a go-forward basis.

ASML’s management is unable to tell what 2027 will look like for AI demand, but the commitment to AI chips in the next 2 years is very strong

You are looking at major investment: investment that has been committed, investment that a lot of companies believe they have to make in order to basically enter this AI race. I think the threshold to change this behavior is pretty high. And this is why — this is what our customers are telling us. And that’s also why we mentioned that, based on those conversations, we still see ’25, ’26 as growth years. That’s largely driven by AI and by that dynamic. Now ’27 starts to be a bit further away, so you’re asking us too much, I think, to be able to answer basically what AI may look like in ’27. But if you look at the next couple of years, so far, the commitment to the AI investment and, therefore, the commitment also to deliver the chips for AI has been very solid.

Coupang (NYSE: CPNG)

Coupang’s management is investing in automation (such as automated picking, packing and sorting) and machine learning to deploy inventory more precisely to improve the customer experience and reduce costs

This quarter, we saw benefits from advances in our automated picking, packing and sorting systems and machine learning utilization that deploys inventory with more precise prediction of demand. This, coupled with our focus on operational excellence, enables us to continually improve the customer experience while also lowering their cost of service.

Datadog (NASDAQ: DDOG)

Existing customer usage growth in 2025 Q1 was in line with management’s expectations; management is seeing high growth in Datadog’s AI cohort, and stable growth in the other cohorts

Overall, we saw trends for usage growth from existing customers in Q1 that were in line with our expectations. We are seeing high growth in our AI cohort as well as consistent and stable growth in the rest of the business.

Datadog’s management continues to see increasing interest in next-gen AI capabilities and analysis; 4,000 Datadog customers at the end of 2025 Q1 used 1 or more Datadog AI integrations (was 3,500 in 2024 Q4), up 100% year-on-year; the number of companies using end-to-end data observability to manage model performance, security, and quality has more than doubled in the past 6 months; management has observed that data observability has become a big enabler of building AI workloads; the acquisition of Metaplane helps Datadog build towards a comprehensive data observability suite; management thinks data observability will be a big opportunity for Datadog

We continue to see rising customer interest in next-gen AI capabilities and analysis. At the end of Q1, more than 4,000 customers used one or more Datadog AI integrations, and this number has doubled year-over-year. With end-to-end data observability, we are seeing continued growth in customers and usage as they seek to manage end-to-end model performance, security and quality. I’ll call out the fact that the number of companies using end-to-end data observability has more than doubled in the past 6 months…

…[Question] What the vision is about moving into data observability and how consequential an opportunity it could be for Datadog?

[Answer] The field is evolving into a big enabler, or it can be a negative if you don’t do it right, for building enterprise workloads — for AI workloads, sorry. So in other words, making sure the data is being extracted from the right place, transformed the right way and is being fed into the right AI models on the other end…

…We only had some building blocks for data observability. We built a data streams monitoring product for streaming data that comes out of queues, such as Kafka, for example. We built a batch job monitoring product that monitors batch jobs and large transformation jobs. We have a database monitoring product that looks at the way you optimize queries and optimize database performance and cost. And by adding data quality and data pipelines, with Metaplane, we have a full suite basically that allows our customers to manage everything from getting the data from their core data storage into all of the products and AI workloads and reports they need to go populate that data. And so we think it’s a big opportunity for us.

Datadog’s management has improved Bits AI, and is using next-gen AI to help solve customer issues quickly and move towards auto remediation

We are adding to Bits AI, with capabilities for customers to take action with workflow automation and App Builder, using next-gen AI to help our customers remediate issues more quickly and move towards auto remediation in the future.

Datadog has made 2 recent acquisitions; Eppo is a feature management and experimentation platform; management sees automated experimentation as an important part of modern application development because of the use of AI in coding; Metaplane is a data observability platform that works well for new enterprise AI workloads; management is seeing more AI-written code in both its customers and the company itself; management thinks that as AI writes more code, more value will come from being able to observe and understand the AI-written code in production environments, which is Datadog’s expertise; the acquisitions of Eppo and Metaplane are to position Datadog for the transition towards a world of AI-written code

We recently announced a couple of acquisitions.

First, we acquired Eppo, a next-generation feature management and experimentation platform. The Eppo platform helps increase the velocity of releases, while also lowering risk by helping customers to release and validate features in a controlled manner. Eppo augments our efforts in product analytics, helping customers improve the variance and tie feature performance to business outcomes. More broadly, we see automated experimentation as a key part of modern application development, with the rapid adoption of AI-generated code, as well as more and more of the application logic itself being implemented with nondeterministic AI models.

Second, we also acquired Metaplane, the data observability platform built for modern data teams. Metaplane helps prevent, detect and resolve data availability and quality issues across the company’s data warehouses and data pipelines. We’ve seen for several years now that data freshness and quality were critical for applications and business analytics. And we believe that they are becoming key enablers of the creation of new enterprise AI workloads, which is why we intend to integrate the Metaplane capabilities into our end-to-end data observability offerings…

…There is definitely a big transition that is happening right now, like we see the rise of AI written code. We see it across our customers. We also see it inside of Datadog, where we’ve had very rapid adoption of this technology as well…

…The way we see it is that it means that there’s a lot less value in writing the code itself, like everybody can do it pretty quickly, can do a lot of it. You can have the machine do a lot of it, and you complement it with a little bit of your own work. But the real difficulty is in validating that code, making sure that it’s safe, making sure it runs well, that it’s performing and that it does what it’s supposed to do for the business. Also making sure that when 15 different people are changing the code at the same time, all of these different changes come together and work the right way, and you understand the way these different pieces interact. So the way we see it is this moves a lot of the value from writing the code to observing it and understanding it in production environments, which is what we do. So a lot of the investments we’re making right now, including some of the acquisitions we’ve announced, build towards that, and making sure that we’re in the right spot.

Datadog signed a 7-figure expansion deal with a leading generative AI company; the generative AI company needs to reduce tool fragmentation; the generative AI company is replacing commercial tools for APM (application performance monitoring) and log management with Datadog, and is expanding to 5 Datadog products

We signed a 7-figure annualized expansion contract with a leading next-gen AI company. This customer needs to reduce tool fragmentation to keep on top of its hypergrowth in usage and employee headcount. With this expansion, the customer will use 5 Datadog products and will replace commercial tools for APM and log management.

AI-native customers accounted for 8.5% of Datadog’s ARR in 2025 Q1 (was 6% in 2024 Q4); AI-native customers contributed 6 percentage points to Datadog’s year-on-year growth in 2025 Q1, compared to 2 percentage points in 2024 Q1; management thinks AI-native customers will continue to optimise cloud and observability usage in the future; AI-native contracts that come up for renewal are healthy; Datadog has huge customer concentration with the AI-native cohort; Datadog has more than 10 AI-native customers that are spending $1 million or more with Datadog; the strong performance of the AI-native cohort in 2025 Q1 is fairly broad-based; Datadog is helping the AI-native customers mostly with inference, and not training; when Datadog sees growth among AI-native customers, that’s growth of AI adoption because the AI-native customers’ workloads are mostly customer-facing

We saw a continued rise in contribution from AI-native customers who represented about 8.5% of Q1 ARR, up from about 6% of ARR last quarter and up from about 3.5% of ARR in the year ago quarter. AI-native customers contributed about 6 points of year-over-year revenue growth in Q1 versus about 5 points last quarter and about 2 points in the year ago quarter. We continue to believe that adoption of AI will benefit Datadog in the long term, but we remain mindful that we may see volatility in our revenue growth on the backdrop of long-term volume growth from this cohort as customers renew with us on different terms and as they may choose to optimize cloud and observability usage…

…[Question] Could you talk about what you’re seeing from some of those AI-native contracts that have already come up for renewal and just how those conversations have been trending?

[Answer] All the contracts that come up for renewal, they are healthy. The trick with the cohort is that it’s growing fast. There’s also a revenue concentration there. We now have our largest customer in the cohort, and they’re growing very fast. And on the flip side of that, we also have a larger number of large customers that are also growing. So we — I think we mentioned more than 10 customers now that are spending $1 million or more with us in that AI-native cohort and that are also growing fast…

…On the AI side, we do have, as I mentioned, one large customer, and they’re contributing more of the new revenue than the others. But we see growth in the rest of the cohort as well. So again, it’s fairly typical…

…For the AI natives, actually, what we help them with mostly is not training. It’s running their applications and their inference workloads, which are customer-facing. Because training for the AI natives tends to be largely homegrown, one-off and different between each and every one of them. We expect that as and if most other companies and enterprises do significant training, that this will not be the case. This will not be one-off and homegrown. But right now, it is still the AI natives that do most of the training, and they still do it in a way that’s largely homegrown. So when we see growth in the AI-native cohort, that’s growth of AI adoption because that’s growth of customer-facing workloads by and large.

Datadog’s management sees the trend of cloud migration as being steady; management sees cloud migration being partly driven by customers’ desires to adopt AI, because migrating to the cloud is a prerequisite for AI

[Question] What are the trend lines on the cloud migration side?

[Answer] It’s consistent with what we’ve seen before. It’s also consistent with what you’ve heard from the hyperscalers over the past couple of weeks. So I would say it’s steady, unremarkable. It’s not really trending up nor trending down right now. But we see the same desire from customers to move more into the cloud and to lay the groundwork so they can also adopt AI, because digital transformation and cloud migrations are prerequisites for that.

Datadog’s management thinks there will be more products for Datadog to build as AI workloads shift towards inferencing; management is seeing its LLM Observability product getting increasing usage as customers move AI workloads into production; management wants to build more products across the stack, from closer to the GPU to AI agents

On the workloads turning more towards inference, there’s definitely more product to build there. We built an LLM Observability product that is getting increasing usage from customers as they move into production. And we think there’s more that we need to build, both down the stack closer to the GPUs and up the stack closer to the agents that are being built on top of these models.

Datadog’s management is already seeing returns on Datadog’s internal investments in AI in terms of employee productivity; in the long term, there’s the possibility that Datadog may need less headcount because of AI

[Question] Internally, how do you think about AI from an efficiency perspective?

[Answer] For right now, I think we’re seeing the returns in productivity, whether that be salespeople getting more information or R&D. We’re essentially trying to create an environment where we’re encouraging the various departments to use it and learning from it. Long term, there might well be efficiency gains — there may be efficiency gains that can be manifested in headcount.

Mastercard (NYSE: MA)

Mastercard’s management sees contactless payments and tokenised transactions as important parts of agentic AI digital commerce; Mastercard has announced Mastercard Agent Pay, which will facilitate safe, frictionless and programmable transactions across AI platforms; Mastercard is working with important AI companies such as Microsoft and OpenAI to deliver agentic payments

Today, 73% of all in-person switched transactions are contactless and approximately 35% of all our switched transactions are tokenized. These technologies will continue to play an important role as we move forward into the next phase of digital commerce, such as agentic AI. We announced Mastercard Agent Pay to leverage our agentic tokens as well as franchise rules, fraud and cybersecurity solutions. Combined, these will help partners like Microsoft to facilitate safe, frictionless and programmable transactions across AI platforms. We will also work with companies like OpenAI to deliver smarter, more secure and more personalized agentic payments. The launch of Agent Pay is an important step in redefining commerce in the AI era.

Mastercard closed the Recorded Future acquisition in 2024 Q4 (Recorded Future provides AI-powered solutions for real-time visibility into potential threats related to fraud); Recorded Future just unveiled the AI-powered Malware Intelligence; Malware Intelligence enables proactive threat prevention

On the cybersecurity front, Recorded Future just unveiled malware intelligence. It’s a new capability enabling proactive threat prevention for any business using real-time AI-powered intelligence insights.

Mastercard’s management sees AI as being deeply ingrained in Mastercard’s business; Mastercard’s access to an enormous amount of data is an advantage for Mastercard in deploying AI; in 2024, a third of Mastercard’s products in its value-added services and solutions segment was powered by AI

AI is deeply ingrained in our business. We have access to an enormous amount of data, and this uniquely positions us to enhance our AI’s performance, resulting in greater accuracy and reliability. And we’re deploying AI to enable many solutions in market today. In fact, in 2024, AI enabled approximately 1 in 3 of our products within value-added services and solutions.

Meta Platforms (NASDAQ: META)

Meta’s management is focused on 5 opportunities within AI, namely improved advertising, more engaging experiences, business messaging, Meta AI, and AI devices; the 5 opportunities are downstream of management’s attempt to build artificial general intelligence and leading AI models and infrastructure in an efficient manner; management thinks the ROI of Meta’s investment in AI will be good even if Meta does not succeed in all 5 opportunities

As we continue to increase our investments and focus more of our resources on AI, I thought it would be useful today to lay out the 5 major opportunities that we are focused on. Those are improved advertising, more engaging experiences, business messaging, Meta AI and AI devices. And these are each long-term investments that are downstream from us building general intelligence and leading AI models and infrastructure. Even with our significant investments, we don’t need to succeed in all of these areas to have a good ROI. But if we do, then I think that we will be wildly happy with the investments that we are making…

…We are focused on building full general intelligence. All of the opportunities that I’ve discussed today are downstream of delivering general intelligence and doing so efficiently.

Meta’s management’s goal with the company’s advertising business is for businesses to simply tell Meta their objectives and budget, and for Meta to do all the rest with AI; management thinks that Meta can redefine advertising into an AI agent that delivers measurable business results at scale

Our goal is to make it so that any business can basically tell us what objective they’re trying to achieve like selling something or getting a new customer and how much they’re willing to pay for each result and then we just do the rest. Businesses used to have to generate their own ad creative and define what audiences they wanted to reach, but AI has already made us better at targeting and finding the audiences that will be interested in their products than many businesses are themselves, and that keeps improving. And now AI is generating better creative options for many businesses as well. I think that this is really redefining what advertising is into an AI agent that delivers measurable business results at scale.

Meta tested a new advertising recommendation model for Reels in 2025 Q1 called Generative Ads Recommendation Model, or GEM, that has improved conversion rates by 5%; 30% more advertisers are using Meta’s AI creative tools in 2025 Q1; GEM is twice as efficient at improving ad performance for a given amount of data and compute; GEM’s better efficiency helped Meta significantly scale up the amount of compute used for model training; GEM is now being rolled out to additional surfaces across Meta’s apps; the initial test of Advantage+’s streamlined campaign creation flow for sales, app and lead campaigns is encouraging and will be rolled out globally later in 2025; Advantage+ Creative is seeing strong adoption; all eligible advertisers can now automatically adjust the aspect ratio of their existing videos and generate images; management is testing a feature that uses gen AI to place clothing on virtual models; management has seen a 46% lift in incremental conversions in the testing of the incremental attribution feature and will roll out the feature to all advertisers in the coming weeks; improvements in Meta’s advertising ranking and modeling drove conversion growth that outpaced advertising impressions growth in 2025 Q1

In just the last quarter, we are testing a new ads recommendation model for Reels, which has already increased conversion rates by 5%. We’re seeing 30% more advertisers using AI creative tools in the last quarter as well…

…In Q1, we introduced our new Generative Ads Recommendation Model, or GEM, for ads ranking. This model uses a new architecture we developed that is twice as efficient at improving ad performance for a given amount of data and compute. This efficiency gain enabled us to significantly scale up the amount of compute we use for model training with GEM trained on thousands of GPUs, our largest cluster for ads training to date. We began testing the new model for ads recommendations on Facebook Reels earlier this year and have seen up to a 5% increase in ad conversions. We’re now rolling it out to additional surfaces across our apps…

…We’re seeing continued momentum with our Advantage+ suite of AI-powered solutions. We’ve been encouraged by the initial test of our streamlined campaign creation flow for sales, app and lead campaigns, which starts with Advantage+ turned on from the beginning for advertisers. In April, we rolled this out to more advertisers and expect to complete the global rollout later this year. We’re also seeing strong adoption of Advantage+ Creative. This week, we are broadening access of video expansion to Facebook Reels for all eligible advertisers, enabling them to automatically adjust the aspect ratio of their existing videos by generating new pixels in each frame to optimize their ads for full screen surfaces. We also rolled out image generation to all eligible advertisers. And this quarter, we plan to continue testing a new virtual try-on feature that uses gen AI to place clothing on virtual models, helping customers visualize how an item may look and fit…

…We continue to evolve our ads platform to drive results that are optimized for each business’ objectives and the way they measure value. One example of this is our incremental attribution feature, which enables advertisers to optimize for driving incremental conversions or conversions we believe would not have occurred without an ad being shown. We’re seeing strong results in testing so far with advertisers using incremental attribution in tests seeing an average 46% lift in incremental conversions compared to their business-as-usual approach. We expect to make this available to all advertisers in the coming weeks…

…Year-over-year conversion growth remains strong. And in fact, we continue to see conversions grow at a faster rate than ad impressions in Q1, so reflecting increased conversion rates. And ads ranking and modeling improvements are a big driver of overall performance gains.

Improvements in the past 6 months to Meta’s content recommendation systems have driven increases of 7% in time spent on Facebook, 6% on Instagram, and 35% on Threads; video consumption in Facebook and Instagram grew strongly in 2025 Q1 because of improvements to Meta’s content recommendation systems; management sees opportunities for further gains in improving the content recommendation systems in 2025; Meta is making progress on longer-term efforts to improve its content recommendation systems in two areas, (1) develop increasingly efficient recommendation systems by incorporating innovations from LLM model architectures, and (2) integrating LLMs into content recommendation systems to better identify what is interesting to a user; management’s testing of Llama in Threads’ recommendation systems has led to a 4% increase in time spent from launch; management is exploring how Llama can be deployed in recommendation systems for photo and video content, which management expects can improve Meta AI’s personalisation by better understanding users’ interests and preferences through their use of Meta’s apps; management launched a new feed in Instagram in the US in 2025 Q1 of content a user’s friends have left a note on or liked and the new feed is producing good results; management has launched the Blend experience that blends a user’s Reels algorithm in direct messages with friends; the increases of 7% in time spent on Facebook and 6% on Instagram seen in the last 6 months is on top of uplift in time spent on Facebook and Instagram that management had already produced in the first 9 months of 2024

In the last 6 months, improvements to our recommendation systems have led to a 7% increase in time spent on Facebook, 6% increase on Instagram and 35% on Threads…

…In the first quarter, we saw strong growth in video consumption across both Facebook and Instagram, particularly in the U.S., where video time spent grew double digits year-over-year. This growth continues to be driven primarily by ongoing enhancements to our recommendation systems, and we see opportunities to deliver further gains this year.

We’re also progressing on longer-term efforts to develop innovative new approaches to recommendations. A big focus of this work will be on developing increasingly efficient recommendation systems so that we can continue scaling up the complexity and compute used to train our models while avoiding diminishing returns. There are promising techniques we’re working on that will incorporate the innovations from LLM model architectures to achieve this. Another area that is showing early promise is integrating LLM technology into our content recommendation systems. For example, we’re finding that LLM’s ability to understand a piece of content more deeply than traditional recommendation systems can help better identify what is interesting to someone about a piece of content, leading to better recommendations.

We began testing using Llama in Threads recommendation systems at the end of last year given the app’s text-based content and have already seen a 4% lift in time spent from the first launch. It remains early here, but a big focus this year will be on exploring how we can deploy this for other content types, including photos and videos. We also expect this to be complementary to Meta AI as it can provide more relevant responses to people’s queries by better understanding their interests and preferences through their interactions across Facebook, Instagram and Threads…

…In Q1, we launched a new experience on Instagram in the U.S. that consists of a feed of content your friends have left a note on or liked, and we’re seeing good results. We also just launched Blend, which is an opt-in experience in direct messages that enables you to blend your Reels algorithm with your friends to spark conversations over each other’s interest…

…We shared on the Q3 2024 call that improvements to our AI-driven feed and video recommendations drove a roughly 8% lift in time spent on Facebook and a 6% lift on Instagram over the first 9 months of last year. Since then, we’ve been able to deliver similar gains in just 6 months’ time with improvements to our AI recommendations delivering 7% and 6% time spent gains on Facebook and Instagram, respectively.

AI is enabling the creation of better content on Meta’s apps; the better content includes AI generating content directly for users and AI helping users produce better content; management thinks that the content created on Meta’s apps will be increasingly interactive over time; management recently launched the stand-alone Edits app that contains an ultra-high resolution, short-form video camera, and generative AI tools to remove backgrounds of video or animate still images; more features on Edits are coming soon

AI is also enabling the creation of better content as well. Some of this will be helping people produce better content to share themselves. Some of this will be AI generating content directly for people that is personalized for them. Some of this will be in existing formats like photos and videos, and some of it will be increasingly interactive…

…Our feeds started mostly with text and then became mostly photos when we all got mobile phones with cameras and then became mostly video when mobile networks became fast enough to handle that well. We are now in the video era, but I don’t think that this is the end of the line. In the near future, I think that we’re going to have content in our feeds that you can interact with and that it will interact back with you rather than you just watching it…

…Last week, we launched our stand-alone Edits app, which supports the full creative process for video creators from inspiration and creation to performance insights. Edits has an ultra-high resolution, short-form video camera and includes generative AI tools that enable people to remove the background of any video or animate still images with more features coming soon.

Countries like Thailand and Vietnam with low-cost labour actually conduct a lot of business through Meta’s messaging apps but management thinks this phenomenon is absent in developed economies because of the high cost of labour; management thinks that AI will allow businesses in developed economies to conduct business through Meta’s messaging apps; management thinks that every business in the future will have AI business agents that are easy to set up and can perform customer support and sales; Meta is currently testing AI business agents with small businesses in the USA and a few countries across Meta’s apps; management has launched a new agent management experience to make it easier for businesses to train their AI; management’s vision is for there to be one agent that interacts with a consumer regardless of where he/she is engaging with the business AI; feedback from the tests is that the AI business agents are saving businesses a lot of time and helping them determine which conversations to spend more time on

In countries like Thailand and Vietnam, where there is a low cost of labor, we see many businesses conduct commerce through our messaging apps. There’s actually so much business through messaging that those countries are both in our top 10 or 11 by revenue even though they’re ranked in the 30s in global GDP. This phenomenon hasn’t yet spread to developed countries because the cost of labor is too high to make this a profitable model before AI, but AI should solve this. So in the next few years, I expect that just like every business today has an e-mail address, social media account and website, they’ll also have an AI business agent that can do customer support and sales. And they should be able to set that up very easily given all the context that they’ve already put into our business platforms…

…We are currently testing business AIs with a limited set of businesses in the U.S. and a few additional countries on WhatsApp, Messenger and on ads on Facebook and Instagram. We’ve been starting with small business and focusing first on helping them sell their goods and services with business AIs…

…We’ve launched a new agent management experience and dashboard that makes it easier for businesses to train their AI based on existing information on their website or WhatsApp profile or their Instagram and Facebook pages. And we’re starting with the ability for businesses to activate AI in their chats with customers. We are also testing business AIs on Facebook and Instagram ads that you can ask about product and return policies or assist you in making a purchase within our in-app browser…

…No matter where you engage with the business AI, it should be one agent that recalls your history and your preferences. And we’re hearing encouraging feedback, particularly that adopting these AIs is saving the businesses that we’re testing with a lot of time and helping to determine which conversations make sense for them to spend more time on.

Meta AI now has nearly 1 billion monthly actives; management’s focus for Meta AI in 2025 is to establish Meta AI as the leading personal AI for personalization, voice conversations, and entertainment; management thinks people will eventually have an AI to talk to throughout the day on smart-glasses and this AI will be one of the most important and valuable services that has ever been created; management recently released the first Meta AI stand-alone app; the Meta AI stand-alone app is personalised to the user’s behaviour on other Meta apps, and it also has a social feed for discovery on how others are using Meta AI; initial feedback on the Meta AI stand-alone app is good; management expects to focus on scaling and deepening engagement on Meta AI for at least the next year before attempting to monetise; management saw engagement on Meta AI improve when testing Meta AI’s ability to personalize responses by remembering people’s prior queries and their usage of Meta’s apps; management has built personalisation into Meta AI across all of Meta’s apps; the top use cases for Meta AI currently include information gathering, writing assistance, interacting with visual content, and seeking help; WhatsApp has the strongest usage of Meta AI, followed by Facebook; a standalone Meta AI app is important for Meta AI to become the leading personal AI assistant because WhatsApp is currently not the primary messaging app used in the USA; management thinks that people are going to use different AI agents for different things; management thinks having memory of a user will be a differentiator for AI agents

Across our apps, there are now almost 1 billion monthly actives using Meta AI. Our focus for this year is deepening the experience and making Meta AI the leading personal AI with an emphasis on personalization, voice conversations and entertainment. I think that we’re all going to have an AI that we talk to throughout the day, while we’re browsing content on our phones, and eventually, as we’re going through our days with glasses. And I think that this is going to be one of the most important and valuable services that has ever been created.

In addition to building Meta AI into our apps, we just released our first Meta AI stand-alone app. It is personalized. So you can talk to it about interests that you’ve shown while browsing Reels or different content across our apps. And we built a social feed into it. So you can discover entertaining ways that others are using Meta AI. And initial feedback on the app has been good so far.

Over time, I expect the business opportunity for Meta AI to follow our normal product development playbook. First, we build and scale the product. And then once it is at scale, then we focus on revenue. In this case, I think that there will be a large opportunity to show product recommendations or ads as well as a premium service for people who want to unlock more compute for additional functionality or intelligence. But I expect that we’re going to be largely focused on scaling and deepening engagement for at least the next year before we’ll really be ready to start building out the business here…

…Earlier this year, we began testing the ability for Meta AI to better personalize its responses by remembering certain details from people’s prior queries and considering what that person engages with on our apps. We are already seeing this lead to deeper engagement with people we’ve rolled it out to, and it is now built into Meta AI across Facebook, Instagram, Messenger and our new stand-alone Meta AI app in the U.S. and Canada…

…The top use case right now for Meta AI from a query perspective is really around information gathering, as people are using it to search for, understand and analyze information, followed by social interactions ranging from casual chatting to more in-depth discussion or debate. We also see people use it for writing assistance, interacting with visual content, and seeking help…

…WhatsApp continues to see the strongest Meta AI usage across our Family of Apps. Most of that WhatsApp engagement is in one-on-one threads, followed by Facebook, which is the second largest driver of Meta AI engagement, where we’re seeing strong engagement from our feed deep dives integration that lets people ask Meta AI questions about the content that’s recommended to them…

…I also think that the stand-alone app is going to be particularly important in the United States because WhatsApp, as Susan said, is the largest surface that people use Meta AI and which makes sense. If you want to text an AI, having that be closely integrated and a good experience in the messaging app that you use makes a lot of sense. But we’re — while we have more than 100 million people use WhatsApp in the United States, we’re clearly not the primary messaging app in the United States at this point. iMessage is. We hope to become the leader over time. But we’re in a different position there than we are in most of the rest of the world on WhatsApp. So I think that the Meta AI app as a stand-alone is going to be particularly important in the United States to establishing leadership in — as the main personal AI that people use…

…I think that there are going to be a number of different agents that people use, just like people use different apps for different things. I’m not sure that people are going to use multiple agents for the same exact things, but I’d imagine that something that is more focused on kind of enterprise productivity might be different from something that is somewhat more optimized for personal productivity. And that might be somewhat different from something that is optimized for entertainment and social connectivity. So I think there will be different experiences…

…Once an AI starts getting to know you and what you care about in context and can build up memory from the conversations that you’ve had with it over time, I think that will start to become somewhat more of a differentiator.

Meta’s management continues to think of glasses as the ideal form factor for an AI device; management thinks that the 1 billion people in the world today who wear glasses will likely all be wearing smart glasses in the next 5-10 years; management thinks that building the devices people use for Meta’s apps lets the company deliver the best AI and social experiences; sales of the Ray-Ban Meta AI glasses have tripled in the last year and usage of the glasses is high; Meta has new launches of smart glasses lined up for later this year; monthly actives of Ray-Ban Meta AI glasses are up 4x from a year ago, with the number of people using voice commands growing even faster; management has rolled out live translations on Ray-Ban Meta AI glasses to all markets for English, French, Italian and Spanish; management continues to want to scale the Ray-Ban Meta AI glasses to 10 million units or more for its 3rd generation; management intends to run the same monetisation playbook with the Ray-Ban Meta AI glasses as Meta’s other products

Glasses are the ideal form factor for both AI and the metaverse. They enable you to let an AI see what you see, hear what you hear and talk to you throughout the day. And they let you blend the physical and digital worlds together with holograms. More than 1 billion people worldwide wear glasses today, and it seems highly likely that these will become AI glasses over the next 5 to 10 years. Building the devices that people use to experience our services lets us deliver the highest-quality AI and social experiences…

…Ray-Ban Meta AI glasses have tripled in sales in the last year. The people who have them are using them a lot. We’ve got some exciting new launches with our partner, EssilorLuxottica, later this year as well that should expand that category and add some new technological capabilities to the glasses…

…We’re seeing very strong traction with Ray-Ban Meta AI glasses with over 4x as many monthly actives as a year ago. And the number of people using voice commands is growing even faster as people use it to answer questions and control their glasses. This month, we fully rolled out live translations on Ray-Ban Meta AI glasses to all markets for English, French, Italian and Spanish. Now when you are speaking to someone in one of these languages, you’ll hear what they say in your preferred language through the glasses in real time…

…If you look at some of the leading consumer electronics products of other categories, by the time they get to their third generation, they’re often selling 10 million units and scaling from there. And I’m not sure if we’re going to do exactly that, but I think that that’s like the ballpark of the opportunity that we have…

…As a bunch of the products start to hit and start to grow even bigger than the number that I just said is just sort of like the sort of a near-term milestone, then I think we’ll continue scaling in terms of distribution. And then at some point, just like the other products that we build out, we will feel like we’re at a sufficient scale that we’re going to primarily focus on making sure that we’re monetizing and building an efficient business around it.

Meta released the first few Llama 4 models in April 2025 and more Llama 4 models are on the way, including the massive Llama 4 Behemoth model; management thinks leading-edge AI models are critical for Meta’s business, so they want the company to control its own destiny; by developing its own models, Meta is also able to optimise the models for its infrastructure and use-cases; an example of this optimisation is the Llama 4 model with 17 billion parameters per expert, which is designed for low latency to suit voice interactions; another example is the models’ industry-leading context window length, which helps Meta AI’s personalisation efforts; Llama 4 Behemoth is important for Meta because all the models the company is using internally, and some of the models the company will develop in the future, are distilled from Behemoth

We released the first Llama 4 models earlier this month. They are some of the most intelligent, best multimodal, lowest latency and most efficient models that anyone has built. We have more models on the way, including the massive Llama 4 Behemoth model…

…On the LLM, yes, there’s a lot of progress being made in a lot of different dimensions. And the reason why we want to build this out is — one is that we think it’s important that for kind of how critical this is for our business that we sort of have control of our own destiny and are not depending on another company for something so critical. But two, we want to make sure that we can shape the development to be optimized for our infrastructure and the use cases that we want.

So to that end, Llama 4, the shape of the model with 17 billion parameters per expert was designed specifically for the infrastructure that we have in order to provide the low latency experience to be voice optimized. One of the key things, if you’re having a voice conversation with AI, is it needs to be low latency. So that way, when you’re having a conversation with it, there isn’t a large gap between when you stop speaking and it starts. So everything from the shape of the model to the research that we’re doing to techniques that go into it are kind of fit into that.

Similarly, another thing that we focused on was context window length. And in some of our models, we have really — we’re industry-leading on context window length. And part of the reason why we think that that’s important is because we’re very focused on providing a personalized experience. And there are different ways that you can put personalization context into an LLM, but one of the ways to do it is to include some of that context in the context window. And having a long context window that can incorporate a lot of the background that the person has shared across our apps is one way to do that…
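The technique described in the quote above — carrying a user’s background as plain text inside the model’s context window — can be sketched in a few lines. Everything below (the memory store, the rough token heuristic, the prompt format) is an illustrative assumption of mine, not Meta’s actual implementation:

```python
# Minimal sketch of context-window personalization: pack the most recent
# remembered details about a user into the prompt, newest first, until a
# token budget is exhausted. All names and formats here are hypothetical.

def build_personalized_prompt(memories, question, max_context_tokens=8192):
    """Fit as many recent user memories as the context window allows."""
    def est_tokens(text):
        return max(1, len(text) // 4)  # rough heuristic: ~4 chars per token

    budget = max_context_tokens - est_tokens(question)
    selected = []
    for memory in reversed(memories):  # walk from newest to oldest
        cost = est_tokens(memory)
        if cost > budget:
            break  # context window is full; drop older memories
        selected.append(memory)
        budget -= cost

    context = "\n".join(f"- {m}" for m in reversed(selected))
    return f"Known about this user:\n{context}\n\nUser: {question}"
```

A longer context window simply raises `max_context_tokens`, letting more of the user’s history survive the cut — which is the personalization benefit Zuckerberg is pointing at.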

…I think it’s also very important to deliver big models like Behemoth, not because we’re going to end up serving them in production, but because of the technique of distilling from larger models, right? The Llama 4 models that we’ve published so far and the ones that we’re using internally and some of the ones that we’ll build in the future are basically distilled from the Behemoth model in order to get the 90%, 95% of the intelligence of the large model in a form factor that is much lower latency and much more efficient.
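The distillation Zuckerberg describes trains a smaller student model to match a larger teacher’s output distribution rather than just hard labels. A minimal sketch of the core loss (temperature-softened KL divergence, in plain NumPy) — this illustrates the general technique only and reflects nothing about Llama internals:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    A higher temperature exposes the teacher's relative probabilities on
    wrong answers, which is where much of the transferred knowledge lives.
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)), axis=-1)
    # The T^2 factor keeps loss magnitudes comparable across temperatures.
    return float(np.mean(kl) * temperature ** 2)
```

Minimizing this loss over the student’s parameters is what lets a much smaller, lower-latency model recover most of the large model’s behaviour — the “90%, 95% of the intelligence” Zuckerberg mentions.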

Meta’s management is accelerating the buildout of Meta’s AI capacity, leading to higher planned investment for 2025; Meta’s capex growth in 2025 is for both generative AI and core business needs with the majority of overall capex supporting Meta’s core business; management continues to build infrastructure in a flexible way where the company can react to how the AI ecosystem develops in the coming years; management is increasing the efficiency of Meta’s workloads and this has helped the company to achieve strong returns from its core AI initiatives

We are accelerating some of our efforts to bring capacity online more quickly this year as well as some longer-term projects that will give us the flexibility to add capacity in the coming years as well. And that has increased our planned investment for this year…

…Our primary focus remains investing capital back into the business with infrastructure and talent being our top priorities…

…Our CapEx growth this year is going toward both generative AI and core business needs with the majority of overall CapEx supporting the core. We expect the significant infrastructure footprint we are building will not only help us meet the demands of our business in the near term but also provide us an advantage in the quality and scale of AI services we can deliver. We continue to build this capacity in a way that grants us maximum flexibility in how and when we deploy it to ensure we have the agility to react to how the technology and industry develop in the coming years…

…The second way we’re meeting our compute needs is by increasing the efficiency of our workloads. In fact, many of the innovations coming out of our ranking work are focused on increasing the efficiency of our systems. This emphasis on efficiency is helping us deliver consistently strong returns from our core AI initiatives.

Meta’s management sees a number of long-term tailwinds that AI can provide for Meta’s business, including making advertising a larger share of global GDP, and freeing up more time for people to engage in entertainment

Over the coming years, I think that the increased productivity from AI will make advertising a meaningfully larger share of global GDP than it is today…

…Over the long term, as AI unlocks more productivity in the economy, I also expect that people will spend more of their time on entertainment and culture, which will create an even larger opportunity to create more engaging experiences across all of these apps.

Meta’s management still expects to develop an AI coding agent sometime in 2025 that can operate as a mid-level engineer; management expects this AI coding agent to do a substantial part of Meta’s AI research and development in 2026 H2; management is focused on building AI that can run experiments to improve Meta’s recommendation systems

I’d say it’s basically still on track for something around a mid-level engineer kind of starting to become possible sometime this year, scaling into next year. So I’d expect that by the middle to end of next year, AI coding agents are going to be doing a substantial part of AI research and development. So we’re focused on that. Internally, we’re also very focused on building AI agents or systems that can help run different experiments to increase recommendations across our other AI products like the ones that do recommendations across our feeds and things like that.

Microsoft (NASDAQ: MSFT)

Microsoft’s management is seeing accelerating demand across industries for cloud migrations; there are 4 things happening to drive cloud migrations, (1) classic migration, (2) data growth, (3) growth in cloud-native companies’ consumption, and (4) growth in AI consumption, which also requires non-AI consumption 

When it comes to cloud migrations, we saw accelerating demand with customers in every industry, from Abercrombie & Fitch to Coca-Cola and ServiceNow expanding their footprints on Azure…

…[Question] On your comment about accelerating demand for cloud migrations. I’m curious if you could dig in and extrapolate a little more what you’re seeing there.

[Answer] One is, I’ll just say, the classic migration of whether it’s SQL, Windows Server. And so that sort of again got good steady-state progress because the reality is, I think everyone is now, perhaps there’s another sort of kick in the data center migrations just because of the efficiency the cloud provides. So that’s sort of one part.

The second piece is good data growth. You saw some — like Postgres on Azure — I mean, forgetting even SQL server, Postgres on Azure is growing. Cosmos is growing. The analytics stuff I talked about with Fabric. It’s even the others, whether it is Databricks or even Snowflake on Azure are growing. So we feel very good about Fabric growth and our data growth.

Then the cloud-native growth. So this is again before we even get to AI, some of the core compute consumption of cloud-native players is also pretty healthy. It was healthy throughout the quarter. We project it to grow moving forward as well.

Then the thing to notice is the ratio, and I think we mentioned this multiple times before, if you look underneath even ChatGPT, in fact, that team does a fantastic job of thinking about not only their growth in terms of AI accelerators they need, they use Cosmos DB, they use Postgres. They use core compute and storage. And so there’s even a ratio between any AI workload in terms of AI accelerator to others.

So those are the 4 pockets, I’d say, or 4 different trend lines, which all have a relationship with each other.

Foundry is now used by developers in over 70,000 companies, from enterprises to startups, to design, customize and manage their AI apps and agents; Foundry processed more than 100 trillion tokens in 2025 Q1, up 5x from a year ago; Foundry now has industry-leading model fine-tuning tools; the latest models from AI heavyweights including OpenAI and Meta are available on Foundry; Microsoft’s Phi family of SLMs (small language models) now has over 38 million downloads (20 million downloads in 2024 Q4); Foundry will soon introduce an LLM (large language model) with 1 billion parameters that can run on just CPUs

Foundry is the agent and AI app factory. It’s now used by developers at over 70,000 enterprises and digital natives from Atomicwork to Epic, Fujitsu and Gainsight to H&R Block and LG Electronics to design, customize and manage their AI apps and agents. We processed over 100 trillion tokens this quarter, up 5x year-over-year, including a record 50 trillion tokens last month alone. And 4 months in, over 10,000 organizations have used our new agent service to build, deploy and scale their agents.

This quarter, we also made a new suite of fine-tuning tools available to customers with industry-leading reliability, and we brought the latest models from OpenAI along with new models from Cohere, DeepSeek, Meta, Mistral, Stability to Foundry. And we’ve expanded our Phi family of SLMs with new multimodal and mini models. All-up, Phi has been downloaded 38 million times. And our research teams are taking it one step further with BitNet b1.58, a billion parameter, large language model that can run on just CPUs coming to the Foundry.
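BitNet b1.58 takes its name from ternary weights: each weight is −1, 0, or +1, carrying log2(3) ≈ 1.58 bits of information, which is what lets inference run on CPUs without full floating-point matrix multiplies. Below is a toy sketch of absmean ternary quantization, on the assumption that the method roughly follows the published BitNet b1.58 recipe (scale by the mean absolute weight, then round-and-clip); it is not Microsoft’s actual code:

```python
import numpy as np

def ternary_quantize(weights):
    """Absmean ternary quantization: scale by the mean absolute weight,
    then round each scaled weight to -1, 0, or +1."""
    scale = np.mean(np.abs(weights)) + 1e-8  # epsilon guards against all-zero weights
    q = np.clip(np.round(weights / scale), -1, 1)
    return q.astype(np.int8), scale

# Each original weight is approximated by q * scale; the matrix multiply
# then reduces to additions, subtractions, and skips (for zeros).
w = np.array([0.9, -0.05, 0.4, -1.2])
q, s = ternary_quantize(w)
```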

With agent mode in VS Code, GitHub Copilot can now iterate on code, recognize errors, and fix them automatically; there are other GitHub Copilot agent modes that provide coding support to developers; Microsoft is previewing a first-of-its-kind SWE (software engineering) agent that can execute developer tasks; GitHub Copilot now has 15 million users, up 4x from a year ago; GitHub Copilot is used by a wide range of companies; VS Code has more than 50 million monthly active users

We’re evolving GitHub Copilot from pair programmer to peer programmer with agent mode in VS Code, Copilot can now iterate on code, recognize errors and fix them automatically. This adds to other Copilot agents like Autofix, which helps developers remediate vulnerabilities as well as code review agent, which has already reviewed over 8 million pull requests. And we are previewing a first-of-its-kind SWE-agent capable of asynchronously executing developer tasks. All-up, we now have over 15 million GitHub Copilot users, up over 4x year-over-year. And both digital natives like Twilio and enterprises like Cisco, HPE, Skyscanner and Target continue to choose GitHub Copilot to equip their developers with AI throughout the entire dev life cycle. With Visual Studio and VS Code, we have the world’s most popular editor with over 50 million monthly active users.

Microsoft 365 Copilot is now used by hundreds of thousands of customers, up 3x from a year ago; deal sizes for Microsoft 365 Copilot continue to grow; a record number of customers in 2025 Q1 returned to buy more seats for Microsoft 365 Copilot; new researcher and analyst deep reasoning agents can analyze vast amounts of web and enterprise data on-demand directly within Microsoft 365 Copilot; Microsoft is introducing agents for every role and business process; customers can build their own AI agents with no/low code with Copilot Studio and these agents can handle complex tasks, including taking action across desktop and web apps; 230,000 organisations, including 90% of the Fortune 500, have already used Copilot Studio; customers created more than 1 million custom agents across SharePoint and Copilot Studio, up 130% sequentially

Microsoft 365 Copilot is built to facilitate human-agent collaboration. Hundreds of thousands of customers across geographies and industries now use Copilot, up 3x year-over-year. Our overall deal size continues to grow. In this quarter, we saw a record number of customers returning to buy more seats. And we’re going further. Just last week, we announced a major update, bringing together agents, notebooks, search and create into a new scaffolding for work. Our new researcher and analyst deep reasoning agents analyze vast amounts of web and enterprise data to deliver highly skilled expertise on demand directly within Copilot…

…We are introducing agents for every role and business process. Our sales agent turns contacts into qualified leads and with sales chat reps can quickly get up to speed on new accounts. And our customer service agent is deflecting customer inquiries and helping service reps resolve issues faster.

With Copilot Studio, customers can extend Copilot and build their own agents with no code, low code. More than 230,000 organizations, including 90% of the Fortune 500 have already used Copilot Studio. With deep reasoning and agent flows in Copilot Studio, customers can build agents that perform more complex tasks and also handle deterministic scenarios like document processing and financial approvals. And they can now build Computer Use Agents that take action on the UI across desktop and web apps. And with just a click, they can turn any SharePoint site into an agent, too. This quarter alone, customers created over 1 million custom agents across SharePoint and Copilot Studio, up 130% quarter-over-quarter.

Azure grew revenue by 33% in 2025 Q1 (was 31% in 2024 Q4), with 16 points of growth from AI services (was 13 points in 2024 Q4); management brought capacity online for Azure AI services faster than expected; Azure’s non-AI business saw accelerated growth in its Enterprise customer segment as well as some improvement in its scale motions; management thinks the real outperformer within Azure in 2025 Q1 was the non-AI business; the strength in the AI business in 2025 Q1 came because Microsoft was able to match supply and demand somewhat, and also deliver supply early to some customers; management thinks it’s getting harder to separate an AI workload from a non-AI workload

In Azure and other cloud services, revenue grew 33% and 35% in constant currency, including 16 points from AI services. Focused execution drove non-AI services results, where we saw accelerated growth in our Enterprise customer segment as well as some improvement in our scale motions. And in Azure AI services, we brought capacity online faster than expected…

…The real outperformance in Azure this quarter was in our non-AI business. So then to talk about the AI business, really, what was better was precisely what we said. We talked about this. We knew in Q3 that we had really matched supply and demand pretty carefully and so didn’t expect to do much better than we had guided to on the AI side. We’ve been quite consistent on that. So the only real upside we saw on the AI side of the business was that we were able to deliver supply early to a number of customers…

…[Question] You mentioned that the upside on Azure came from the non-AI services this time around. I was wondering if you could just talk a little bit more about that.

[Answer] In general, we saw better-than-expected performance across our segments, but we saw acceleration in our largest customers. We call that the Enterprise segment in general. And then in what we talked about of our scale motions, where we had some challenges in Q2, things were a little better. And we still have some work to do in our scale motions, and we’re encouraged by our progress. We’re excited to stay focused on that as, of course, we work through the final quarter of our fiscal year…

…It’s getting harder and harder to separate what an AI workload is from a non-AI workload.

Around half of Microsoft’s cloud and AI-related capex in 2025 Q1 (FY2025 Q3) is for long-lived assets that will support monetisation over the next 15 years and more, while the other half is for CPUs and GPUs; management expects Microsoft’s capex in 2025 Q2 (FY2025 Q4) to increase sequentially, but the guidance for total capex for FY2025 H2 is unchanged from previous guidance (previously, the expectation was for capex in 2025 Q1 and 2025 Q2 to be at similar levels to 2024 Q4 (FY2025 Q2)); FY2026’s capex is still expected to grow at a lower rate than in FY2025; the mix of spend will shift toward short-lived assets in FY2026; demand for Azure’s AI services is growing faster than capacity is being brought online and management expects to have some AI capacity constraints beyond June 2025 (or FY2025 Q4); management’s goal with Microsoft’s data center investments is to be positioned for the workload growth of the future; management thinks pretraining plus test-time compute is a big change in terms of model-training workloads; Microsoft is short of power in fulfilling its data center growth plans; Microsoft’s data center builds have very long lead times; in Microsoft’s 2024 Q4 (FY2025 Q2) earnings call, management expected Azure to no longer be capacity-constrained by the end of 2025 Q2 (FY2025 Q4) but demand was stronger than expected in 2025 Q1 (FY2025 Q3); management still thinks they can get better and better capital efficiency from the cloud and AI capex; Azure’s margin on the AI business now is far better than the margin was when the cloud transition was at a similar stage

Roughly half of our cloud and AI-related spend was on long-lived assets that will support monetization over the next 15 years and beyond. The remaining cloud and AI spend was primarily for servers, both CPUs and GPUs, to serve customers based on demand signals, including our customer contracted backlog of $315 billion…

…We expect Q4 capital expenditures to increase on a sequential basis. H2 CapEx in total remains unchanged from our January H2 guidance. As a reminder, there can be quarterly spend variability from cloud infrastructure build-outs and the timing of delivery of finance leases…

…Our earlier comments on FY ’26 capital expenditures remain unchanged. We expect CapEx to grow. It will grow at a lower rate than FY ’25 and will include a greater mix of short-lived assets, which are more directly correlated to revenue than long-lived assets…

… In our AI services, while we continue to bring data center capacity online as planned, demand is growing a bit faster. Therefore, we now expect to have some AI capacity constraints beyond June…

…the key thing for us is to have our builds and lease be positioned for what is the workload growth of the future, right? So that’s what you have to goal-seek to. So there’s a demand part to it, there is the shape of the workload part to it, and there is a location part to it. So you don’t want to be upside down on having one big data center in one region when you have a global demand footprint. You don’t want to be upside down when the shape of demand changes because, after all, with essentially pretraining plus test-time compute, that’s a big change in terms of how you think about even what is training, right, forget inferencing…

…We will be short power. And so therefore — but it’s not a blanket statement. I need power in specific places so that we can either lease or build at the pace at which we want…

…From land to build to build-outs, lead times can be 5 to 7 years, or 2 to 3 years. So we’re constantly in a balancing position as we watch demand curves…

…I did talk about in my comments, we had hoped to be in balance by the end of Q4. We did see some increased demand as you saw through the quarter. So we are going to be a little short still, say, a little tight as we exit the year…

…[Question] You’ve said in the past that you can attain better and better capital efficiency with the cloud business and probably cloud and AI business. Where do you stand today?

[Answer] The way, of course, you’ve seen that historically is right when we went through the prior cloud transitions, you see CapEx accelerate, you build out data center footprint. You slowly fill GPU capacity. And over time, you see software efficiencies and hardware efficiencies build on themselves. And you saw that process for us for goodness now quite a long time. And what Satya’s talking about is how quickly that’s happening on the AI side of the business and you add to that model diversity. So think about the same levers plus model efficiency, and those compound. Now the one thing that’s a little different this time is just the pace. And so when you’re seeing that happen, pace in terms of the efficiency side, but also pace in terms of the build-out. So it can mask some of the progress… Our margins on the AI side of the business are far better at this point than they were when we went through the same server-to-cloud transition…

…I think the way to think about this is you can ask the question, what’s the difference between a hosting business and a hyperscale business? It’s software. That’s, I think, the gist of it. Yes, for sure, it’s a capital-intensive business, but capital efficiency comes from that system-wide software optimization. And that’s what makes the hyperscale business attractive and that’s what we want to just keep executing super well on.

Microsoft’s management sees Azure as Microsoft’s largest business; management thinks that the next platform shift in technology, which is AI, is built on the last major platform, which was for cloud computing, so this benefits Microsoft

There’s nothing certain for sure in the future, except for one thing, which is our largest business is our infrastructure business. And the good news here is the next big platform shift builds on that. So it’s not a complete rebuild, having gone through all these platform shifts where you have to come out on the other side with a full rebuild. If there is good news here is that we have a good business in Azure that continues to grow and the new platform depends on that.

It’s possible that software optimizations with AI model development and deployment could lead to even longer useful lives for GPUs, but management wants to observe this for longer

[Question] Could we start to consider the possibility that software enhancements might extend the useful life assumption that you’re using for GPUs?

[Answer] In terms of thinking about the depreciable life of an asset, we like to have a long history before we make any of those changes. So we’re focused on getting every bit of useful life we can, of course, out of assets. But to Satya’s point, that tends to be a software question more than a hardware one.
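The accounting question behind this exchange — how long to depreciate a GPU — can be made concrete with straight-line depreciation. The figures below are illustrative assumptions of mine, not Microsoft’s actual numbers; they just show why a longer useful life lowers the annual expense hitting the income statement:

```python
def annual_depreciation(cost, useful_life_years, salvage=0.0):
    """Straight-line depreciation: spread (cost - salvage) evenly over the life."""
    return (cost - salvage) / useful_life_years

# Illustrative figures only -- not Microsoft's actual numbers.
gpu_spend = 10_000_000_000                         # $10B of GPU servers
expense_4yr = annual_depreciation(gpu_spend, 4)    # $2.5B of expense per year
expense_6yr = annual_depreciation(gpu_spend, 6)    # ~$1.67B per year if life is extended
```

Extending the assumed life from 4 to 6 years in this toy example cuts the annual charge by a third, which is why management is cautious about making the change without a long history of evidence.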

Netflix (NASDAQ: NFLX)

Netflix’s content talent are already using AI tools to improve the content production process; management thinks AI tools can enable lower-budget projects to access top-grade VFX; Rodrigo Prieto is directing his first feature film with Netflix in 2025, Pedro Páramo, and he’s able to use AI tools for de-aging VFX at a much lower cost than The Irishman film that Prieto worked on 5 years ago; the entire budget for Pedro Páramo is similar to the cost of VFX alone for The Irishman; management’s focus with AI is to find ways for AI to improve the member and creator experience

So our talent today is using AI tools to do set references or previs, VFX sequence prep, shot planning, all kinds of things today that kind of make the process better. Traditionally, only big budget projects would have access to things like advanced visual effects such as de-aging. So today, you can use these AI-powered tools to enable smaller budget projects to have access to big VFX on screen.

A recent example, I think, is really exciting. Rodrigo Prieto was the DP on The Irishman just 5 years ago. And if you remember that movie, we were using very cutting edge, very expensive de-aging technology that still had massive limitations, still creating a bunch of complexity on set for the actors. It was a giant leap forward for sure, but nowhere near what we needed for that film. So this year, just 5 years later, Rodrigo is directing his first feature film for us, Pedro Páramo in Mexico. Using AI-powered tools he was able to deliver this de-aging VFX to the screen for a fraction of what it cost on The Irishman. In fact, the entire budget of the film was about the VFX cost on The Irishman…

…So our focus is simple, find ways for AI to improve the member and the creator experience.

Netflix’s management is building interactive search into Netflix which is based on generative AI

We’re also building out like new capabilities, an example would be interactive search. That’s based on generative technologies. We expect that will improve that aspect of discovery for members.

Paycom Software (NYSE: PAYC)

Paycom’s GONE is the industry’s first fully automated time-off solution, utilising AI, that automates all time-off requests; prior to GONE, 10% of an organisation’s labour cost was unmanaged; GONE can generate ROI of up to 800%, according to Forrester; GONE helped Paycom be named by Fast Company as one of the world’s most innovative companies

Our award-winning solution, GONE, is a perfect example of how Paycom simplifies tasks through automation and AI. GONE is the industry’s first fully automated time-off solution that decisions all time-off requests based on customizable guidelines set by the company’s time-off rules. Before GONE, 10% of an organization’s labor cost went substantially unmanaged, creating scheduling errors, increased cost from overpayments, staffing shortages and employee uncertainty over pending time-off requests. According to a Forrester study, GONE’s automation delivers an ROI of up to 800% for clients. GONE continues to receive recognition. Most recently, Fast Company magazine named Paycom one of the world’s most innovative companies for a second time. This honor specifically recognized GONE and is a testament to how Paycom is shaping our industry by setting new standards for automation across the globe.

PayPal (NASDAQ: PYPL)

PayPal’s management is leaning into agentic commerce; PayPal recently launched the payments industry’s first remote MCP (Model Context Protocol) server to enable AI agent frameworks to integrate with PayPal APIs; the introduction of the MCP allows any business to create an agentic commerce experience; all major AI players are involved with PayPal’s annual Developer Days to engage PayPal’s developer community

At Investor Day, I told you we were leaning into agentic commerce…

…Just a few weeks ago, we launched the industry’s first remote MCP server and enabled the leading AI agent frameworks to seamlessly integrate with PayPal APIs. Now any business can create agentic experience that allow customers to pay, track shipments, manage invoices and more, all powered by PayPal and all within an AI client. As we speak, developers are gathering in our San Jose headquarters for our annual Developer Days. Every major player in AI is represented, providing demos and engaging with our developer community.
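
MCP is an open, JSON-RPC-based protocol that lets AI agents discover and invoke a server’s tools. Without speculating on PayPal’s actual server, the shape of the idea, a registry of named tools exposed via `tools/list` and `tools/call` methods, can be sketched like this (the tool names and payloads are hypothetical):

```python
import json

# Hypothetical tool registry: an MCP-style server advertises named tools
# that an AI agent can discover and then invoke with structured arguments.
TOOLS = {
    "track_shipment": lambda args: {"status": "in transit", "id": args["shipment_id"]},
    "create_invoice": lambda args: {"invoice_id": "INV-1", "amount": args["amount"]},
}

def handle(request_json: str) -> str:
    """Dispatch a single MCP-style JSON request to the tool registry."""
    req = json.loads(request_json)
    if req["method"] == "tools/list":
        return json.dumps({"tools": sorted(TOOLS)})
    if req["method"] == "tools/call":
        result = TOOLS[req["params"]["name"]](req["params"]["arguments"])
        return json.dumps({"result": result})
    return json.dumps({"error": "unknown method"})

print(handle('{"method": "tools/list"}'))
# {"tools": ["create_invoice", "track_shipment"]}
```

In practice the agent framework speaks this protocol over a transport to the remote server, so “any business can create an agentic experience” reduces to publishing the right set of tools.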

Shopify (NASDAQ: SHOP)

Shopify’s management recently launched TariffGuide.ai, an AI-powered tool that provides duty rates based on just a product description and the country of origin, helping merchants source the right products in minutes

And just this past week, we launched TariffGuide.ai. This AI-driven tool provides duty rates based on just a product description and the country of origin. Sourcing the right products from the right country can mean the difference between a 0% and a 15% duty rate or higher, and TariffGuide.ai allows merchants to do this in minutes, not days.
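
TariffGuide.ai’s internals aren’t disclosed; at its simplest, the lookup it automates maps a product description and a country of origin to a duty rate. A toy keyword-matching sketch, with entirely hypothetical rates and keywords:

```python
# Hypothetical duty-rate table keyed by (product keyword, origin country code).
DUTY_RATES = {
    ("sneaker", "VN"): 0.00,
    ("sneaker", "CN"): 0.15,
    ("furniture", "CN"): 0.25,
}

def duty_rate(description: str, origin: str, default: float = 0.10) -> float:
    """Return the first matching rate for a keyword found in the description."""
    text = description.lower()
    for (keyword, country), rate in DUTY_RATES.items():
        if keyword in text and country == origin:
            return rate
    return default  # fallback when no rule matches

print(duty_rate("canvas sneaker, rubber sole", "VN"))  # 0.0
print(duty_rate("canvas sneaker, rubber sole", "CN"))  # 0.15
```

The real product presumably uses an AI model to classify free-text descriptions into tariff codes, but the economic point is the same: identical products can carry a 0% or a 15%+ rate depending on origin.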

Shopify CEO Tobi Lutke recently penned a memo on his vision of how Shopify should be working with AI; AI is becoming second nature to how Shopify’s employees work, with employees using AI reflexively; before any team requests additional headcount, it must first assess whether AI can meet its goals; Shopify has built a dozen MCP (model context protocol) servers in the last few weeks to enable anyone in Shopify to ask questions and find resources more easily; management sees AI as a cornerstone of how Shopify delivers value; management is investing more in AI, but the increased investment is not a driver of the lower gross margin in Shopify’s Subscription Solutions segment in 2025 Q1; management does not expect the Subscription Solutions segment’s gross margin to change much in the near term; Shopify has shown strong operating leverage partly because of its growing internal use of AI

AI is at the core of how we operate and is transforming our work processes. For those who have not seen it, I encourage you to check out Toby’s recent company wide email on AI that has now been shared publicly. At Shopify, we take AI seriously. In fact, it’s becoming second nature to how we work. By fostering a culture of reflexive AI usage, our teams default to using AI first, reflexive being the key term here. This also means that before requesting additional headcount or resources, teams are required to start with assessing how they can meet their goals using AI first. This approach is sparking some really fascinating explorations and discussions around the company, challenging the way we think, the way we operate, and pushing us to look ahead as we redefine our decision making processes. In the past couple of weeks, we built a dozen MCP servers that make Shopify’s work legible and accessible. And now anyone within Shopify can ask questions, find resources, and leverage those tools for greater efficiency. This reflexive use of AI goes well beyond internal improvements. It supercharges our team’s capabilities and drives operational efficiencies, keeping us agile. And as we continue to innovate, AI will remain a cornerstone of how we deliver value across the board…

…Gross profit for Subscription Solutions grew 19%, slightly less than the 21% revenue growth for Subscription Solutions. The lower rate was driven primarily by higher cloud and infrastructure hosting costs needed to support higher volumes and geographic expansion. Although we are investing more in AI, it is not a significant factor in this increase. Over the past 5 years, the gross margin for Subscription Solutions has centered around 80%, plus or minus a couple of hundred basis points in any given quarter, and we do not anticipate that trend changing in the near term…

…Our continued discipline on head count across all 3 of R&D, sales and marketing and G&A continued to yield strong operating leverage, all while helping us move even faster on product development aided by our increasing use of AI.

Shopify’s management rearchitected the AI engine of Sidekick, Shopify’s AI merchant assistant, in 2025 Q1; monthly average users of Sidekick have more than doubled since the start of 2025; early results of Sidekick are really strong for both large and small merchants

In Q1, key developments for Sidekick included a complete rearchitecture of the AI engine for deeper reasoning capabilities, enhancing processing of larger business datasets and accessibility in all supported languages, allowing every Shopify merchant to use Sidekick in their preferred language. And these changes, well, they’re working. In fact, our monthly average users of Sidekick continue to climb more than doubling since the start of 2025. Now this is still really early days, but the progress we are making is already yielding some really strong results for merchants, both large and small. 

Shopify acquired Vantage Discovery in 2025 Q1; Vantage Discovery works on AI-powered, multi-vector search; management thinks the acquisition will improve the overall consumer search experience delivered by Shopify’s merchants

In March, we closed the acquisition of Vantage Discovery, which helps accelerate the development of AI-powered, multi-vector search across our search, APIs, shop and storefront search offerings. This acquisition is one piece of a broader strategy to ensure that our merchants are able to continue meeting buyers regardless of where they’re shopping or discovering great products…

…The Vantage team coming in who are rock stars in AI are going to help take our search abilities to the next level.
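
“Multi-vector” search typically scores a query against several embeddings per product (say, title and image) instead of a single vector. A minimal sketch with made-up 3-dimensional vectors and hypothetical field weights:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Each product carries several vectors; hypothetical weights blend the fields.
PRODUCTS = {
    "running shoe": {"title": [0.9, 0.1, 0.0], "image": [0.7, 0.2, 0.1]},
    "office chair": {"title": [0.0, 0.9, 0.1], "image": [0.1, 0.8, 0.2]},
}
WEIGHTS = {"title": 0.6, "image": 0.4}

def score(query_vec, product):
    return sum(w * cosine(query_vec, product[field]) for field, w in WEIGHTS.items())

query = [1.0, 0.0, 0.0]  # stands in for an embedded query like "sneakers"
best = max(PRODUCTS, key=lambda name: score(query, PRODUCTS[name]))
print(best)  # running shoe
```

Production systems embed queries and products with learned models and use approximate nearest-neighbour indexes, but the blending of several vectors per item is the “multi-vector” part.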

Shopify’s management is seeing more and more commerce searches starting away from a search engine; Shopify is already working with AI chatbot providers on AI shopping; management thinks that AI shopping is a huge opportunity; management thinks AI agents will be a great opportunity for Shopify too

One of the things we think about is that wherever commerce is taking place, Shopify will be there. And obviously, one of the things we are seeing is that more and more searches are starting on places beyond just somebody’s search engine. That’s a huge opportunity whereby more consumers are going to be searching for great products…

…We’ve talked about some of the partnerships in the past. You’ve seen what we’ve done with Perplexity and OpenAI. We will continue doing that. We’re not going to front run our product road map when it comes to anything, frankly. But we do think though that AI shopping, in particular, is a huge opportunity…

…[Question] How does Shopify view the emergence of AI agents in terms of do you guys see this as an opportunity or more of a threat because, on one hand, they could facilitate direct checkout with their own platforms. On the other hand, this may also unlock some new sales channel for Shopify merchants, very similar to sort of what happened with social media commerce

[Answer] We think it’s a great opportunity. Look, the more channels that exist in the world, the more complexity it is for merchants and brands, that’s where the value of Shopify really shines. So if there’s a new surface area, whether it’s through AI agents or through just simply LLMs and AI wrappers, that consumer goes to, to look for a new pair of sneakers or a new cosmetic or a piece of furniture, they want to have access to the most interesting products for the most important brands, and those are all on Shopify. So for us, we think that all of these new areas where commerce is happening is a great thing. It allows Shopify to increase its value.

Taiwan Semiconductor Manufacturing Company (NYSE: TSM)

TSMC’s management continues to expect AI accelerator revenue to double in 2025; management has factored the US ban on AI chip sales to China into TSMC’s 2025 outlook; AI-related demand outside of China appears to have become even stronger over the last 3 months

We reaffirm our revenue from AI accelerators to double in 2025. The AI accelerators we define as AI GPU, AI ASIC and HBM controllers for AI training and inference in the data center. Based on our customers’ strong demand, we are also working hard to double our CoWoS capacity in 2025 to support their needs…

…[Question] The geopolitical risk, micro concerns is one of the major uncertainty nowadays. Last 2 days, we have like H20 being banned in China, blah, blah, blah. So how does that impact to TSMC’s focus and production planning, right? Do we have enough other customers and demand to keep our advanced node capacity fully utilized? Or how does that change our long-term production planning moving forward?

[Answer] Of course, we do not comment on specific customers or product, but let me assure you that we have taken this into consideration when providing our full year’s growth outlook. Did I answer the question?…

…[Question] AI is still expected to double this year despite the U.S. ban on AI GPUs into China. And I guess, China was a meaningful portion of accelerated shipments well over 10% of volumes. So factoring this in, it would imply your AI outlook this year, still doubling would mean that the AI orders have improved meaningfully outside of China in the last sort of 3 months. Is that how we should interpret your comment about you still expect the business to double?

[Answer] 3 months ago, we are — we just cannot supply enough wafer to our customer. And now it’s a little bit balanced, but still, the demand is very strong. And you are right, other than China, the demand is still very strong, especially in U.S.

TSMC’s management has a disciplined approach when building capacity and management recognises how important the discipline is given the high forecasted demand for AI-related chips

At TSMC, higher level of capital expenditures is always correlated with higher growth opportunities in the following years. We reiterate our 2025 capital budget is expected to be between USD 38 billion and USD 42 billion as we continue to invest to support customers’ growth. About 70% of the capital budget will be allocated for advanced process technologies. About 10% to 20% will be spent for specialty technologies and about 10% to 20% will be spent for advanced packaging, testing, mass-making and others. Our 2025 CapEx also includes a small amount related to our recently announced additional $100 billion investment plan to expand our capacity in Arizona…

…To address the structural increase in the long-term market demand profile, TSMC employed a disciplined and robust capacity planning system. This is especially important when we have such high forecasted demand from AI-related business. Externally, we work closely with our customers and our customers’ customers to plan our capacity. Internally, our planning system involves multiple teams across several functions to assess and evaluate the market demand from both a top-down and bottom-up approach to determine the appropriate capacity build.

TSMC’s management expects the Foundry 2.0 industry to grow 10% year-on-year in 2025, driven by AI-related demand and mild recovery in other end markets; management expects TSMC to outperform the Foundry 2.0 industry in 2025

Looking at the full year of 2025, we expect Foundry 2.0 industry growth to be supported by robust AI-related demand and a mild recovery in other end market segment. In January, we had forecasted a Foundry 2.0 industry to grow 10 points year-over-year in 2025, which is consistent with IDC’s forecast of 11% year-over-year growth for Foundry 2.0…

…We are confident TSMC can continue to outperform the Foundry 2.0 industry growth in 2025.

TSMC’s management thinks the impact from recent AI models, including DeepSeek, will lower the barrier to future long-term AI development; TSMC’s management continues to expect a mid-40% revenue CAGR from AI accelerators in the 5 years starting from 2024

Recent developments are also positive to AI’s long-term demand outlook. In our assessment, the impact from AI recent models, including DeepSeek, will drive greater efficiency and help lower the barrier to future AI development. This will lead to wider usage and greater adoption of AI models, which all require use of leading-edge silicon. These developments only serve to strengthen our conviction in the long-term growth opportunities from the industry megatrend of 5G, AI and HPC…

…Based on our planning framework, we are confident that our revenue growth from AI accelerators will approach a mid-40s percentage CAGR for the next 5 years period starting from 2024.
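
A mid-40s percentage CAGR compounds quickly; at 45% a year, AI accelerator revenue would be roughly 6.4 times its 2024 base by 2029:

```python
# Compound a mid-40s CAGR over the 5-year window starting from 2024.
cagr = 0.45
years = 5
multiple = (1 + cagr) ** years
print(round(multiple, 1))  # 6.4 -> ~6.4x the 2024 base by 2029
```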

The construction of TSMC’s 2nd fab in Arizona, which will utilise N3 process technology, is already complete; management wants to speed up the fab’s volume production schedule to meet AI-related demand

Our first fab in Arizona has already successfully entered high-volume production in 4Q ’24, utilizing N4 process technology with a yield comparable to our fab in Taiwan. The construction of our second fab, which will utilize the 3-nanometer process technology, is already complete and we are working on speeding up the volume production schedule based on the strong AI-related demand from our customers. Our third and fourth fab will utilize N2 and A16 process technologies and with the expectation of receiving all the necessary permits are scheduled to begin construction later this year. Our fifth and sixth fab will use even more advanced technologies. The construction and ramp schedule for this fab will be based on our customers’ demand.

TSMC’s management believes its A16 technology has a best-in-class backside power delivery solution that is also the first in the industry; A16 is best suited for specific HPC (high-performance computing) products, which means it is best suited for AI-related workloads; A16 is scheduled for volume production in 2026 H2

We also introduced A16 featuring Super Power Rail, or SPR, as a separate offering. Compared with N2P, A16 provides a further 8% to 10% speed improvement at the same power, or 15% to 20% power improvement at the same speed, and an additional 7% to 10% chip density gain. A16 is best suited for specific HPC products with complex signal routes and dense power delivery networks. Volume production is scheduled for second half 2026.

Tesla (NASDAQ: TSLA)

Tesla’s management continues to expect fully autonomous Tesla rides in Austin, Texas in June 2025; management will sell full autonomy software for Model Y in Austin; management now demarcates CyberCab as a separate product, and all of the other models (S, 3, X, Y) that are compatible with autonomous software as being robotaxis; management reiterates that once Tesla can solve for autonomy in 1 city, it can very quickly scale because Tesla’s autonomous solution is a general solution, not a city-specific solution; Tesla’s autonomous solution involves AI and a specific Tesla-designed AI chip, as opposed to expensive sensors and high-precision maps; the fully autonomous Teslas in June 2025 in Austin will be Model Ys; management expects full autonomy in Tesla’s fleet to ramp up very quickly; management is confident that Tesla will have large-scale autonomy by 2026 H2, meaning millions of fully autonomous Tesla vehicles by 2026 H2; even with the introduction of full autonomy, management thinks there will be some localised parameters – effectively a mixture of experts model – set for safety; management thinks Tesla’s autonomous solution can scale well because when the FSD (Full Self Driving) software was deployed in China, it used very minimal China-specific data and yet could work well in China; validation of Tesla’s autonomous solution will be important in determining its rate of acceptance; there are now convoys of Teslas in Austin running autonomously in testing in order to compress Tesla’s AI’s learning curve; a consumer in China used FSD on a narrow mountain dirt road; management expects FSD unsupervised to be available for personal use by end of 2025; Musk thinks the first Model Y to drive itself from factory to customer will happen later in 2025; newly-manufactured Model Ys are already driving themselves around in Tesla factories

We expect to have — be selling fully autonomous rides in June in Austin as we’ve been saying for now several months. So that’s continued…

…Unsupervised autonomy will first be sold for the Model Y in Austin, and then actually, should parse out the term for robotic taxi or robotaxi and just generally like what’s the Cybercab because we’ve got a product called the Cybercab. And then any Tesla, which could be an S, 3, X or Y that is autonomous is a robotic taxi or a robotaxi. It’s very confusing. So the vast majority of the Tesla fleet that we’ve made is capable of being a robotaxi or a robotic taxi…

…Once we can make the system work where you can have paid rides, fully autonomously with no one in the car in 1 city, that is a very scalable thing for us to go broadly within whatever jurisdiction allows us to operate. So because what we’re solving for is a general solution to autonomy, not a city-specific solution for autonomy, once we make it work in a few cities, we can basically make it work in all cities in that legal jurisdiction. So if it’s — once we can make the pace to work in a few cities in America, we can make it work anywhere in America. Once we can make it work in a few cities in China, we can make it work anywhere in China, likewise in Europe, limited only by regulatory approvals. So this is the advantage of having a generalized solution using artificial intelligence and an AI chip that Tesla designed specifically for this purpose, as opposed to very expensive sensors and high-precision maps on a particular neighborhood where that neighborhood may change or often changes and then the car stops working. So we have a general solution instead of a specific solution…

…The Teslas that will be fully autonomous in June in Austin are fully Model Ys. So that is — it’s currently on track to be able to do paid rides fully autonomously in Austin in June, and then to be in many other cities in the U.S. by the end of this year.

It’s difficult to predict the exact ramp sort of week by week and month by month, except that it will ramp up very quickly. So it’s going to be like some — basically an S-curve where it’s very difficult to predict the intermediate slope of the S-curve, but you kind of know where the S-curve is going to end up, which is the vast majority of the Tesla fleet being autonomous. So that’s why I feel confident in predicting large-scale autonomy around the middle of next year, certainly the second half next year, meaning I bet that there will be millions of Teslas operating autonomously, fully autonomously in the second half of next year, yes…

…It does seem increasingly likely that there will be a localized parameter set sort of — especially for places that have, say, a very snowy weather, like I say, if you’re in the Northeast or something like this — you can think of — it’s kind of like a human. Like you can be a very good driver in California but are you going to be also a good driver in a blizzard in Manhattan? You’re not going to be as good. So there is actually some value in — you can still drive but your probability of an accident is higher. So the — it’s increasingly obvious that there’s some value to having a localized set of parameters for different regions and localities…

…You can see that from our deployment of FSD supervised in China with this very minimal data that’s China-specific, the model is generalized quite well to completely different driving styles. That just like shows that the AI-based solution that we have is the right one because if you had gone down the previous rule-based solutions, sort of like more hard-coded HD map-based solutions, it would have taken like many, many years to get China to work. You can see those in the videos that people post online themselves. So the generalized solution that we are pursuing is the right one that’s going to scale well…

…You can think of this like location-specific parameters that Elon alluded to as a mixture of experts. And if you are sort of familiar with the AI models, Grok and others, they all use this mixture of experts to sort of specialize the parameters to specific tasks while still being general…
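
The location-specific-parameters idea maps naturally onto mixture-of-experts routing: a gate selects a small region-specific parameter set while the general model is shared. A toy sketch, with hypothetical regions and driving parameters:

```python
# Shared base behaviour plus small region-specific overrides, selected by a
# simple gate -- a toy analogue of mixture-of-experts routing.
BASE_PARAMS = {"follow_distance_s": 2.0, "max_speed_kmh": 120}
EXPERTS = {
    "snowy_northeast": {"follow_distance_s": 3.5, "max_speed_kmh": 80},
    "california": {},  # no overrides: the base parameters suffice
}

def params_for(region: str) -> dict:
    """Gate: merge the region's expert overrides onto the shared base."""
    merged = dict(BASE_PARAMS)
    merged.update(EXPERTS.get(region, {}))
    return merged

print(params_for("snowy_northeast")["max_speed_kmh"])  # 80
print(params_for("california")["max_speed_kmh"])       # 120
```

In a real mixture-of-experts model the gate is learned and the “experts” are subnetworks rather than literal config overrides, but the structure is the same: one general system, specialised per context.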

…What are the critical things that need to get right, one thing I would like to note is validation. Self-driving is a long-tail problem where there can be a lot of edge cases that only happen very, very rarely. Currently, we are driving around in Austin using our QA fleet, but then super [ rare ] to get interventions that are critical for robotaxi operation. And so you can go many days without getting a single intervention. So you can’t easily know whether you are improving or regressing in your capacity. And we need to build out sophisticated simulations, including neural network-based video generation…

…There’s just always a convoy of Teslas going — just going all over to Austin in circles. But yes, I just can’t emphasize this enough. In order to get a figure on the long-tail things, it’s 1 in 10,000, that says 1 in 20,000 miles or 1 in 30,000. The average person drives 10,000 miles in a year. So not trying to compress that test cycle into a matter of a few months. It means you need a lot of cars doing a lot of driving in order to compress that to do in a matter of a month what would normally take someone a year…
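
Musk’s test-compression point is simple arithmetic: if a critical event occurs roughly once in 10,000 miles and one car covers about 10,000 miles a year, then observing a statistically useful number of events within a month requires a proportionally larger fleet (the 100-event target below is a hypothetical choice):

```python
# Fleet size needed to compress a year's worth of rare-event exposure into a month.
miles_per_event = 10_000          # one critical event per ~10,000 miles
miles_per_car_per_year = 10_000   # average annual mileage per car
target_events = 100               # hypothetical statistical target
months = 1

miles_needed = target_events * miles_per_event
miles_per_car = miles_per_car_per_year * months / 12
cars_needed = miles_needed / miles_per_car
print(round(cars_needed))  # 1200 cars to see ~100 events in a month
```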

…I saw one guy take a Tesla on — autonomously on a narrow dirt road across like a mountain. And I’m like, still a very brave person. And I said this driving along the road with no barriers where he makes a mistake, he’s going to plunge to his doom. But it worked…

…[Question] when will FSD unsupervised be available for personal use on personally-owned cars?

[Answer] Before the end of this year… the acid test being you should — can you go to sleep in your car and wait until your destination? And I’m confident that will be available in many cities in the U.S. by the end of this year…

…I’m confident also that later this year, the first Model Y will drive itself all the way to the customer. So from our — probably from a factory in Austin and our one in here in Fremont, California, I’m confident that from both factories, we’ll be able to drive directly to a customer from the factory…

We have — it has been put to use — it’s doing useful work fully autonomously at the factories, as Ashok was mentioning, the cars drive themselves from end of line to where they’re supposed to be picked up by a truck to be taken to the customer… It’s important to note in the factories, we don’t have dedicated lanes or anything. People are coming out every day, trucks delivering supplies, parts, construction.

Tesla’s management expects thousands of Optimus robots to be working in Tesla factories by end-2025; management expects Optimus to be the fastest product in history to get to millions of units per year; management thinks Tesla can get to 1 million units annually in 4-5 years; management expects to make thousands of Optimus robots at the end of this year; there’s no existing supply chain for all of Optimus’s components, so Tesla has to build a supply chain from scratch; the speed of manufacturing of a product is governed by the speed of the slowest item in the supply chain, but in Optimus’s case, there are many, many such items since it’s so new; Optimus production is currently rate-limited by restrictions on rare-earth magnets from China but management is working on it; management still has no idea what Optimus’s supply chain will look like at maturity

Making good progress in Optimus. We expect to have thousands of Optimus robots working in Tesla factories by the end of this year beginning this fall. And we expect to see Optimus faster than any product, I think, in history to get to millions of units per year as soon as possible. I think we feel confident in getting to 1 million units per year in less than 5 years, maybe 4 years. So by 2030, I feel confident in predicting 1 million Optimus units per year. It might be 2029…

…This year, we’ll make a few — we do expect to make thousands of Optimus robots, but most of that production is going to be at the end of the year…

…Almost everything in Optimus is new. There’s not like an existing supply chain for the motors, gearboxes, electronics, actuators, really anything in the Optimus apart from the AI for Tesla, the Tesla AI computer, which is the same as the one in the car. So when you have a new complex manufactured product, it will move as fast as the slowest and the least lucky component in the entire thing. And as a first order approximation, there’s like 10,000 unique things. So that’s why anyone who tells you they can predict with precision, the production ramp of the truly new product is — doesn’t know what they’re talking about. It is literally impossible…

…Now Optimus was affected by the magnet issue from China because the Optimus actuators in the arm to use permanent magnet. Now Tesla, as a whole, does not need to use permanent magnets. But when something is volume constrained like an arm of the robot, then you want to try to make the motor as small as possible. And then — so we did design in permanent magnets for those motors and those were affected by the supply chain by basically China requiring an export license to send out any rare earth magnets. So we’re working through that with China. Hopefully, we’ll get a license to use the rare earth magnets. China wants some assurances that these are not used for military purposes, which obviously they’re not. They’re just going into a humanoid robot. So — and it’s a nonweapon system…

…[Question] Wanted to ask about the Optimus supply chain going forward. You mentioned a very fast ramp-up. What do you envision that supply chain looking like? Is it going to require many more suppliers to be in the U.S. now because of the tariffs?

[Answer] We’ll have to see how things settle out. I don’t know yet. I mean some things we’re doing, as we’ve already talked about, which is that we’ve already taken tremendous steps to localize our supply chain. We’re more localized than any other manufacturer. And we have a lot of things kind of underway that to increase the localization to reduce supply chain risk associated with geopolitical uncertainty.

Tesla’s supervised FSD (full-self driving) software is safer than a human driver; management has been using social media (X, or Twitter) to encourage people to try out Tesla’s FSD software; management did not directly answer a question on FSD pricing once the vehicle can be totally unsupervised

Not only is FSD supervised safer than a human driver, but it is also improving the lifestyle of individuals who experience it. And again, this is something you have to experience and anybody who has experienced just knows it. And we’ve been doing a lot lately to try and get those stories out, at least on X, so that people can see how other people have benefited from this…

…[Question] Can we envision when you launch unsupervised FSD that there could be sort of a multitiered pricing approach to unsupervised versus supervised similar to what you did with autopilot versus FSD in the past?

[Answer] I mean this is something which we’ve been thinking about. I mean just so now for people who have been trying FSD and who’ve been using FSD, they think given the current pricing is too cheap because for $99, basically getting a personal shop… I mean we do need to give people more time to — if they want to look at — like a key breakpoint is, can you read your text messages or not? Can you write a text message or not? Because obviously, people are doing this, by the way, with unautonomous cars all the time. And if you just go over and drive down the highway and you’ll see people texting while driving doing 80-mile an hour… So that value — it will really be profound when you can basically do whatever you want, including sleep. And then that $99 is going to seem like the best $99 you ever spent in your life.

Tesla’s management thinks Waymo vehicles are too expensive compared to Teslas; Waymo has expensive sensor suites; management thinks Tesla will have the lion’s share of the robotaxi market; a big difference between Tesla and Waymo is that Tesla also manufactures its own cars whereas Waymo retrofits cars from other parties; management thinks Tesla’s vision-only approach will not have issues with cameras becoming blinded by glare because the system uses direct photon counting and bypasses image signal processing

The issue with Waymo’s cars is it costs way more money, but that is the issue. The car is very expensive, made in low-volume. Teslas are probably cost 1/4, 20% of what a Waymo costs and made in very high volume. Ironically, like we’re the ones who made the bet that a pure AI solution with cameras and [ already ] what the car actually will listen for sirens and that kind of thing. It’s the right move. And Waymo decided that an expensive sensor suite is the way to go, even though Google is very good at AI. So I’m wondering…

….As far as I’m aware, Tesla will have, I don’t know, 99% market share or something ridiculous…

…The other thing which people forget is that we’re not just developing the software solution, we are also manufacturing the cars. And like you know what like Waymo has, they’re taking cars and then trying to…

…[Question] You’re still sticking with the vision-only approach. A lot of autonomous people still have a lot of concerns about sun glare, fog and dust. Any color on how you anticipate on getting around those issues? Because my understanding, it kind of blinds the camera when you get glare and stuff.

[Answer] Actually, it does not blind the camera. We use an approach which is a direct photon count. So when you see a processed image, so the image that goes from the — with sort of photon counter, the silicon photon counter, that they get — goes through a digital signal processor or image signal processor. That’s normally what happens. And then the image that you see looks all washed out because if it’s — you pointed a camera at the sun, the post-processing of the photon counting washes things out. It actually adds noise. So quite a big breakthrough that we made some time ago was to go with direct photon counting and bypass the image signal processor. And then you can drive pretty much straight at the sun, and you can also see in what appears to be the blackest of night. And then here in fog, we can see as well as people can, probably better, but in fact probably slightly better than people than the average person anyway.
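
The glare argument is essentially about dynamic range: a naive ISP-style 8-bit tone mapping clips very bright regions to the same value, while raw photon counts keep them distinguishable. A toy illustration (the gain and counts are made up):

```python
def tone_map_8bit(count: int, gain: float = 1 / 128) -> int:
    """Naive ISP-style curve: after gain, very bright photon counts all clip to 255."""
    return min(255, int(count * gain))

# Hypothetical raw photon counts for two very bright regions of a scene.
sun_count, headlight_count = 65535, 60000

print(tone_map_8bit(sun_count), tone_map_8bit(headlight_count))  # 255 255
print(sun_count - headlight_count)  # raw counts still separate them by 5535
```

A perception network fed the raw counts can still distinguish structure inside the bright region that an 8-bit post-processed image has thrown away, which is the gist of the “direct photon counting” answer above.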

Tesla’s AI software team and chip-design team was built from scratch with no acquisitions; management thinks Tesla’s team is the best

It is worth noting that Tesla has built an incredible AI software team and AI hardware chip design team from scratch, didn’t acquire anyone. We just built it. So yes, it’s really — I mean I don’t see anyone being able to compete with Tesla at present.

Tesla’s management thinks China is ahead of the USA in physical AI with respect to autonomous drones because China can manufacture autonomous drones while the USA cannot; management thinks Tesla is ahead of any company in the world, even Chinese companies, in terms of humanoid autonomous robots

[Question] Between China and United States, who, in your opinion, is further ahead on the development of physical AI, specifically on humanoid and also drones?

[Answer] A friend of mine posted on X, I reposted it. I think of a prophetic statement, which is any country that cannot manufacture its own drones is going to be the vassal state of any country that can. And we can’t — America cannot currently manufacture its own drones. Let that sink in, unfortunately. So China, I believe manufactures about 70% of all drones. And if you look at the total supply chain, China is almost 100% of drones are — have a supply chain dependency on China. So China is in a very strong position. And here in America, we need to tip more of our people and resources to manufacturing because this is — and I have a lot of respect for China because I think China is amazing, actually. But the United States does have such a severe dependency on China for drones and be unable to make them unless China gives us the parts, which is currently the situation.

With respect to humanoid robots, I don’t think there’s any company or any country that can match us. Tesla and SpaceX are #1. And then I’m a little concerned that on the leaderboard, ranks 2 through 10 will be Chinese companies. I’m confident that rank 1 will be Tesla.

The Trade Desk (NASDAQ: TTD)

Trade Desk’s industry-leading Koa AI tools are embedded across Kokai; adoption of Kokai is now ahead of schedule, with 2/3 of clients using it; the bulk of spending on Trade Desk now takes place on Kokai; management continues to expect all Trade Desk clients to be using Kokai by end-2025; management is confident that Kokai will be seen as the most powerful buying platform by the industry by end-2025

The injection of our industry-leading Koa AI tools across every aspect of our platform has been a game changer, and we are just getting started…

…The core of Kokai has been delivered and adoption is now ahead of schedule. Around 2/3 of our clients are now using it and the bulk of the spend in our platform is now running through Kokai. We expect all clients to be using it by the end of the year…

…I’m confident that by the end of this year, we will reflect on Kokai as the most powerful buying platform the industry has ever seen, precisely because it combines client needs with the strong point of view on where value is shifting and how to deliver the most efficient return on ad spend.

…Kokai adoption now represents the majority of our spend, almost 2/3, a significant acceleration from where we ended 2024.

Deutsche Telekom used Kokai’s AI tools and saw an 11x improvement in post-click conversions and an 18x improvement in the cost of conversions; Deutsche Telekom is now planning to use Kokai across more campaigns and transition from Trade Desk’s previous platform, Solimar, like many other Trade Desk clients

Deutsche Telekom. They’re running the streaming TV service called Magenta TV, and they use our platform to try to grow their subscriber base…

…Using seed data from their existing customers, Deutsche Telekom was able to use the advanced AI tools in our Kokai platform to find new customers and identify the right ad impressions across display and CTV most relevant to retaining those new customers successfully, and the results were very impressive. They saw an 11x improvement in post-click conversions attributed to advertising and an 18x improvement in the cost of those conversions. Deutsche Telekom is now planning to use Kokai across more campaigns, a transition that is fairly typical as clients move from our previous platform, Solimar, to our newer, more advanced AI-fueled platform, Kokai.

Visa (NASDAQ: V)

Visa recently announced a new version of its Authorize.net product featuring AI capabilities, including an AI agent; Authorize.net enables all different types of payments.

In Acceptance Solutions, we recently announced 2 new product offerings. The first is a completely new version of Authorize.net, launching in the U.S. next quarter and additional countries next year. It features a streamlined user interface; AI capabilities with an AI agent, Anet; improved dashboards for day-to-day management; and support for in-person card readers and Tap to Phone. It will help businesses analyze data, summarize insights and adapt to rapidly changing customer trends…

…I talked about the Authorize.net platform that we’ve relaunched and are relaunching. That’s a great example of enabling all different types of payments. And that’s going to have, we think, a really positive impact in the market, specifically focused on growing our share in small business checkout.

Visa has an enhanced holistic fraud protection solution known as the Adaptive Real-time Individual Change identification (ARIC) Risk Hub; ARIC Risk Hub uses AI to build more accurate risk profiles.

We also now provide an enhanced holistic fraud protection solution from Featurespace called the Adaptive Real-time Individual Change identification, or ARIC, Risk Hub. This solution utilizes machine learning and AI solutions to enable clients to build more accurate risk profiles and more confidently detect and block fraudulent transactions, ultimately helping to increase approvals and stop bad actors in real time. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Alphabet, Amazon, Apple, ASML, Coupang, Datadog, Mastercard, Meta Platforms, Microsoft, Netflix, Paycom Software, PayPal, Shopify, TSMC, Tesla, The Trade Desk and Visa. Holdings are subject to change at any time.

What We’re Reading (Week Ending 11 May 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 11 May 2025:

1. AGI is not a milestone – Sayash Kapoor and Arvind Narayanan

Many people have the intuition that AGI will have these properties. It will be so powerful and humanlike that it will be obvious when we’ve built it. And it will immediately bring massive benefits and risks — automation of a big swath of the economy, a great acceleration of innovation, including AI research itself, and potentially catastrophic consequences for humanity from uncontrollable superintelligence.

In this essay, we argue that AGI will be exactly the opposite — it is unobservable because there is no clear capability threshold that has particular significance; it will have no immediate impact on the world; and even a long-term transformation of the economy is uncertain…

…One argument for treating AGI as a milestone — and taking declarations of AGI seriously — is that AGI could lead to rapid economic impacts, both positive and negative, such as a world without scarcity, an end to the concept of money, or sudden mass joblessness.

But AI’s economic impact is only realized when it is adopted across the economy. Technical advances are necessary, but not sufficient, to realize this impact. For past general-purpose technologies, such as electricity, computing, and the internet, it took decades for the underlying technical advances to diffuse across society. The miracle of the Industrial Revolution wasn’t the high growth rate — annual growth rates averaged below 3% — but the sustained period of decades of growth.

There are many bottlenecks to the diffusion of AI: developing useful products and applications, training the workforce to utilize these products, implementing organizational changes to enable AI use, and establishing laws and norms that facilitate AI adoption by companies. Like past general-purpose technologies, we expect the economic impacts of AI to be realized over decades, as this process of diffusion unfolds…

…The US and China are often described as being in an AI arms race, with each country racing to build AGI. It is hypothesized that the country to build it first would have a decisive strategic advantage — resulting in dominance in the world order for the foreseeable future.

This narrative doesn’t make sense because the knowledge required to create AI models, and model capabilities themselves, tend to proliferate quickly between countries. There are hundreds of thousands of AI technologists, and they work in the private sector rather than government labs, so it is not feasible to keep secrets at that scale.

Invention — in this case, AI model development — is overrated as a source of competitive advantage…

…While Chinese AI companies are at most 6-12 months behind leading US companies in terms of AI models and capabilities, China lags significantly behind the US in several key indicators that might enable diffusion: Digitization, cloud computing adoption, and workforce training. All of these are required to enable the productive diffusion of AI advances across industries. This is the actual source of American competitive advantage.

Of course, this could change in the coming years. But if it does, it will result from policy changes to promote diffusion rather than the development of AGI…

…Even if it doesn’t have immediate economic impacts, could AGI unlock, say, 10% annual GDP growth that could add up to something big over a few decades?

Maybe. But it is far from clear why and how this will happen.

Historically, this kind of acceleration in growth has happened very few times — the industrial revolution had this effect, but not the internet, which barely had any impact on GDP. Note that even if you don’t think that GDP is the right thing to measure, a qualitative change in the GDP growth rate is a good proxy for whatever fundamental change in the economy you care about.

The problem is that accelerating growth requires eliminating bottlenecks to progress. That’s harder than most AI boosters assume. AI will likely have uneven effects across sectors, and long-term growth will be bottlenecked by the weakest sector…

…More broadly, progress depends not just on the technology but on having the right preconditions — complementary innovations as well as cultural, economic, and political factors. If all it took to create the industrial revolution was the invention of steam power, the Roman Empire would have done it.

Our current laws, norms, institutions, and politics evolved in a time of much less technological potential. They are already choking opportunities for straightforward types of growth, such as building more public infrastructure. To reap the economic benefits that broad cognitive automation can potentially bring, the degree of structural change that needs to happen is unfathomably greater…

…On the flip side, AGI could be a turning point for AI’s societal risks. Could it cause loss of control, massive societal harm, or even human extinction?

Discussions of AGI risks conflate power — the ability to modify the environment — with capability — the capacity to solve specified tasks correctly. Capability is an intrinsic property of an AI system, whereas power is a matter of how we design the environment in which AI systems operate. And humans have agency over this design. This distinction is often overlooked…

…We do expect AI capabilities to keep increasing. But regardless of capability level, we can choose to ensure that AI remains a tool and is not given power and autonomy to operate without human oversight. In the AI as Normal Technology essay, we address all the usual counterarguments to this, including arms races among companies, power seeking, superhuman persuasion, deceptive alignment, and more.

We argue in the paper that there will be strong business incentives against deploying AI without adequate oversight, and that these incentives can and should be buttressed by regulation when necessary. This has historically been the case in areas ranging from self-driving cars to AI assistants. We don’t expect this trend to suddenly flip once AI capabilities reach a presumed tipping point that we arbitrarily designate as AGI…

…Yet another reason to consider AGI a milestone is the view that shortly after we build AGI, AI systems could recursively self-improve — AGI could train future versions of models that become far more capable, leading to an “intelligence explosion.” Soon afterwards, we would get superintelligent AI (AI systems that far exceed human abilities on any conceivable task), leading to either utopia or dystopia, depending on how well superintelligent AI is “aligned” with human interests.

In the normal technology view, there are two big reasons to doubt this narrative. The first is that even if arbitrary speedups in AI methods are possible, we think that innovation and diffusion will happen at human speed…

…Second, the fact that AI would help conduct AI research does not imply that this process can be arbitrarily accelerated. AI is already used to automate a significant portion of AI research today. But there are many bottlenecks to progress in AI methods, such as the social nature of data collection and real-world interaction that might be required for achieving certain capabilities, computational and cost limits, or herding around popular or intuitive ideas while ignoring the ones that enable true breakthroughs.

We could be wrong about this, and recursive self-improvement could be possible, leading to unbounded speedups in progress in AI methods. And this might have some interesting implications, including some discontinuities in impact, even if widespread diffusion will be slower. For these reasons, it is important to have early warning systems for recursive self-improvement…

…OpenAI’s 2018 definition of AGI was “highly autonomous systems that outperform humans at most economically valuable work”. From our perspective — our interest being in the impacts of AI — this definition is potentially very useful. If AI outperformed [all] humans at most economically valuable work, it would be unquestionably impactful.

But let’s be clear — this is not a property of an AI system. It is a property of the state of the world. It has at least as much to do with the complementary innovations that we make and the extent to which we choose to integrate AI into our organizations and institutions. It would be absurd to try to test an AI system in isolation in the lab and ask whether it outperforms people at their jobs. It is a category error.

For example, whether AI can (autonomously) outperform a medical researcher depends in part on whether we collectively choose to allow AI systems to perform large-scale medical experiments on people. We shouldn’t and we won’t, which means that irrespective of the systems’ capabilities, they cannot perform the function of a medical researcher. This might be an extreme example, but similar bottlenecks arise in virtually every job.

2. The Lesson in Buffett’s Winning Apple Bet – Sarah Krouse

A Berkshire investment manager bought a small stake in the iPhone maker in 2016, nine years after its introduction. Around that time, Buffett asked another investment manager to find an S&P 500 stock that met three criteria.

Buffett wanted a company with a reasonably cheap price/earnings multiple of no more than 15, based on the next 12 months’ projected earnings, The Wall Street Journal previously reported. Berkshire managers had to be at least 90% sure that the stock would generate higher earnings over the next five years. And he wanted Berkshire to be at least 50% confident that the company would grow a minimum of 7% annually for at least five years.
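The three-part screen described above can be sketched as a simple filter. This is only an illustration of the criteria as reported; the tickers and numbers below are entirely hypothetical, and the probability fields stand in for a manager's subjective confidence levels.

```python
# A minimal sketch of the three-part screen described above.
# All ticker data here is hypothetical, invented purely for illustration.

def passes_screen(stock):
    """Return True if a stock meets all three reported criteria."""
    return (
        stock["forward_pe"] <= 15                  # reasonably cheap: forward P/E of 15 or less
        and stock["p_higher_earnings_5y"] >= 0.90  # >=90% sure earnings are higher in 5 years
        and stock["p_growth_7pct_5y"] >= 0.50      # >=50% confident of >=7% annual growth
    )

candidates = [
    {"ticker": "AAA", "forward_pe": 14, "p_higher_earnings_5y": 0.95, "p_growth_7pct_5y": 0.60},
    {"ticker": "BBB", "forward_pe": 22, "p_higher_earnings_5y": 0.95, "p_growth_7pct_5y": 0.60},
    {"ticker": "CCC", "forward_pe": 12, "p_higher_earnings_5y": 0.80, "p_growth_7pct_5y": 0.70},
]

picks = [s["ticker"] for s in candidates if passes_screen(s)]
# only "AAA" clears all three hurdles
```

The screen is deliberately simple: two of its three inputs are probabilities a human analyst must estimate, which is where the real work lies.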

The manager’s research pointed to Apple.

The stock was already a winner by then—and not a huge bargain. It traded for about 14 times its expected earnings, on the higher end of the range of what Buffett had been looking for. Some investors had sold after capturing gains.

And Buffett, a flip-phone user at the time, was hardly a techie. But he saw the hold the company had on its customers. Buffett’s grandchildren were iPhone devotees, and Apple’s customer retention rate was about 95%.

3. What happens when a nation built on growth runs short of babies? – Nina Chen

China’s plummeting birthrate can be traced to three interlocking factors that form a vicious cycle: the shrinking pool of childbearing-age women, collapsing marriage rates, and evaporating fertility intentions. These elements don’t merely add up – they multiply each other’s downward momentum, creating what demographers call a “triple demographic shock.”…

… The number of women in their prime reproductive years (20-29) has undergone a staggering contraction, halving from 12.51 million in the 1990 birth cohort to just 6.33 million for those born in 2003. This dramatic shrinkage, a direct consequence of strict family planning policies after 1987, represents an irreversible demographic reality…

…China’s marriage rate has collapsed to a historic low of 4.3 marriages per 1,000 people in 2024—less than half its 2013 peak (9.9‰). This places China alongside Japan and South Korea (4.2-4.3‰) but significantly below the U.S. (5.1‰), reflecting broader East Asian demographic trends…

…The average age of first marriage for women has jumped from 24 in 2010 to 28.2 in 2023, with over 30% now marrying after 30—directly truncating peak fertility years (25-29)…

… In major cities, saving for a marital home down payment now consumes 15-20 years of family income, while betrothal gifts (bride prices) often exceed 300-500% of annual household earnings—creating what amounts to a brutal financial gatekeeping system…

…China’s marriage collapse directly strangles fertility—pushing the total fertility rate (TFR) to a catastrophic 1.0, far below both OECD averages (1.5) and Japan (1.2). This crisis stems not from changing individual preferences but from structural contradictions between progressive education and regressive social systems.

Higher education expansion has reshaped demographics: female tertiary enrollment rates exploded from 3.4% in 1990 to 59.6% in 2022, with each additional year of education reducing desired fertility by 0.26 children. Paradoxically, within each educational cohort, women’s fertility intentions have actually increased since 2010, according to research by MetroData. The aggregate decline occurs because higher-education groups—who have fewer children—now dominate the population…

…Groundbreaking research reveals the severe professional tradeoffs Chinese women face when starting families. According to the 2023 Report on Chinese Women’s Career Development, a rigorous 2021 study published in Population & Economics (a Peking University core journal) demonstrates that each child born to middle-income families reduces mothers’ employment probability by 6.6% for the first child and an additional 9.3% for the second—even after controlling for education, region, and household characteristics. Notably, children show no statistically significant impact on fathers’ employment prospects…

…When discussing the impacted industries, I’ve found that in most cases, the decline in newborn numbers is not the root cause of their struggles—rather, it serves as a catalyst, exposing and amplifying pre-existing structural weaknesses within these sectors…

…Maternity service pricing remains at levels set during the midwife era of the 1950s, yet hospitals must maintain modern, 24/7 medical teams. This “high-cost, low-return” operation previously relied on overwhelming patient volume to break even. However, with national newborn numbers dropping below 9 million in 2023, the fatal flaw was exposed. Data from a Shanghai specialist hospital shows obstetricians’ incomes have fallen 20-30%, with bonuses halved during low seasons…

… The CMI index and tier-4 surgery metrics in public hospital evaluations contradict maternity care’s core mission of “prevention-first, safety-focused” care. As one tertiary hospital administrator admitted, “Achieving 98% natural delivery rates comes at the cost of bottom-tier performance evaluations.”

4. Mark Zuckerberg – Meta’s AGI Plan – Dwarkesh Patel and Mark Zuckerberg

Mark Zuckerberg: I’m also excited about the Behemoth model, which is coming up. It’s going to be our first model that’s sort of at the frontier—more than 2 trillion parameters…

…Mark Zuckerberg: In general, the prediction that this would be the year open source generally overtakes closed source as the most used models out there, I think that’s generally on track to be true. One interesting surprise—positive in some ways, negative in others, but overall good—is that it’s not just Llama. There are a lot of good ones out there. I think that’s quite good. Then there’s the reasoning phenomenon, which you’re alluding to talking about o3, o4, and other models. There’s a specialization happening. If you want a model that’s the best at math problems, coding, or different things like those tasks, then reasoning models that consume more test-time or inference-time compute in order to provide more intelligence are a really compelling paradigm…

…Mark Zuckerberg: One of the things we’ve generally tried to do over the last year is anchor more of our models in our Meta AI product north star use cases. The issue with open source benchmarks, and any given thing like the LM Arena stuff, is that they’re often skewed toward a very specific set of use cases, which are often not actually what any normal person does in your product. The portfolio of things they’re trying to measure is often different from what people care about in any given product…

…Mark Zuckerberg: I think a lot of them are quite easily gameable. On the Arena you’ll see stuff like Sonnet 3.7, which is a great model, and it’s not near the top. It was relatively easy for our team to tune a version of Llama 4 Maverick that could be way at the top. But the version we released, the pure model, actually has no tuning for that at all, so it’s further down. So you just need to be careful with some of these benchmarks. We’re going to index primarily on the products…

…Mark Zuckerberg: There’s a space which, if I had to guess, I think will end up being the most used one: quick, very natural to interact with, natively multimodal, fitting throughout your day in the ways you want to interact with it…

…Mark Zuckerberg: If you fast-forward a few years, I think we’re just going to be talking to AI throughout the day about different things we’re wondering about. You’ll have your phone. You’ll talk to it while browsing your feed apps. It’ll give you context about different stuff. It’ll answer your questions. It’ll help you as you’re interacting with people in messaging apps. Eventually, I think we’ll walk through our daily lives and have glasses or other kinds of AI devices and just seamlessly interact with it all day long…

…Mark Zuckerberg: I would guess that sometime in the next 12 to 18 months, we’ll reach the point where most of the code that’s going toward these efforts is written by AI. And I don’t mean autocomplete. Today you have good autocomplete. You start writing something and it can complete a section of code. I’m talking more like: you give it a goal, it can run tests, it can improve things, it can find issues, it writes higher quality code than the average very good person on the team already…

…Mark Zuckerberg: Part of what I generally disagree with on the fast-takeoff view is that it takes time to build out physical infrastructure. If you want to build a gigawatt cluster of compute, that just takes time. NVIDIA needs time to stabilize their new generation of systems. Then you need to figure out the networking around it. Then you need to build the building. You need to get permitting. You need to get the energy. Maybe that means gas turbines or green energy, either way, there’s a whole supply chain of that stuff…

…Mark Zuckerberg: One of my core guiding principles in designing products is that people are smart. They know what’s valuable in their lives. Every once in a while, something bad happens in a product and you want to make sure you design your product well to minimize that. But if you think something someone is doing is bad and they think it’s really valuable, most of the time in my experience, they’re right and you’re wrong. You just haven’t come up with the framework yet for understanding why the thing they’re doing is valuable and helpful in their life…

…Mark Zuckerberg: Here’s one stat from working on social media for a long time that I always think is crazy. The average American has fewer than three friends, fewer than three people they would consider friends. And the average person has demand for meaningfully more. I think it’s something like 15 friends or something. At some point you’re like, “All right, I’m just too busy, I can’t deal with more people.” But the average person wants more connection than they have…

…Dwarkesh Patel: If China is better at physical infrastructure, industrial scale-ups, getting more power and more data centers online, how worried are you that they might beat us here?

Mark Zuckerberg: It’s a real competition. You’re seeing industrial policies really play out. China is bringing online more power. Because of that, the US really needs to focus on streamlining the ability to build data centers and produce energy. Otherwise, I think we’ll be at a significant disadvantage. At the same time, some of the export controls on things like chips, I think you can see how they’re clearly working in a way. There was all the conversation with DeepSeek about, “Oh, they did all these very impressive low-level optimizations.” And the reality is, they did and that is impressive. But then you ask, “Why did they have to do that, when none of the American labs did it?” It’s because they’re using partially nerfed chips that are the only ones NVIDIA is allowed to sell in China because of the export controls. DeepSeek basically had to spend a bunch of their calories and time doing low-level infrastructure optimizations that the American labs didn’t have to do…

…Mark Zuckerberg: We made the Llama Scout and Maverick models certain sizes for a specific reason. They fit on a host and we wanted certain latency—especially for the voice models that we’re working on—that we want to pervade everything we’re doing from the glasses to all of our apps to the Meta AI app and all that stuff. There’s a level of control of your own destiny that you only get when you build the stuff yourself…

…Mark Zuckerberg: You also asked, would it not be important anymore because other people are doing open source? On this, I’m a little more worried. You have to ask yourself this. For anyone who shows up now and is doing open source—now that we have done it—would they still be doing open source if we weren’t doing it?…

…Mark Zuckerberg: I think these models encode values and ways of thinking about the world. We had this interesting experience early on, where we took an early version of Llama and translated it. I think it was French, or some other language. The feedback we got from French people was, “This sounds like an American who learned to speak French. It doesn’t sound like a French person.” And we were like, “what do you mean, does it not speak French well?” No, it speaks French fine. It was just that the way it thought about the world seemed slightly American. So I think there are these subtle things that get built into the models. Over time, as models get more sophisticated, they should be able to embody different value sets across the world. So maybe that’s not a particularly sophisticated example, but I think it illustrates the point. Some of the stuff we’ve seen in testing some of the models, especially coming out of China, have certain values encoded in them. And it’s not just a light fine-tune to change that…

…Mark Zuckerberg: There’s a whole different set of issues around coding, which is the other verifiable domain. You need to worry about waking up one day and if you’re using a model that has some tie to another government, can it embed vulnerabilities in code that their intelligence organizations could exploit later? In some future version you’re using a model that came from another country and it’s securing your systems. Then you wake up and everything is just vulnerable in a way that that country knows about and you don’t. Or it turns on a vulnerability at some point. Those are real issues…

…Mark Zuckerberg: You can basically take a model that’s much bigger, and capture probably 90 or 95% of its intelligence, and run it in something that’s 10% of the size…

…Mark Zuckerberg: There are going to be business models at each point along the spectrum. At Meta, for the consumer piece we definitely want to have a free thing. I’m sure that will end up being ad-supported. But I also think we’re going to want to have a business model that supports people using arbitrary amounts of compute to do even more amazing things than what it would make sense to offer in the free service. For that, I’m sure we’ll end up having a premium service…

…Mark Zuckerberg: AI is interesting because, more than some of the other stuff that we do, it is more research and model-led than really product-led. You can’t just design the product that you want and then try to build the model to fit into it. You really need to design the model first and the capabilities that you want, and then you get some emergent properties. Then it’s, “Oh, you can build some different stuff because this turned out in a certain way.” At the end of the day, people want to use the best model…

…Dwarkesh Patel: Will tariffs increase the cost of building data centers in the US and shift buildouts to Europe and Asia?

Mark Zuckerberg: It is really hard to know how that plays out. I think we’re probably in the early innings on that, and it’s very hard to know…

…Mark Zuckerberg: We have almost three and a half billion people using our services every day. One question we’ve struggled with forever is how do we provide customer support? Today, you can write an email, but we’ve never seriously been able to contemplate having voice support where someone can just call in. I guess that’s maybe one of the artifacts of having a free service. The revenue per person isn’t high enough to have an economic model where people can call in… But let’s say AI can handle 90% of that. Then if it can’t, it kicks it off to a person. If you get the cost of providing that service down to one-tenth of what it would’ve otherwise been, then maybe now it actually makes sense to do it. That would be cool. So the net result is that I actually think we’re probably going to hire more customer support people. The common belief is that AI will automate jobs away. But that hasn’t really been how the history of technology has worked. Usually, you create things that take away 90% of the work, and that leads you to want more people, not less.

5. The Best OTC Investment Story Never Told – Joe Raymond

MN&C started making a market in Best Lock (BLOC) in the mid-1970s.

Market makers provide bids and offers on select stocks, facilitating trading and liquidity. They earn a profit on the spread (the price between the bid and offer)…
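As a toy illustration of that spread mechanic (the prices and share count below are made up, not taken from the Best Lock story):

```python
# Hypothetical market-maker round trip: buy at the bid, sell at the ask.
bid, ask = 29.50, 30.50  # quotes the market maker posts
spread = ask - bid       # profit per share on a matched round trip
shares = 200             # shares bought at the bid and resold at the ask
profit = spread * shares # 1.00 per share x 200 shares = 200.00
```

In an illiquid stock like BLOC, spreads can be wide, but with only a few orders a year there are few round trips on which to earn them.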

…The mid-1970s was a good time to find bargains, and BLOC certainly looked like a bargain. It was trading for around 3-5x earnings and a discount to book value.

Best Lock was a simple business. It designed, manufactured, and marketed lock mechanisms, primarily for doors…

…The annual report showed over 4,000 shareholders of record, yet MN&C was only getting a few orders a year.

Where were all the shareholders and why wasn’t there more volume in the stock?…

…Best Lock was founded in Seattle in 1922 by Frank E. Best.

Like many startups in the 1920s, shares were sold door-to-door to average citizens.

When the Depression hit, Best Lock stopped paying dividends. Then the company moved its headquarters from Seattle to Indianapolis to be closer to suppliers and customers…

…By the late 1970s when Martin was looking at the shareholder list, nearly 50 years had passed and the company was again profitable, growing, and paying dividends.

After going through the Depression, World War II, moving to Indianapolis, and the Seattle address overhaul, many shareholders had been lost.

In many cases, heirs had no idea they inherited the stock…

…Martin knew he had an opportunity on his hands: an illiquid stock with lost shareholders trading for a low-single-digit P/E multiple.

He decided to form a new company dedicated to finding the rightful owners of these shares. This involved genealogical research and many hours spent at the local library and county records office…

…Best Lock was trading for around $30 per share at the time, so after his one third fee Martin was buying the shares for around $20. This equated to 2-3x earnings.

Over time, Martin was able to acquire roughly 15% of the float (shares not held by the Best family) using this approach…

…A few years after taking full control, Russell decided to take the company private.

He did this through a series of reverse splits in 1998 that effectively cashed everyone out for $525 per share—a high-single-digit multiple of earnings. The stock had been trading for $300 prior to the reverse splits, so the cash out price was a nice 75% premium.

A group of minority shareholders dissented and perfected their appraisal rights in Delaware—arguing that Russell Best had violated his fiduciary duty, and that the $525/share figure was too low for a company of Best Lock’s caliber.

At some point in the legal process, Russell decided to explore a sale of the entire company.

Stanley Black & Decker stepped up to the plate and offered $310 million to buy Best Lock (more than triple the reverse split takeout price). Final payout for the dissenting shareholders was received in April 2003.

Those initial shares Martin was buying for $20 in 1980 turned into $1,597 in 2003, good for a CAGR of 20% before dividends over the 23-year period.
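The return arithmetic above can be checked with a quick sketch. All figures are the ones quoted in the article; this is just the compound-growth formula applied to the stated purchase and payout prices, ignoring dividends and the exact timing of cash flows:

```python
# Back-of-the-envelope check of the Best Lock numbers cited above,
# using the $20 effective purchase price (1980) and the $1,597
# final payout (2003) quoted in the article.
buy_price = 20.0       # effective cost per share after Martin's one-third fee
final_payout = 1597.0  # per-share proceeds from the 2003 sale
years = 23

cagr = (final_payout / buy_price) ** (1 / years) - 1
print(f"CAGR before dividends: {cagr:.1%}")  # ≈21%, in line with the article's roughly-20% figure

# The 1998 reverse-split cash-out premium quoted in the article:
premium = 525 / 300 - 1
print(f"Cash-out premium: {premium:.0%}")  # 75%
```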


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Apple and Meta Platforms. Holdings are subject to change at any time.

Insights From Berkshire Hathaway’s 2025 Annual General Meeting

Warren Buffett and his team shared plenty of wisdom at the recent Berkshire Hathaway AGM.

Warren Buffett is one of my investment heroes. On 3 May 2025, he held court at the 2025 Berkshire Hathaway AGM (annual general meeting).

For many years, I’ve anticipated the AGM to hear his latest thoughts. This year’s session holds special significance because it may well be his last – during the AGM, he announced that he would be stepping down as CEO of Berkshire Hathaway by the end of this year, ending an amazing 60-year run since becoming the company’s leader in 1965. Greg Abel is slated to be Berkshire Hathaway’s next CEO.

The most recent Berkshire meeting contained great insights from Buffett and other senior Berkshire executives that I wish to share and document. Before I get to them, I would like to thank my friend Thomas Chua for performing a great act of public service. Shortly after the AGM ended, Thomas posted a transcript of the session at his excellent investing website, Steady Compounding.

Without further ado, the italicised passages between the two horizontal lines below are my favourite takeaways after I went through Thomas’ transcript.


Buffett thinks his idea on import certificates is different from tariffs and that it’s important to have more balanced trade between countries; he also thinks that trade should not be wielded as a weapon, and that the more prosperous the world becomes, the better the USA would be

Becky Quick: Thanks Warren. This first question comes from Bill Mitchell. I received more questions about this than any other question. He writes, “Warren, in a 2003 Fortune article, you argued for import certificates to limit trade deficits and said these import certificates basically amounted to a tariff, but recently you called tariffs an act of economic war. Has your view on trade barriers changed or do you see import certificates as somehow distinct from tariffs?”

Warren Buffett: Well, the import certificates were distinct, but their goal was to balance imports against exports so that the trade deficit would not grow in an enormous way. It had various provisions to help third world countries catch up a little bit. They were designed to balance trade, and I think you can make very good arguments that balanced trade is good for the world. It makes sense for cocoa to be raised in Ghana and coffee in Colombia and a few other things…

…There’s no question that trade can be an act of war, and I think it’s led to bad things like the attitudes it’s brought out in the United States. We should be looking to trade with the rest of the world. We should do what we do best, and they should do what they do best…

…The main thing is that trade should not be a weapon. The United States has become an incredibly important country starting from nothing 250 years ago – there’s never been anything like it. And it’s a big mistake when you have 7.5 billion people who don’t like you very well and you have 300 million people crowing about how well they’ve done. I don’t think it’s right and I don’t think it’s wise. The more prosperous the rest of the world becomes, it won’t be at our expense – the more prosperous we’ll become and the safer we’ll feel and your children will feel someday.

Buffett did not look at macroeconomic factors in Japan when making the huge investments he did in five Japanese trading houses; Berkshire won’t be selling the Japanese investments for a long, long time, if at all; Berkshire would be happy to invest a lot more in Japan if there was capacity to do so; the fact that Berkshire could borrow in Japanese Yen to hedge the Japanese investments’ currency risk is merely a lucky coincidence

Question: Mr. Buffett and Mr. Munger made a very good and successful investment in Japan in the past five or six years. The recent CPI in Japan is currently above 3%, not far away from its 2% target. The Bank of Japan seems very determined in raising rates while the Fed, ECB, and other central banks are considering cutting them. Do you think it makes sense for the BOJ to proceed with the rate hike? Will its planned rate hike deter you from further investing in the Japanese stock market, or even lead you to consider realizing your current profits?

Warren Buffett: Well, I’m going to extend the same goodwill to Japan that you’ve just extended to me. I’ll let the people of Japan determine their best course of action in terms of economics. It’s an incredible story. It’s been about six years now since our Japanese investments. I was just going through a little handbook that probably had two or three thousand Japanese companies in it. One problem I have is that I can’t read that handbook anymore – the print’s too small. But there were these five trading companies selling at ridiculously low prices. So I spent about a year acquiring them. And then we got to know the people better, and everything that Greg and I saw, we liked better as we went along…

Greg Abel: When you think of the five companies, there’s definitely a couple meetings a year, Warren. The thing we’re building with the five companies is, one, it’s been a very good investment, but we really envision holding the investment for 50 years or forever…

Warren Buffett: We will not be selling any stock. That will not happen in decades, if then…

…It’s too bad that Berkshire has gotten as big as it is because we love that position and I’d like it to be a lot larger. Even with the five companies being very large in Japan, we’ve got at market in the range of $20 billion invested, but I’d rather have $100 billion than $20 billion…

…The Japanese situation is different because we intend to stay so long with that position and the funding situation is so cheap that we’ve attempted to some degree to match purchases against yen-denominated funding. But that’s not a policy of ours…

Greg Abel: There’s no question we were fundamentally very comfortable with investing in the five Japanese companies and recognizing we’re investing in yen. The fact we could then borrow in yen was almost just a nice incremental opportunity. But we were very comfortable both with the Japanese companies and with the currency we would ultimately realize in yen.

Just the simple act of reading about companies can lead to great investment opportunities

Warren Buffett: It’s been about six years now since our Japanese investments. I was just going through a little handbook that probably had two or three thousand Japanese companies in it…

…I never dreamt of that when I picked up that handbook. It’s amazing what you can find when you just turn the page. We showed a movie last year about “turn every page,” and I would say that turning every page is one important ingredient to bring to the investment field. Very few people do turn every page, and the ones who turn every page aren’t going to tell you what they’re finding. So you’ve got to do a little of it yourself.

Berkshire’s current huge cash position is the result of Buffett not being able to find sufficiently attractive investment opportunities; Buffett thinks that great investment opportunities appear infrequently

Becky Quick: This next question comes from Advate Prasad in New York. He writes, “Today, Berkshire holds over $300 billion in cash and short-term investments, representing about 27% of total assets, a historically high figure compared to the 13% average over the last 25 years. This has also led Berkshire to effectively own nearly 5% of the entire US Treasury market. Beyond the need for liquidity to meet insurance obligations, is the decision to raise cash primarily a de-risking strategy in response to high market valuations?…

Warren Buffett: Well, I wouldn’t do anything nearly so noble as to withhold investing myself just so that Greg could look good later on. If he gets any edge of what I leave behind, I’ll resent it. The amount of cash we have – we would spend $100 billion if something is offered that makes sense to us, that we understand, offers good value, and where we don’t worry about losing money. The problem with the investment business is that things don’t come along in an orderly fashion, and they never will. I’ve had about 16,000 trading days in my career. It would be nice if every day you got four opportunities or something like that with equal attractiveness. If I was running a numbers racket, every day would have the same expectancy that I would keep 40% of whatever the handle was, and the only question would be how much we transacted. But we’re not running that kind of business. We’re running a business which is very opportunistic.

Investing in stocks is a much better investment-bet than investing in real estate

Warren Buffett: Well, in respect to real estate, it’s so much harder than stocks in terms of negotiation of deals, time spent, and the involvement of multiple parties in the ownership. Usually when real estate gets in trouble, you find out you’re dealing with more than just the equity holder. There have been times when large amounts of real estate have changed hands at bargain prices, but usually stocks were cheaper and they were a lot easier to do.

Charlie did more real estate. Charlie enjoyed real estate transactions, and he actually did a fair number of them in the last 5 years of his life. But he was playing a game that was interesting to him. I think if you’d asked him to make a choice when he was 21 – either be in stocks exclusively for the rest of his life or real estate for the rest of his life – he would have chosen stocks. There’s just so much more opportunity, at least in the United States, that presents itself in the security market than in real estate…

…When you walk down to the New York Stock Exchange, you can do billions of dollars worth of business, totally anonymous, and you can do it in 5 minutes. The trades are complete when they’re complete. In real estate, when you make a deal with a distressed lender, when you sign the deal, that’s just the beginning. Then people start negotiating more things, and it’s a whole different game with a different type of person who enjoys the game.

Berkshire’s leaders think AI will have a massive impact on the insurance business, but they are not in a hurry to pour money into AI as they think there’s plenty of faddish capital in the space

Ajit Jain: There is no question in my mind that AI is going to be a real game-changer. It’s going to change the way we assess risk, we price risk, we sell risk, and then the way we end up paying claims. Having said that, I certainly also feel that people end up spending enormous amounts of money trying to chase the next fashionable thing…

…Right now the individual insurance operations do dabble in AI and try to figure out the best way to exploit it. But we have not yet made a conscious big-time effort in terms of pouring a lot of money into this opportunity.

Buffett prefers Ajit Jain to any kind of sophisticated AI systems when pricing insurance risks

Warren Buffett: I wouldn’t trade everything that’s developed in AI in the next 10 years for Ajit. If you gave me a choice of having a hundred billion dollars available to participate in the property casualty insurance business for the next 10 years and a choice of getting the top AI product from whoever’s developing it or having Ajit making the decisions, I would take Ajit anytime – and I’m not kidding about that.

Despite the political upheaval happening in the USA right now, Buffett still thinks the long-term future of the country is incredibly bright; in Buffett’s eyes, the USA has been through plenty of tumultuous periods and emerged stronger

Warren Buffett: America has been undergoing significant and revolutionary change ever since it was developed. I mentioned that we started out as an agricultural society with high promises that we didn’t deliver on very well. We said all men were created equal, and then we wrote a constitution that counted blacks as three-fifths of a person. In Article 2, you’ll find male pronouns used 20 times and no female pronouns. So it took until 1920, with the 19th amendment, to finally give women the vote that we had promised back in 1776.

We’re always in the process of change, and we’ll always find all kinds of things to criticize in the country. But the luckiest day in my life is the day I was born, because I was born in the United States. At that time, about 3% of all births in the world were taking place in the United States. I was just lucky, and I was lucky to be born white, among other things…

…We’ve gone through all kinds of things – great recessions, world wars, the development of the atomic bomb that we never dreamt of when I was born. So I would not get discouraged about the fact that we haven’t solved every problem that’s come along. If I were being born today, I would just keep negotiating in the womb until they said I could be in the United States.

It’s important to be patient while waiting for opportunities, but equally important to pounce when the opportunity appears

Warren Buffett: The trick when you get in business with somebody who wants to sell you something for $6 million that’s got $2 million of cash, a couple million of real estate, and is making $2 million a year, is you don’t want to be patient at that moment. You want to be patient in waiting to get the occasional call. My phone will ring sometime with something that wakes me up. You just never know when it’ll happen. That’s what makes it fun. So it’s a combination of patience and a willingness to do something that afternoon if it comes to you.

It does not pay to invest in a way that depends on the appearance of a greater fool

Warren Buffett: If people are making more money because they’re borrowing money or participating in securities that are pieces of junk but they hope to find a bigger sucker later on, you have to forget that.

Buffett does not think it’s important to manage currency risk with Berkshire’s international investments, but he avoids investments denominated in currencies that are at risk of depreciating wildly

Warren Buffett: We’ve owned lots of securities in foreign currencies. We don’t do anything based on their impact on quarterly and annual earnings. There’s never been a board meeting I can remember where I’ve said, “If we do this, our annual earnings will be this, therefore we ought to do it.” The number will turn out to be what it’ll be. What counts is where we are five or 10 or 20 years from now…

…Obviously, we wouldn’t want to own anything in a currency that we thought was really going to hell.

Buffett is worried about the tendency for governments to want to devalue their currencies, the USA included, but there’s nothing much that can be done about it; Buffett thinks the USA is running a fiscal deficit that is unsustainable over a long period of time; Buffett thinks a 3% fiscal deficit appears sustainable

Warren Buffett: That’s the big thing we worry about with the United States currency. The tendency of a government to want to debase its currency over time – there’s no system that beats that. You can pick dictators, you can pick representatives, you can do anything, but there will be a push toward weaker currencies. I mentioned very briefly in the annual report that fiscal policy is what scares me in the United States because of the way it’s made, and all the motivations are toward doing things that can cause trouble with money. But that’s not limited to the United States – it’s all over the world, and in some places, it gets out of control regularly. They devalue at rates that are breathtaking, and that’s continued…

…So currency value is a scary thing, and we don’t have any great system for beating that…

…We’re operating at a fiscal deficit now that is unsustainable over a very long period of time. We don’t know whether that means two years or 20 years because there’s never been a country like the United States. But as Herbert Stein, the famous economist, said, “If something can’t go on forever, it will end.” We are doing something that is unsustainable, and it has the aspect to it that it gets uncontrollable to a certain point….

…I wouldn’t want the job of trying to correct what’s going on in revenue and expenditures of the United States with roughly a 7% gap when probably a 3% gap is sustainable…

…We’ve got a lot of problems always as a country, but this is one we bring on ourselves. We have a revenue stream, a capital-producing stream, a brains-producing machine like the world has never seen. And if you picked a way to screw it up, it would involve the currency. That’s happened a lot of places.

Buffett thinks the key factors for a developing economy to attract investors are having a solid currency, and being business-friendly

Audience member: What advice would you give to government and business leaders of emerging markets like Mongolia to attract institutional investors like yourself?

Warren Buffett: If you’re looking for advice to give the government over there, it’s to develop a reputation for having a solid currency over time. We don’t really want to go into any country where we think there’s a significant probability of runaway inflation. That’s too hard to figure…

…If the country develops a reputation for being business-friendly and currency-conscious, that bodes very well for the residents of that country, particularly if it has some natural assets that it can build around.

Private equity firms are flooding the life insurance market, but they are doing so by taking on lots of leverage and credit risk

Ajit Jain: There’s no question the private equity firms have come into the space, and we are no longer competitive in the space. We used to do a fair amount in this space, but in the last 3-4 years, I don’t think we’ve done a single deal.

You should separate this whole segment into two parts: the property casualty end of the business and the life end of the business. The private equity firms you mentioned are all very active in the life end of the business, not the property casualty end.

You are right in identifying the risks these private equity firms are taking on both in terms of leverage and credit risk. While the economy is doing great and credit spreads are low, these firms have taken the assets from very conservative investments to ones where they get a lot more return. As long as the economy is good and credit spreads are low, they will make money – they’ll make a lot of money because of leverage.

However, there is always the danger that at some point the regulators might get cranky and say they’re taking too much risk on behalf of their policyholders, and that could end in tears. We do not like the risk-reward that these situations offer, and therefore we put up the white flag and said we can’t compete in this segment right now.

Buffett thinks Berkshire’s insurance operation is effectively unreplicable

Warren Buffett: I think there are people that want to copy Berkshire’s model, but usually they don’t want to copy it by also copying the model of the CEO having all of his money in the company forever. They have a different equation – they’re interested in something else. That’s capitalism, but they have a whole different situation and probably a somewhat different fiduciary feeling about what they’re doing. Sometimes it works and sometimes it doesn’t work. If it doesn’t work, they go on to other things. If what we do at Berkshire doesn’t work, I spend the end of my life regretting what I’ve created. So it’s just a whole different personal equation.

There is no property casualty company that can basically replicate Berkshire. That wasn’t the case at the start – at the start we just had National Indemnity a few miles from here, and anybody could have duplicated what we had. But that was before Ajit came with us in 1986, and at that point the other fellows should have given up.

Buffett thinks recent market volatility is not noteworthy at all; it’s nearly certain that significant downward moves in stocks will happen sometime in the next 20 years

Warren Buffett: What has happened in the last 30-45 days, 100 days, whatever this period has been, is really nothing. There have been three times since we acquired Berkshire that Berkshire has gone down 50% in a fairly short period of time – three different times. Nothing was fundamentally wrong with the company at any time. This is not a huge move. The Dow Jones average was at 381 in September of 1929 and got down to 42. That’s going from 100 to 11. This has not been a dramatic bear market or anything of the sort. I’ve had about 17,000 or 18,000 trading days. There have been plenty of periods that are dramatically different than this…

…You will see a period in the next 20 years that will be a “hair curler” compared to anything you’ve seen before. That just happens periodically. The world makes big mistakes, and surprises happen in dramatic ways. The more sophisticated the system gets, the more the surprises can come out of left field. That’s part of the stock market, and that’s what makes it a good place to focus your efforts if you’ve got the proper temperament for it and a terrible place to get involved if you get frightened by markets that decline and get excited when stock markets go up.

Berkshire’s leaders think the biggest change autonomous vehicles will bring to the automotive insurance industry is substitution of operator error policies by product liability policies; Berkshire’s leaders also think that the cost per repair in the event of an accident will rise significantly; the total cost of providing insurance for autonomous vehicles is still unclear; from the 1950s to today, cars have gotten 6x safer but auto insurance has become 50x pricier

Ajit Jain: There’s no question that insurance for automobiles is going to change dramatically once self-driving cars become a reality. The big change will be what you identified. Most of the insurance that is sold and bought revolves around operator errors – how often they happen, how severe they are, and therefore what premium we ought to charge. To the extent these new self-driving cars are safer and involved in fewer accidents, that insurance will be less required. Instead, it’ll be substituted by product liability. So we at GEICO and elsewhere are certainly trying to get ready for that switch, where we move from providing insurance for operator errors to being more ready to provide protection for product errors and errors and omissions in the construction of these automobiles…

…We talked about the shift to product liability and protection for accidents that take place because of an error in product design or supply. In addition to that shift, I think what we’ll see is a major shift where the number of accidents will drop dramatically because of automatic driving. But on the other hand, the cost per repair every time there’s an accident will go up very significantly because of the amount of technology in the car. How those two variables interact with each other in terms of the total cost of providing insurance, I think, is still an open issue…

Warren Buffett: When I walked into GEICO’s office in 1951, the average price of a policy was around $40 a year. Now it’s easy to get up to $2,000 depending on location and other factors. During that same time, the number of people killed in auto accidents has fallen from roughly six per 100 million miles driven to a little over one. So the car has become incredibly safer, and it costs 50 times as much to buy an insurance policy.
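As a rough sanity check on the multiples Buffett cites, the sketch below assumes $2,000 as the representative policy price today and exactly 1.0 fatality per 100 million miles — both simplifications of his wording ("easy to get up to $2,000", "a little over one"):

```python
# Rough check of the GEICO figures quoted in the transcript above:
# premiums up ~50x while fatality rates fell ~6x.
premium_1951 = 40    # average annual policy price when Buffett visited GEICO, 1951
premium_now = 2000   # a typical policy price today, per Buffett
deaths_then = 6.0    # fatalities per 100 million miles driven, ~1951
deaths_now = 1.0     # "a little over one" today; 1.0 used for simplicity

print(f"Premium multiple: {premium_now / premium_1951:.0f}x")  # 50x
print(f"Safety multiple:  {deaths_then / deaths_now:.0f}x")    # 6x
```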

There’s a tax now when American companies conduct share buybacks

Warren Buffett: I don’t think people generally know that, but there is a tax that was introduced a year or so ago where we pay 1%. That not only hurts us because we pay more for it than you do – it’s a better deal for you than for us – but it actually hurts some of our investee companies quite substantially. Tim Cook has done a wonderful job running Apple, but he spent about $100 billion in a year repurchasing shares, and there’s a 1% charge attached to that now. So that’s a billion dollars a year that he pays when he buys Apple stock compared to what you pay.

Buffett is very careful with the risks that come with derivative contracts on a company’s balance sheet

Greg Abel: I’ll maybe go back to the very first meeting with Warren because it still stands out in my mind. Warren was thinking about acquiring MidAmerican Energy Holdings Company at that time, and we had the opportunity with my partners to go over there on a Saturday morning. We were discussing the business and Warren had the financial statements in front of him. Like anybody, I was sort of expecting a few questions on how the business was performing, but Warren locked in immediately to what was on the balance sheet and the fact we had some derivative contracts, the “weapons of mass destruction.”

In the utility business, we do have derivatives because they’re used to match certain positions. They’re never matched perfectly, but we have them and they’re required in the regulated business. I remember Warren going to it immediately and asking about the composition and what was the underlying risk, wanting to thoroughly understand. It wasn’t that big of a position, but it was absolutely one of the risks he was concerned about as he was acquiring MidAmerican, especially in light of Enron and everything that had gone on.

The followup to that was a year or 18 months later. There was an energy crisis in the US around electricity and natural gas, and various companies were making significant sums of money. Warren’s follow-up question to me was, “How much money are we making during this energy crisis? Are we making a lot? Do we have speculative positions in place?” The answer was we weren’t making any more than we would have been six months ago because all those derivatives were truly to support our business and weren’t speculative. That focus on understanding the business and the risks around it still stands out in my mind.

Buffett spends more time analysing a company’s balance sheet than other financial statements

Warren Buffett: I spend more time looking at balance sheets than I do income statements. Wall Street doesn’t pay much attention to balance sheets, but I like to look at balance sheets over an 8 or 10 year period before I even look at the income account because there are certain things it’s harder to hide or play games with on the balance sheet than with the income statement.

Buffett thinks America’s electric grid needs a massive overhaul and it can only be done via a partnership between the private sector and the government – unfortunately, nobody has figured out the partnership model yet

Warren Buffett: It’s very obvious that the country needs an incredible improvement, rethinking, redirection to some extent in the electric grid. We’ve outgrown what would be the model that America should have. In a sense, it’s a problem something akin to the interstate highway system where you needed the power of the government really to get things done because it doesn’t work so well when you get 48 or 50 jurisdictions that each has their own way of thinking about things…

…There are certain really major investment situations where we have capital like nobody else has in the private system. We have particular knowhow in the whole generation and transmission arena. The country is going to need it. But we have to figure out a way that makes sense from the standpoint of the government, from the standpoint of the public, and from the standpoint of Berkshire, and we haven’t figured that out yet. It’s a clear and present use of hundreds of billions of dollars. You have people that set up funds and they’re getting paid for just assembling stuff, but that’s not the way to handle it. The way to handle it is to have some kind of government-private industry cooperation similar to what you do in a war.

The risk of wildfires to electric utilities is not going to go away, and in fact, will increase over time

Greg Abel: The reality is that the risk around wildfires – whether or not the wildfires occur – is not going away, and we know that. The risk probably goes up each year.

Berkshire’s leaders think it’s important for utilities to de-energise when wildfires occur to minimise societal damage; Berkshire is the only utility operator so far that’s willing to de-energise; but de-energising also has its drawbacks; Berkshire may not be able to solve the conundrum of de-energising

Greg Abel: The one thing we hadn’t tackled – this is very relevant to the significant event we had back in 2020 in PacifiCorp – is we didn’t de-energize the system as the fire was approaching. Our employees and the whole management team have been trained all their lives to keep the lights on, and the last thing they want to do is turn those lights off and have a system de-energized. After those events and as we looked at how we’re going to move forward in managing the assets and reducing risk, we recognized as a team that we have to de-energize those assets. Now as we get fires encroaching within a certain number of miles, we de-energize because we do not want to contribute to the fire nor harm any of our consumers or contribute to a death. We had to shift our team to managing a different risk now. It’s not around keeping the lights on, it’s around protecting the general public and ensuring the fire does not spread further. We’re probably the only one of the utilities that does that today, and we strongly believe in that approach.

Becky Quick: Doesn’t that open you up to other risk if you shut down your system, a hospital gets shut down, somebody dies?

Greg Abel: That’s something we do deal with a lot because we have power outages that occur by accident. When we look at critical infrastructure, that’s an excellent point and we’re constantly re-evaluating it. We do receive a lot of feedback from our customer groups as to how to manage that…

Warren Buffett: There’s some problems that can’t be solved, and we shouldn’t be in the business of taking investors’ money and tackling things that we don’t know the solution for. You can present the arguments, but it’s a political decision when you are dealing with states or the federal government. If you’re in something where you’re going to lose, the big thing to do is quit.

Buffett thinks the value of electric utility companies has fallen a lot over the past two years because of societal trends, and his enthusiasm for investing in electric utilities has waned considerably

Becky Quick: Ricardo Bri, a longtime shareholder based in Panama, says that he was very happy to see Berkshire acquire 100% of BHE. It was done in two steps: one in late 2022 – 1% was purchased from Greg Abel for $870 million implying a valuation of BHE of $87 billion, and then in 2024 the remaining 8% was purchased from the family of Walter Scott Jr. for $3.9 billion implying a valuation of $48.8 billion for the enterprise. That second larger transaction represented a 44% reduction in valuation in just two years. Ricardo writes that PacifiCorp liabilities seem too small to explain this. Therefore, what factors contributed to the difference in value for BHE between those two moments in time?
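The implied valuations in the question can be checked with a few lines of arithmetic. A minimal sketch (figures are taken from the question above; the variable names are my own):

```python
# Illustrative check of the implied BHE valuations in the question above.
# Figures come from the question; variable names are my own.
step1_price, step1_stake = 870e6, 0.01   # late 2022: 1% purchased from Greg Abel
step2_price, step2_stake = 3.9e9, 0.08   # 2024: 8% purchased from the Scott family

implied_2022 = step1_price / step1_stake  # $87.0 billion
implied_2024 = step2_price / step2_stake  # $48.75 billion, ~$48.8B as stated

reduction = 1 - implied_2024 / implied_2022
print(f"2022 implied value: ${implied_2022 / 1e9:.1f}B")
print(f"2024 implied value: ${implied_2024 / 1e9:.2f}B")
print(f"Reduction: {reduction:.0%}")  # ~44%, matching the question
```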

Warren Buffett: Well, we don’t know how much we’ll lose out of PacifiCorp and decisions that are made, but we also know that certain of the attitudes demonstrated by that particular example have analogues throughout the utility system. There are a lot of states that so far have been very good to operate in, and there are some now that are rat poison, as Charlie would say, to operate in. That knowledge was accentuated when we saw what happened in the Pacific Northwest, and it’s eventuated by what we’ve seen as to how utilities have been treated in certain other situations. So it wasn’t just a direct question of what was involved at PacifiCorp. It was an extrapolation of a societal trend…

…We’re not in the mood to sell any business. But Berkshire Hathaway Energy is worth considerably less money than it was two years ago based on societal factors. And that happens in some of our businesses. It certainly happened to our textile business. The public utility business is not as good a business as it was a couple of years ago. If anybody doesn’t believe that, they can look at Hawaiian Electric and look at Edison in the current wildfires situation in California. There are societal trends that are changing things…

…I would say that our enthusiasm for buying public utility companies is different now than it would have been a couple years ago. That happens in other industries, too, but it’s pretty dramatic in public utilities. And it’s particularly dramatic in public utilities because they are going to need lots of money. So, if you’re going to need lots of money, you probably ought to behave in a way that encourages people to give you lots of money.

Buffett thinks the future capital intensity of the USA’s large technology companies remains to be seen

Warren Buffett: It’ll be interesting to see how much capital intensity there is now with the Magnificent 7 compared to a few years ago. Basically, Apple has not really needed any capital over the years and it’s repurchased shares with a dramatic reduction. Whether that world is the same in the future or not is something yet to be seen.

Buffett thinks there’s no better system than capitalism that has been discovered so far

Warren Buffett: Capitalism in the United States has succeeded like nothing you’ve ever seen. But what it is is a combination of this magnificent cathedral which has produced an economy like nothing the world’s ever seen, and then it’s got this massive casino attached…

…In the cathedral, they’re designing things that will be producing goods and services for 300 and some million people like it’s never been done before in history. It’s an interesting system we developed, but it’s worked. It dispenses rewards in what seems like a terribly capricious manner. The idea that people get what they deserve in life – it’s hard to make that argument. But if you argue with it that any other system works better, the answer is we haven’t found one.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I currently have a vested interest in Alphabet, Amazon, Apple, Meta Platforms, Microsoft, and Tesla (they are all part of the Magnificent 7). Holdings are subject to change at any time.

What We’re Reading (Week Ending 04 May 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 04 May 2025:

1. Everyone Says They’ll Pay More for “Made in the USA.” So We Ran an A/B Test – Ramon van Meer

We make filtered showerheads. Clean, sleek design. But more importantly, with the best shower filters on the market. 

Our bestselling model—manufactured in Asia (China and Vietnam)—sells for $129. But this year, as tariffs jumped from 25% to 170%, we wondered: Could we reshore manufacturing to the U.S. while maintaining margins to keep our lights on?…

…We found a U.S.-based supplier. The new unit cost us nearly 3x more to produce. To maintain our margins, we’d have to sell it for $239.

So we ran an experiment.

We created a secret landing page. The product and design were identical. The only difference? One was labeled “Made in Asia” and priced at $129. The other, “Made in the USA,” at $239…

…Add-to-carts for the U.S. version were only 24! Conversion? 0.0% (zero).

Not a single customer purchased the Made-in-USA version…

…We wanted to believe customers would back American labor with their dollars. But when faced with a real decision—not a survey or a comment section—they didn’t…

…Small brands like ours want to manufacture here. We’re willing to invest. But without serious shifts—in consumer incentives, automation, and trade policy—the math doesn’t work. Not for us. Not for our customers.

We’re still committed to exploring local manufacturing. But for now, it’s not viable.

We’re sharing this because the numbers surprised even us. And we think they’re worth talking about.

2. Perspectives from 30 business leaders on the trade war frontlines: preparing for the best, bracing for the worst – Amber Zhang

For companies with strong brand equity and pricing power, the strategy is clear: raise prices.

Anker Innovations was among the first movers, increasing prices on Amazon by roughly 20%. A person familiar with the company said its U.S. market share remains stable despite the hikes. Pop Mart hasn’t adjusted pricing yet, but told Waves that it’s “not ruling out the possibility.”

One cross-border commerce insider noted that for companies like DJI, whose supply chain advantages are hard to replicate, rising costs may squeeze margins, but competitors will struggle to gain market share in the near term. “Tariffs are a toll, not a blockade. Only great products hold long-term passports,” Anker commented…

…According to Waves, most ODM firms that rely heavily on the U.S. market are currently in a holding pattern, halting shipments and refraining from taking new orders for the coming months…

…Wei believes a resolution – or at least a temporary workaround – is likely within the next two to three months. Why? ODM manufacturers need to finalize production plans for Christmas orders by July or August. If the standoff drags on any longer, “America may be headed for a very empty Christmas.”…

…On April 2, Donald Trump signed an executive order eliminating the de minimis exemption for packages from the Chinese mainland and Hong Kong. The White House followed up on April 8, raising tariffs on these goods from 30% to 90%, effective May 2.

This presents a major setback for small Chinese sellers who rely on platforms like TikTok, Temu, and Shein to ship low-value parcels without stocking local warehouses…

…That said, some insiders point out that more than 90% of shipments to the U.S. currently rely on “Gray Customs Clearance” — unofficial customs channels where third-party brokers help exporters bypass formal declarations at minimal cost. From their perspective, the new tariffs are more likely to disrupt clearance efficiency rather than kill off the model entirely…

…Amid persistent U.S.-China trade friction, a viable strategy is to first export raw materials or semi-finished goods from China for overseas assembly and packaging. Over time, upstream production can gradually shift to local markets, supported by regional suppliers…

…In the wake of the de minimis exemption repeal, TikTok Shop recently issued a notice to U.S. sellers: starting May 2, all incoming shipments will face a 30% ad valorem tax, with an additional flat tariff of $25 per item before June 1, rising to $50 afterward. Carriers will also be required to post international bonds as part of the compliance framework. So far, Temu and Shein have yet to publicly announce specific countermeasures.
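The cost structure in that notice can be sketched with a simple calculation. This is illustrative only, assuming the figures as stated (a 30% ad valorem tax plus a flat tariff of $25 per item before 1 June, rising to $50 afterward); the function and variable names are my own:

```python
# Hedged sketch of the per-parcel cost described in the TikTok Shop notice
# above: a 30% ad valorem tax plus a flat per-item tariff of $25 before
# 1 June 2025 and $50 afterward. Names and the date cutoff year are assumptions.
from datetime import date

def parcel_tariff(declared_value: float, ship_date: date) -> float:
    """Return the estimated tariff for one item, per the notice as described."""
    flat = 25.0 if ship_date < date(2025, 6, 1) else 50.0
    return declared_value * 0.30 + flat

# A $20 parcel shipped in May vs. June:
print(parcel_tariff(20.0, date(2025, 5, 15)))  # 31.0
print(parcel_tariff(20.0, date(2025, 6, 15)))  # 56.0
```

For a low-value parcel, the flat component dominates, which is why the change hits small sellers hardest.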

Ray Bu predicts that platform-based giants will be forced to accelerate full-scale internationalization, separating their domestic and overseas supply chains and localizing operations in major markets…

…Both Temu and Shein are ramping up local presence in multiple countries, aggressively recruiting local merchants. Shein, for example, has been building production capacity in Turkey and Brazil — jurisdictions seen as tariff-safe zones amid the current climate.

Temu, for its part, plans to launch six “native fulfillment hubs” and nine “semi-managed hubs” between April and June. The former are geared toward local legal entities, while the latter target Chinese sellers with the ability to fulfill orders domestically in target markets…

…According to Xue Yi, professor at the University of International Business and Economics, if access to the U.S. market becomes more restricted, the European Union may be China’s best substitute…

…According to Xue Feng, a partner at Fangda Partners, recent tariff hikes have dealt significant blows to sectors including furniture, toys, textiles, auto parts, chemicals, steel, and aluminum. These industries, heavily reliant on low-end supply chains, are structurally tied to the U.S. and face steep challenges when seeking alternative markets…

…At its core, America simply isn’t ready to rebuild industrial capacity at scale.

To start with, U.S. manufacturing wages are 6–8 times higher than those in emerging markets, and the country faces a chronic shortage of skilled labor. But more critical than labor costs is the disintegration of domestic supply chains. After decades of offshoring, the U.S. industrial base is severely fragmented — core components like semiconductor materials and electronic parts still depend heavily on Asian suppliers.

Even if reshoring happens — as envisioned by the Trump administration — these factories would function as isolated islands, unable to form self-sustaining industrial clusters.

3. Did ancient Rome have a stock market? – Swen Lorenz

“The New Deal in Old Rome” was published by H. J. Haskell in 1940. The original book is rare and expensive today, but reprints can easily be found. As it reports on page 11:

“Indications are there was a curiously high degree of commercial organisation in the ancient world. In the time of Cicero, in the last century before Christ, wealthy Romans were busily exploiting the eastern provinces. Companies of contractors were organised to construct public works and to collect government revenue, from which the contractors took a large cut. They sold shares in offices on the Via Sacra, the Wall Street of Rome. Everybody, says the Greek historian Polybius, meaning all the country club crowd, bought them. … We may imagine how the bottom dropped out of Asiatic stocks on the Roman market when the news came of the concerted massacre of eighty thousand Italians at the instigation of the native ruler of an adjoining kingdom.”…

…How much of Haskell’s claims were true?

A quick Google search of “did the Romans have a stock market” produces contradictory results.

The first search result is an abstract from Prof. Pellegrino Manfra, a professor at City University New York: “Ancient Rome Economy and Investment: The Origins of the Stock Market

The origins of the stock market can be observed as far back as ancient Rome. The earliest example of organized market for equities can be found in the Roman Republic in second century B.C…. Back in Roman times, organizations called ‘Societates Publicanorum’ were formed that offered investments referred to as ‘partes’ or what we now know them as – shares. … The shares were tradable and had fluctuating prices based on the underlying project’s success. …. The place where trading occurred was the forum, near the temple of Castor.”

Prof. Manfra’s conclusion couldn’t be clearer.

However, three search results further down, you find a complete repudiation of his claims.

In 2016, Bocconi University in Milan put out a press release summarising a scientific article that its Prof. Manuela Geranio had published: “Ancient Rome Stock Exchange Is a Myth

Manuela Geranio, in a paper with Geoffrey Poitras, shows that modern claims of the existence of a market for shares of the societates publicanorum in the late Roman Republic are not supported by primary sources … Recent claims of trading in shares (partes) of tax-farming corporations (societates publicanorum) in the late Roman Republic can thus raise some skepticism. ‘Upon closer inspection there is only brief discussion of possible share trading in a few sources that, in turn, depend fundamentally on a debatable interpretation of the commercial and legal context’. The location of the proto-stock-exchange near the temple of Castor is, for example, the fruit of ‘romanticized descriptions’. … The paper … highlights the need for more careful historical and legal analyses before concluding about the existence of its peculiar institutions in ancient times.”…

…The most common source cited as evidence that modern-day share trading took place in ancient Rome is “Publicans and Sinners”, a 1972 book by Ernst Badian, an Austrian-born classical scholar who served as a professor at Harvard University from 1971-1998.

It describes how the Romans used “partes (shares) in public companies” and traded “over the counter” based on a “register” of shareholders that the companies kept. It all sounds like the ancient Romans did have an early version of a stock market!…

…Badian was elected a fellow of the American Academy of Arts and Sciences in 1974, and in 1999 his native Austria awarded him the Cross of Honor for Science and Art. His work may be dated, but it’s impossible to dismiss his writing as that of a crank.

Badian was an early member of a group that critics of this field of study today decry as the “maximalists” who allegedly were too aggressive in interpreting incomplete historical evidence in a favourable way…

…Still, another group of equally serious scientists delivered a broadside to the whole idea that the Romans operated something worthy of comparison with modern-day stock markets.

In 2015, Bocconi University’s Manuela Geranio teamed up with Geoffrey Poitras to publish “Trading of shares in the Societates Publicanorum?

The often repeated modern claim of significant trading in ‘shares of the societates publicanorum’ (partes) during the late Roman Republic cannot be supported using the available ‘primary sources’.”

As the authors go on to argue, previous analysis of this field failed to take into consideration differences in language, nuances in interpreting complex terms, the historical context, and the lack of sufficient amounts of clear evidence.

“Even where elements related to possible share trading can be identified in the primary sources, evidence is often vague or questionable. … To avoid semantic confusions, understanding the commercial and legal context for claims of share trading and other activities involving the societates publicanorum requires definition of important terms – shares, share trading, company, joint-stock company, corporation. Appropriate definition is essential to clarify various claims made in modern sources.”

Geranio and Poitras argue that previous interpretations in this field relied too much on an “artful interpretation” of key terms.

Their conclusion is supported by the widely known notion that the day and age a scientist lives in (as well as their language and culture of origin) can play heavily into the conclusions of research.

4. China is trying to create a national network of cloud computing centers – Andrew Stokols

I’ve written before about the Eastern Data Western Compute 东数西算 project, China’s effort to boost its “computing power” by constructing new data centers in the country’s West…

…The vision of the EDWC is not only about building new “nodes” of data centers; it is also about creating a “national network of computing”, 全国算力一张网 quanguo suanli yizhang wang, so that computing power can be shifted between data centers depending on fluctuations in computing demand and supply in different regions – not unlike the way interconnected or smart power grids move electricity around the grid to wherever demand is greatest. One of the premises of the EDWC is that data centers, which consume large amounts of energy, should be located in areas with ample renewable energy supply and cheaper land and energy than the populated east coast. Networking computing resources across the country can make better use of energy in the West without having to transmit electricity from West to East…

…In a televised interview in 2022, Yu Xiaohui 余晓晖, the President of the influential think tank CAICT, which has played a leading role in developing the EDWC plan, notes that “other countries are doing similar things [to the EDWC] but only within one cloud company, but this is an advantage of China, we are doing a systematic layout.” In other words, what he alludes to is the fact that the EDWC project is a state-led effort to coordinate the data center infrastructure of the entire country, which requires going beyond simply encouraging cloud/telecom providers to invest in their own new cloud computing centers. It also means creating a unified national computing network that would allow for a more dynamic allocation of computing demand across the country.

But to do this requires creating a system that can allocate demand from computing centers of different providers. From a business and data privacy standpoint this seems difficult to do. However, China’s three state-owned telecom operators are the ones playing a significant role in building out the EDWC project…

…In various documents and news announcements there have been references to 调度中心 or “adjustment centers” in each of the data center nodes, which are supposed to function as traffic centers for nationwide computing resources, balancing supply (western data centers) and demand (eastern applications). Such “adjustment centers” allow computing tasks to be dynamically allocated to different data centers, such as allocating workloads between 8 national computing hubs and 10 data center clusters, optimizing latency, and improving energy efficiency by redirecting “non-urgent” data tasks to western renewable-powered hubs during off-peak hours…
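The allocation logic described here – urgent work staying on low-latency eastern hubs, non-urgent work redirected to western hubs with spare renewable-powered capacity – can be illustrated with a toy scheduler. All hub names and capacity figures below are invented for illustration:

```python
# Toy illustration of the "adjustment centre" idea described above:
# urgent jobs stay on low-latency eastern hubs, non-urgent jobs are
# pushed to whichever western hub has the most spare capacity.
# All hub names and numbers are invented.
hubs = {
    "east-hub":   {"region": "east", "free_capacity": 20},
    "west-hub-1": {"region": "west", "free_capacity": 80},
    "west-hub-2": {"region": "west", "free_capacity": 60},
}

def allocate(job_size: int, urgent: bool):
    """Pick a hub: east for urgent work, else the west hub with most headroom."""
    region = "east" if urgent else "west"
    candidates = [(name, h) for name, h in hubs.items()
                  if h["region"] == region and h["free_capacity"] >= job_size]
    if not candidates:
        return None  # no hub in that region has room
    name, hub = max(candidates, key=lambda pair: pair[1]["free_capacity"])
    hub["free_capacity"] -= job_size
    return name

print(allocate(10, urgent=True))    # east-hub
print(allocate(50, urgent=False))   # west-hub-1 (most spare capacity)
```

The real system would of course have to solve the much harder problems the article raises – interconnecting proprietary clouds and moving data, not just numbers in a table – but the scheduling intuition is the same.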

…There are technical challenges to developing the national network. But there are also obvious financial/proprietary hurdles as well, namely how to interconnect cloud networks of separate companies who have their own proprietary businesses and systems. This may be easier to do with the three national operators (China Mobile, China Telecom, and China Unicom) than it is with private cloud operators, which are still the leading cloud platform companies (Alibaba, Tencent, Huawei, Baidu)…

…In the U.S., there are some examples of so-called “multi-cloud” interconnections, such as agreements between cloud operators Microsoft and Oracle that allow clients to connect certain cloud databases stored on different cloud platforms. But China’s ambition to create a national computing network would require a much greater interconnection and coordination to carry out…

…The degree to which China is able to build out the national network of cloud computing will have implications for its digital innovation, particularly in AI. While the U.S. leads China in the number of data centers by a wide margin, China’s system could develop in ways that diverge from the proprietary open-cloud model in the U.S. in which large enterprise cloud platforms dominate the market (AWS, Microsoft Azure, Google Cloud). Whether and to what degree China’s existing cloud providers continue to dominate the market will depend on their ability to innovate and maintain their edge in the face of increasing entry by state-owned cloud providers into cloud markets.

5. AI Horseless Carriages – Pete Koomen

I noticed something interesting the other day: I enjoy using AI to build software more than I enjoy using most AI applications–software built with AI.

When I use AI to build software I feel like I can create almost anything I can imagine very quickly. AI feels like a power tool. It’s a lot of fun.

Many AI apps don’t feel like that. Their AI features feel tacked-on and useless, even counter-productive.

I am beginning to suspect that these apps are the “horseless carriages” of the AI era. They’re bad because they mimic old ways of building software that unnecessarily constrain the AI models they’re built with…

…Up until very recently, if you wanted a computer to do something you had two options for making that happen:

  1. Write a program
  2. Use a program written by someone else

Programming is hard, so most of us choose option 2 most of the time. It’s why I’d rather pay a few dollars for an off-the-shelf app than build it myself, and why big companies would rather pay millions of dollars to Salesforce than build their own CRM.

The modern software industry is built on the assumption that we need developers to act as middlemen between us and computers. They translate our desires into code and abstract it away from us behind simple, one-size-fits-all interfaces we can understand.

The division of labor is clear: developers decide how software behaves in the general case, and users provide input that determines how it behaves in the specific case.

By splitting the prompt into System and User components, we’ve created analogs that map cleanly onto these old world domains. The System Prompt governs how the LLM behaves in the general case and the User Prompt is the input that determines how the LLM behaves in the specific case.

With this framing, it’s only natural to assume that it’s the developer’s job to write the System Prompt and the user’s job to write the User Prompt. That’s how we’ve always built software.
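The split Koomen describes maps onto the message format common to most chat-style LLM APIs. A minimal sketch of both arrangements (the prompts are invented examples, and no real API is called):

```python
# Minimal sketch of the System/User prompt split described above, using the
# two-role message format common to most chat-style LLM APIs. The prompt
# strings are invented examples; no real API is called.
def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Assemble the request an app would send to a chat model."""
    return [
        {"role": "system", "content": system_prompt},  # general-case behaviour
        {"role": "user", "content": user_prompt},      # specific-case input
    ]

# Old world: the developer hard-codes the System Prompt for everyone...
developer_written = "You are a helpful email assistant. Be polite and formal."
# ...Koomen's point is that the user could supply it instead, in their own voice:
user_written = "Write in my voice: terse, friendly, no corporate filler."

msgs = build_messages(user_written, "Reply to Garry: I'll be there at 3pm.")
print(msgs[0]["role"])  # system
```

Nothing about the API forces the developer to own the system role; that division of labour is a design choice carried over from the old world.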

But in Gmail’s case, this AI assistant is supposed to represent me. These are my emails and I want them written in my voice, not the one-size-fits-all voice designed by a committee of Google product managers and lawyers.

In the old world I’d have to accept the one-size-fits-all version because the only alternative was to write my own program, and writing programs is hard.

In the new world I don’t need a middleman to tell a computer what to do anymore. I just need to be able to write my own System Prompt, and writing System Prompts is easy!…

…In most AI apps, System Prompts should be written and maintained by users, not software developers or even domain experts hired by developers.

Most AI apps should be agent builders, not agents…

…AI-native software should maximize a user’s leverage in a specific domain. An AI-native email client should minimize the time I have to spend on email. AI-native accounting software should minimize the time an accountant spends keeping the books.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google Cloud), Amazon (parent of AWS), Microsoft (parent of Azure), Salesforce, and Tencent. Holdings are subject to change at any time.