Peter Lynch’s Wisdom

A rare public appearance from an investing legend.

Earlier this month, Peter Lynch was interviewed by Josh Brown, CEO of the US-based Ritholtz Wealth Management. Lynch is one of the all-time greats in the investing world. During his tenure as portfolio manager of the Fidelity Magellan Fund from 1977 to 1990, he produced an annualised return of 29%, nearly double that of the S&P 500 over the same period.

Lynch wrote a number of books in the late 1980s and early 1990s (see here and here) in which he generously shared his investing philosophy and techniques. But as far as I know, he has rarely given interviews since he retired as the Magellan Fund’s portfolio manager in 1990. So, when Brown’s interview of Lynch popped up on my radar, I took notes that I want to share. What’s shown between the two horizontal lines below in italics are my favourite parts of their conversation.


1. The dotcom bubble in the late 1990s saw nonsensical companies come public

Brown: I wanted to ask you if there were ever moments where you looked at something that was happening in the market, whether it was a bull market or a bear market, and said to yourself, “If I were at Magellan, I know exactly what I would be doing right now with this opportunity.” Have you had those moments?

Lynch: Yeah, I did have that moment when Pets.com came public. I said, “What? This makes no sense at all.” And then it went up. I can’t short. But there were so many companies of no value. Fortunately, Fidelity didn’t own those damn things. That was a period to say, “Wow, what’s wrong here?”

2. It’s really important for investors to know the businesses of the stocks they own, because the average stock swings up and down by 100% a year and it would be easy to be scared out of them without knowing the business

Lynch: The average range for a stock on the New York Stock Exchange, the average high, average low, every year is 100%. The stock might start at $20, rise to $28, fall to $14, and finish at $20. There’s a 100% move.

Brown: It’s 50% up, 50% down-ish. And that’s a 100% swing in the price.

Lynch: That’s the average stock. Most stocks you’re going to buy, they’re probably going to go down. If the story is powerful, like Watsco or Chrysler, you might buy it up. If you don’t know what they do and it goes down… I’ve had people say, “This stock’s gone from $50 to $1. How much can I lose?” And I say, “Wait a second. If somebody put $10,000 in at $50 and you put $50,000 in at $1, if it goes to $0, who loses the most?” Stocks go to $0. I’ve had them. I wasn’t buying them on the way to $0, but stocks go down. If you don’t understand what they do, if you can’t explain to an 11-year-old in a minute or less why you own it – not “this sucker is going up”, I’ve heard that one before. What’s the story of this company? “They have a good business, a good balance sheet, they’re fine. That’s why I own it.” If you can’t do that, you should buy a fund.
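Lynch’s arithmetic is easy to verify. A quick sketch, using his illustrative numbers, of why averaging down with a larger sum risks a far bigger dollar loss, and of how a $14-to-$28 range is a 100% move:

```python
# Lynch's illustrative numbers: a stock falls from $50 to $1, then to $0.
early_stake = 10_000  # bought at $50
late_stake = 50_000   # bought at $1, thinking "how much can I lose?"

# If the stock goes to $0, each investor loses the entire stake,
# so the late buyer loses five times as much in dollar terms.
print(late_stake / early_stake)  # 5.0

# His "average stock" point: a high of $28 against a low of $14
# is a 100% move when measured from the low.
high, low = 28, 14
print((high - low) / low)  # 1.0, i.e. 100%
```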

3. Investors should not put money in the stock market if they need the money in the next few years

Lynch: The point is, if somebody has three children about to start college in two years, they shouldn’t be in the stock market. They should be in a money market fund. But if you’ve got your house and paid down your mortgage, then you can invest, and it’s been a great place to be since 1900.

4. Economic forecasts are not useful – current economic facts are

Brown: One of the more timeless things that you’ve said, and it comes off as sarcastic, but I think the last 15 years have really proven the value of this idea. Coming out of the great financial crisis, the most in-vogue style of investing was macroeconomic hedge funds, because there were a small handful of people who determined that the housing crisis would ultimately bring about a recession, and those people were revered for a couple of years. You’ve never really been big on trying to outguess everyone else on the economy. You said, “If you spend more than 13 minutes analyzing economic and market forecasts, you’ve wasted 10 minutes.” I still quote you to this day when clients call up and they want to talk about the latest labor report or what the Fed’s going to do. Tell us, how long did it take you to figure that out, and how much pushback did you get when you said it from people who were economists or focused on the macro?

Lynch: I don’t remember if Fidelity ever had an economist. We just buy stocks…

Brown: She’s here tonight.

Lynch: Okay. So, I’d love to get next year’s Wall Street Journal. I’d pay at least $5 for next year’s Wall Street Journal. And hats off to the people who did The Big Short. I had no idea how bad the housing market was, how bad people had second mortgages, they had home improvement loans, they were underwater in their house. I had no idea. Hats off to them. But I look at facts, like what’s happened to debt, credit card debt, you can get that now. What’s happened to savings rate? What’s happened to employment? I’d love to know what’s happening in the future. I’ve been hoping I could get that in the last 81 years. It’s not available. So I just deal with what’s now. What’s happened to used car prices? What’s happened to the price of oil? And you look at industries that have gone from miserable to getting better, like Chrysler. I remember people saying, “You were really good on that show but how could you possibly recommend Chrysler? It’s going bankrupt.” They had $2 billion in cash and they had enough money for the next three years. They weren’t going bankrupt. I think the best stocks I had, I think if 100 people did work on it, 99 would say that’s better than I expected. I used this for one of our great fund managers, Joel Tillinghast. I wrote a foreword to his book and I always said, “The person that turns the most rocks wins the game.” I said, “Joel Tillinghast is a great geologist.” Because if you look at 10 stocks, you probably find one that’s mispriced. Look at 20, you’ll find two. Look at 40, you’ll find four. And that’s what we’ve been doing at Fidelity. We look at everything.

Brown: So, you’re not discounting the value of economic data. You’re saying if it’s not from the future, the market already understands this.

Lynch: I mean I just want to know facts right now.

5. The hallmark of a great investor is the ability to change one’s mind

Lynch: So I pick up the phone. “This is Warren Buffett from Omaha, Nebraska. My annual report’s due in two weeks. I love a quote. Can I use it?” This is all in about three seconds. “What’s the quote?” He says, “Getting rid of your winners and holding the losers is like watering the weeds and cutting the flowers.” I said, “It’s yours.” He said, “If you don’t come to Omaha and see me, your name will be mud in Nebraska.”

Brown: Did you do it?

Lynch: Oh, yeah. Many times. He’s the best. 

Brown: You built a relationship with Warren.

Lynch: Played bridge together. He’s the best. Imagine, he bought Apple like eight years after that iPod story and made five-fold. And he had a huge position in IBM, it was going down. He says “I love stocks going down. I think IBM’s great.” He totally reversed. He got the hell out of IBM. He’s the best.

6. Investors don’t need to be chasing the fad-of-the-moment

Brown: I wanted to ask you about the modern stock market, specifically the AI boom that has, for the last three years, arguably been the biggest driving force behind earnings growth, revenue growth, and excitement about stocks. What do you think about it when you watch it, and how involved are you with AI stocks with your own money right now?

Lynch: I have zero AI stocks. I literally couldn’t pronounce NVIDIA until about eight months ago. But we have people that are very tech. I am the lowest tech guy ever. My wife is mechanical, my daughter’s a mechanical engineer, I can’t do anything with computers. I just have yellow pads and a phone.

Brown: From your position as a third party to this, do you think investors have chased these ideas too far? Are there echoes of the 1999, 2000 era to you when you look at it, or are you open-minded about it and you say “Maybe this is not going to end as badly as that instance did?”

Lynch: I have no idea. Don’t have any. I have a lot of stocks I like, but not in that category.

7. The US economy has learnt many lessons over the course of decades and has built multiple buffers against crises, so the probability of another massive economic crash is lower today than it was decades ago

Lynch: Yeah. So, we’ve had an incredible bull market since ‘82. We’ve had 10 or 12 declines, maybe a few more. So, people today, they’re not used to… 

Everybody I knew grew up being warned that the big one’s coming. We’ve had 11 recessions since World War II. We’ve never had a big one. Imagine, in the Depression, we didn’t have social security. There wasn’t social security. What a critical invention. People, when they retired, they got older, they moved in with their family. The family had to cut back on their spending. We didn’t have unemployment compensation. We didn’t have the SEC. The SEC did not exist. There’s so many things that are better. And we had a Federal Reserve that was asleep, to boot. In 1929, no one jumped out of windows.

Brown: That was fabricated, you said.

Lynch: 1% of Americans owned stocks in 1929.

Brown: I don’t think a lot of people understand that. The losses were very contained to a small group of people.

Lynch: But we had an incredible depression. 30% of people out of work, not enough food, terrible farming environment. It was awful and people went through that. I’ve read stories about it. It was grim.

Brown: You think we have evolved the economy and the markets to the point where it would be very difficult to repeat the “Big One”.

Lynch: We’ve had 11 tests, 11 recessions since, and none was ever worse than a 5% or 6% decline in GDP. There’s a lot of cushions now. 63% of Americans own their house. That was not true in the 1920s. People have IRAs that, if they’re with Fidelity, they’re not going to panic. People are careful with their savings. The GI Bill allowed people to buy houses with 5% down and created a lot of people with wealth. Most wealth in America is in their house. That was not true in the ’20s. People were renting, rent went up. There’s so many buffers now. It’s incredible how many positives there are. We had a lot of tests. We had many opportunities to have a big one. We’ve had some probably bad presidents, some bad congresses, we’ve had bad economists, and we’ve made it through. It’s a pretty good system.

Brown: I like that message for people who are overdosing on Great Depression content on their social media feeds and constantly being fed that as a realistic possibility.

8. AI may take away some jobs in the US economy, but it’s not taking away the ingenuity of the country’s entrepreneurs, and that has been, and will be, the key driver of the country’s growth

Brown: From your point of view, the people displaced by AI and other innovations to come in the future, they’ll be doing something else. It’s unlikely they’ll be sitting there saying, “I wish I still had my job that AI took away.”

Lynch: I think, more importantly, there’s one job that’s going to go away. These are good-paying jobs. The people that drive a truck, a tractor-trailer, from a manufacturing firm to a distribution center on highways, not through Beacon Hill, they go back that night. That should be automated.

Brown: And likely will be, you would say?

Lynch: I would say in 20 years, we’ll lose 500,000 jobs. And safety will be better, costs go down. That’s more important to me than AI. Those are people, working hard. They don’t need a… 

Brown: Sorry, automation is going to have a bigger impact than AI, you’re saying?

Lynch: Automation has been incredible the last 50 years. We’ve gone from 100 million jobs to 153 million, and Eastman Kodak’s gone down, [indecipherable] gone down. Sears has gone away. All the growth is new companies and companies with 100 to 200 employees or less. The largest 500 companies have fewer employees than they did 50 years ago. All the growth in this country is entrepreneurs starting a little shop, starting something else. That makes our country great.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

More Of The Latest Thoughts From American Technology Companies On AI (2025 Q2)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2025 Q2 earnings season.

Last month, I published The Latest Thoughts From American Technology Companies On AI (2025 Q2). In it, I shared commentary from earnings conference calls for the second quarter of 2025, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large.

A few more technology companies I’m watching hosted earnings conference calls for 2025’s second quarter after I prepared the article. The leaders of these companies also had insights on AI that I think would be useful to share. This is an ongoing series. For the older commentary:

Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management is infusing AI across Adobe’s flagship Creative Cloud applications; there is strong adoption of the Creative Cloud Pro offering, which includes Firefly; recent new AI features in Creative Cloud applications include (1) Harmonize in Photoshop that blends composited objects with the image, and (2) Project Turntable, which rotates 2D artwork to accurately visualize different angles; Creative Cloud had strong new user acquisition, particularly in emerging markets; management will soon be unveiling new AI innovations within Adobe’s Creative Cloud applications

We’re infusing AI across our flagship Creative Cloud applications including Photoshop, Illustrator, Premiere Pro and After Effects and delivering new offerings for next generation creators with Adobe Firefly across web and mobile…

…We’re seeing strong adoption of the Creative Cloud Pro offering which includes Firefly, reflecting the value professionals see in having AI integrated with power and precision creative tools…

… Recent examples include the addition of a new Harmonize feature in Photoshop that blends composited objects with the image by automatically adjusting lighting, colors and shadows. Harmonize has quickly become one of the most used features in Photoshop. We released Project Turntable, a popular sneak from MAX last year, into Illustrator, which helps users rotate their 2D artwork to accurately visualize different angles, eliminating a frequent and time-consuming task. Innovations like these directly translate into measurable value for customers by cutting production times, enabling more content output, and raising the overall quality of creative work and have driven strong migration to our new Creative Cloud Pro offer…

…Continued new user acquisition of Creative Cloud with particular strength in emerging markets like India which grew ending units 50 percent year over year…

…We’re excited to welcome our community at Adobe MAX next month. We’ll showcase incredible innovations that highlight amazing productivity features in our flagship Creative Cloud applications, breakthrough AI capabilities leveraging Firefly and third-party models, new agentic experiences for conversational editing, and significant strides in content production automation for enterprises.

Adobe’s management is making the new Firefly application the single destination for creators’ workflows; the Firefly application includes Adobe’s own AI models as well as 3rd-party models; there is strong adoption of the standalone Firefly subscription; the 3rd-party models in Firefly include Google’s  Gemini, Veo, and Imagen models, along with models from OpenAI and more; new capabilities for Firefly that were added in 2025 Q2 (FY2025 Q3) include avatar generation and sound effects generation; Firefly Services are agentic services that use custom models to automate and personalize image, video, and 3D content for many types of use cases; management recently delivered a no-code interface for Firefly Services; usage of Firefly Services and Custom Models grew 32% and 68% sequentially in 2025 Q2 (FY2025 Q3); the Firefly App for mobile has been downloaded millions of times since launch; the Firefly App’s MAU (monthly active users) was up 30% sequentially in 2025 Q2 (FY2025 Q3); first-time subscribers to Adobe from Firefly app was up 20% sequentially in 2025 Q2 (FY2025 Q3); Firefly has powered 29 billion generations (24 billion in 2025 Q1) since its launch in March 2023, with video generations up 40% sequentially in 2025 Q2 (FY2025 Q3); Nano Banana from Google was integrated with Firefly on the day it was released and the integration of Nano Banana led to a better product than the standalone Nano Banana; management sees the real strength of Adobe in the company’s ability to deeply integrate 3rd-party generative AI models into the workflows of the company’s existing applications; the integration of 3rd-party models into Adobe’s applications is not a trivial project; the majority of AI credit usage in Adobe is being used on the company’s Firefly models, but 3rd-party models are seeing a nice uptick in usage, and management is happy with the current mix

We’re delivering an end-to-end, ideation-to-creation solution in the new Firefly application to make it the single destination for creators’ workflows. It includes our own first-party, commercially safe models and leading third-party models. We are seeing strong adoption of the standalone Firefly subscription offering. We recently added Google Gemini Flash 2.5 alongside Google’s Veo and Imagen models to the roster of partner models from OpenAI, Black Forest Labs, Runway, Pika, Ideogram and others. In the rapidly evolving AI landscape, where each generative AI model has its own aesthetic style, we’re offering customers choice and flexibility to use the right model within Adobe applications, without the friction of switching between workflows and platforms…

…The Firefly app is a powerful, yet accessible AI production studio that helps creators deliver original content faster than ever before. In Q3, we added a slew of new capabilities, including avatar generation, sound effects generation and updates to the growing list of integrated generative models…

…We are delivering incredibly powerful automated content production capabilities through Firefly Services to enterprises of all sizes and across all verticals. These agentic services leverage Custom Models to automate and personalize image, video and 3D content for marketing campaigns, ad creation and postproduction video work, all while maintaining brand consistency. Additionally, we delivered a no-code interface that extends the power of Firefly Services to studio and design teams. Firefly Services are available through GenStudio as well as to individuals through Firefly app subscriptions. Consumption of Firefly Services and Custom Models grew 32 percent and 68 percent quarter over quarter, respectively…

… Millions of downloads of the Firefly App for mobile since launch; Firefly app MAU grew 30 percent quarter over quarter; Firefly app continues to attract next gen creators, with first time Adobe subscribers through the app growing 20 percent quarter over quarter; Generative AI consumption accelerated, with 29 billion generations, and video generations growing nearly 40 percent quarter over quarter…

…[Question] on sort of the demo that you guys gave on that video at the beginning. Really, again, highlighting the Adobe magic with kind of what you’re doing with Nano Banana, and — being able to manipulate images like that.

…Regarding choice, we want to make sure that all third-party models are available, you saw our announcement with Google and Nano Banana, OpenAI, Flux, Runway, Luma, Ideogram, the list continues to grow. And you call out the example of Nano Banana. We actually launched Nano Banana in the first — on the day that it was released as part of the Firefly application, and now we’re integrating it into Creative Cloud Pro. So the core of the choice of whatever model has the most interesting thing for the thing you want to do, you know you can turn to Adobe, and it will be there.

The second part is the integration as you talked about, right? We have a lot of workflows that we have — that we pulled into the model. You noticed that in the demo you saw, and all the demos that are out there people are using Nano Banana with Photoshop. They’re doing it in a way that they’re blending the precision and the control you get with Photoshop and combining it with the generative capabilities of Nano Banana…

… The magic is clearly in our applications because we can take all of the models that exist and integrate that within our interface. And that’s a completely nontrivial task of what we have done to build. That was actually the rationale for building Firefly because we understand whether they’re diffusion or transformer models better than I think anybody can in the Creative Application. So I wouldn’t underestimate the amount of magic that we have to make it look as seamless as it has…

…[Question] On the mix of AI credit usage between your own Firefly-based solutions and third party, whether you’re seeing any pickup from the third-party models and how users are responding?

[Answer] The majority of generation continues to be Firefly given the commercial safety and the underpinnings of what that is. But we are seeing a nice uptick in usage of the other models. Especially for things like ideation and sort of edit capabilities that are integrated into Firefly. So that mix feels right to us, and we’re going to continue to optimize and drive that discovery in our applications going forward.

Adobe’s management sees Adobe GenStudio as the most comprehensive solution for AI-driven marketing automation; Adobe GenStudio now exceeds $1 billion in ARR, growing 25% year-on-year; there is accelerating adoption and usage of Adobe GenStudio; new capabilities in Adobe GenStudio for Performance Marketing are accelerating video and display ad campaign creation; marketers can produce engaging short-form video ads in Adobe GenStudio with commercially-safe Firefly models; management recently released new capabilities for display ad campaigns for Adobe GenStudio, including on-brand image generation with Firefly

Adobe GenStudio is the most comprehensive solution that brings together workflow and planning, creation and production, asset management, activation and delivery and reporting and insights to enable marketing automation with AI in the enterprise. Our Workfront, Frame, AEM Assets, Firefly Services, and GenStudio for Performance Marketing products – which are key components of the integrated GenStudio solution – now exceed $1 billion in ARR growing over 25 percent year over year…

…We’re seeing accelerating adoption and usage of Adobe GenStudio, the most comprehensive content supply chain solution, as enterprises drive content velocity with AI. New capabilities in Adobe GenStudio for Performance Marketing are accelerating video and display ad campaign creation. Marketers will be able to produce engaging short-form video ads using the commercially safe Firefly Video Model. We released new capabilities for display ad campaigns, including on-brand image generation with Firefly, as well as offerings with Amazon Ads, Google Campaign Manager 360, LinkedIn and Meta to power seamless campaign workflows.

70% of eligible AEP (Adobe Experience Platform) customers are using AEP AI Assistant; management sees AI becoming the new UI (user interface) for brand discovery by consumers; management thinks brands must deliver hyperpersonalized, immersive experiences on owned channels to drive engagement and loyalty, and this is where Adobe shines; management sees new marketing needs such as LLM (large language model) optimization and LLM advertising as massive opportunities for Adobe; management is infusing agentic capabilities into Adobe Experience Manager; management saw LLM traffic grow 4,700% year-on-year in July 2025; management thinks Adobe has AI-first and AI-infused solutions that can orchestrate the customer experience in the era of agentic AI; AEP has agentic capabilities and management launched the 1st phase of the AEP Agent Orchestrator in 2025 Q2 (FY2025 Q3), so that users can build, manage and orchestrate AI agents from Adobe and 3rd parties; Adobe LLM Optimizer is currently available in early access and will be generally available later in 2025 Q3 (FY2025 Q4); Adobe LLM Optimizer helps brands shape how they show up in LLM results; subscription revenue for AEP and native apps was up 40% year-on-year in 2025 Q2 (FY2025 Q3)

Customers are leveraging the rich data and customer knowledge in Adobe Experience Platform to enable agentic workflows to scale the capabilities of Adobe’s category-leading customer experience orchestration applications. We’re seeing continued adoption and momentum for Adobe Experience Platform (AEP) AI Assistant with 70 percent of eligible AEP customers leveraging this functionality. 

As AI transforms consumer behavior, it’s reinventing marketing and customer experience. Brand discovery is shifting from primarily search to include generative engine optimization. AI becomes the new UI, guided by conversations rather than menu clicks. Brands must deliver hyperpersonalized, immersive experiences on owned channels to drive engagement and loyalty. In this new reality, Adobe uniquely offers an integrated customer experience platform that delivers automation, agility and scale.

The explosion of content creation and automation in the enterprise and the beginning of new marketing needs such as LLM optimization and LLM advertising are a massive opportunity for Adobe. We’re infusing AI into Adobe Experience Manager with our upcoming LLM Optimizer release, a powerful agentic app to improve brand visibility, drive acquisition and maintain engagement with customers across LLM platforms…

…Our most recent Adobe Digital Index data, which is based on online transactions across over 1 trillion visits to U.S. retail sites, shows that LLM traffic grew 4,700 percent year over year in July 2025…

…Our AI-first and AI-infused solutions spanning GenStudio for content supply chain; AEP and Apps for customer engagement and loyalty; and Adobe Experience Manager and LLM Optimizer for brand visibility and discovery, enable us to power customer experience orchestration in the era of agentic AI… 

…We are innovating on our leading AEP marketing and customer experience platform with built-in agentic functionality, empowering marketers to deliver digital experiences with greater agility and efficiency. Our intelligent agents understand intent, reason and recommend actions to drive outcomes across content, data, and journeys. Purpose-built agents are embedded in our core apps and new AI-first applications, helping brands unlock greater efficiency and precision, automate workflows and personalize experiences at scale. We launched the first phase of AEP Agent Orchestrator in Q3, empowering businesses to build, manage and orchestrate AI agents from Adobe and third parties. These capabilities power the Data Insights Agent and Product Support Agent, which are generally available now and add to our growing portfolio of agents.

Our newest innovation is Adobe LLM Optimizer, available in early access. As customers and prospects increasingly turn to generative AI search and assistants for brand discovery, LLM Optimizer helps shape how brands show up in results which is driving influence, visibility and qualified traffic…

…Strong demand for AEP and native apps with Q3 subscription revenue growing over 40 percent year over year…

…We are excited that the product will be generally available later this quarter.

Adobe’s management recently launched Acrobat Studio, which combines Acrobat and Express; Acrobat Studio has PDF Spaces, which uses AI Assistant to derive insights for users from a collection of PDFs and other content; combined monthly active users of Acrobat and Express are up 25% year-on-year; Acrobat AI Assistant brings a new conversational interface to PDF-consumption; management is seeing accelerated use of AI Assistant across desktop, web and mobile; users can easily create AI agents in PDF Spaces to perform document tasks on their behalf; Acrobat Studio has encouraging early adoption and usage trends; there is rapid adoption of Adobe Express; dentsu is using Adobe Express for its global marketing strategy across its 68,000 employees worldwide, and is seeing measurable impact; there was 40% sequential growth in units for Acrobat AI Assistant in 2025 Q2 (FY2025 Q3), and 50% sequential growth in conversations and summarisations; 14,000 organisations added Adobe Express in 2025 Q2 (FY2025 Q3), up 4x from a year ago; Express usage in Acrobat doubled sequentially; 

We’re integrating creativity with productivity for billions of users with the recent launch of Acrobat Studio, which brings together Acrobat and Express…

… The new Acrobat Studio includes PDF Spaces, which transforms collections of PDFs, web pages and other files into dynamic knowledge hubs that help people work smarter and faster using AI Assistant to derive insights. We’re seeing steady growth across our family of Acrobat and Express products with combined monthly active users growing approximately 25 percent year over year…

…The introduction of Acrobat AI Assistant brought a new conversational interface that enhances the experience of customers consuming PDFs. This unlocks increased comprehension across the trillions of PDFs in the world. We continue to see accelerated use of AI Assistant across desktop, web and mobile…

…Users can leverage PDF Spaces to organize documents and links, discover insights faster through conversational experiences and enable editing and remixing of PDF content into new formats like emails and presentations. I’m particularly excited that anyone can easily create agents to perform document tasks on their behalf. Customers can use PDF Spaces with team members for more impactful knowledge sharing and collaboration. The combination of PDF Spaces, AI Assistant and an integrated Express experience is available through Acrobat Studio, a new, premium offer in our Acrobat line-up. Early reception of Acrobat Studio has been strong, with encouraging adoption and usage trends that highlight the significant customer demand and opportunity ahead…

…We’re seeing rapid adoption of Adobe Express. In the enterprise, Express is helping organizations scale content creation while maintaining brand consistency and quality. A great example is dentsu, which has made Express a core part of its global marketing strategy. Adobe’s platform is being rolled out to all 68,000 employees worldwide and scaled across brands including Carat, iProspect, dentsu X, Dentsu Creative, Tag and Merkle. By enabling creative teams to build content in Creative Cloud and share that content through Express within an overall GenStudio solution, dentsu ensures brand alignment across global teams while empowering marketers to create and remix their own content. This is driving measurable impact at dentsu…

…Ending units for Acrobat AI Assistant grew more than 40 percent quarter over quarter, and AI Assistant engagement, with conversations and summarizations, grew nearly 50 percent quarter over quarter…

…Over 14,000 organizations added Express in Q3 alone, a 4x increase in the quarter versus a year ago; Express usage within Acrobat nearly doubled quarter over quarter.

Adobe’s AI-influenced ARR is now more than $5 billion (was in the “billions” in 2025 Q1); management expects AI-influenced ARR to continue to rise as a percent of Adobe’s business; Adobe’s AI-first products have already achieved management’s target of $250 million in ending ARR by end-FY2025

Our AI-influenced ARR has now surpassed $5 billion, up from over $3.5 billion exiting fiscal year 2024, and we have already surpassed our full year AI-first ending ARR target…

…Adobe AI influenced ARR surpassed $5 billion and we expect it to continue to rise as a percent of our business. Notably, ARR from our new AI-first products, including Firefly, Acrobat AI Assistant, and GenStudio for Performance Marketing, has already achieved our end-of-year target of over $250 million.

Adobe’s management thinks that larger advertisers will still prefer to retain control over their advertising campaigns, and not hand nearly all or total control over to digital advertising platforms such as Google and Meta Platforms that are providing near- or fully-automated AI-powered solutions; management sees the large digital advertising platforms as being excited to be supported by Adobe’s performance-marketing solutions

As it relates to how people are going to create and run campaigns and ad placements in all of these different platforms, I think you’re going to see some smaller and medium businesses use it. For all of the larger companies, what we continue to hear in the enterprises is that they want the ability to create campaigns, run them across multiple channels, see the attribution, as well as see what we can do in terms of the analysis.

But in addition to that, all those advertising channels that you talked about are really excited about Adobe making it seamless, which is why you’ve seen in GenStudio for Performance Marketing the support for third-party channels, whether that’s TikTok, Meta, Google, Amazon; all of that, we’re just going to continue to do.

Adobe’s management created LLM Optimizer after realising that Adobe has a lot of content that matches the questions users were asking AI chatbots regarding PDFs; management thinks LLM Optimizer is a great opportunity for Adobe to drive traffic to itself from AI chatbots, and for other companies to drive traffic to their properties

I was actually working internally with our team, our adobe.com team, which obviously runs a big digital business. That’s how we got going on the LLM Optimizer. We noticed that in terms of some of the traffic, it’s not only the search traffic, but a lot of our customers and prospects were starting to ask questions within ChatGPT and Perplexity and so on. How do I edit this PDF? I have a large PDF, how do I compress it? Those kinds of questions. And we realized that we had a lot of content available that, if we made it available through the right channels, would get picked up by the LLMs, and that would give our Acrobat brand a lot more visibility through the LLMs. So that’s how the idea for the product came about…

…I noticed in a lot of the preview reports folks look at web traffic, and it’s coming from different sources. That’s a really new movement. And so beyond just search traffic and what was happening in search, you really have to start to factor in (and we’re, I think, one of the leaders in that space) how to really take advantage of what’s happening, not just across search but also what happens across social and now, increasingly, what happens across LLMs. So as Anil mentioned, this is not just an opportunity for us to use ourselves but, I think, a massive opportunity for us to help every single company deal with this new reality.

Adobe’s management is seeing a new movement of web traffic coming from AI chatbots; management thinks consumers will adopt LLMs for the entire process involving e-commerce transactions

I noticed in a lot of the preview reports folks look at web traffic, and it’s coming from different sources. That’s a really new movement…

…With the LLM, the new LLMs, the discovery to actual consideration, to purchase, maybe even the post purchase, that entire funnel is starting to consolidate and you’re going to be seeing consumers actually adopt LLMs for the entire process.

Even in the AI era, management thinks Creative Cloud has growth opportunities with seat expansion

[Question] There’s a thesis out there for software in general that AI is a headwind to seats, and that seats will need to shift to consumption; the issue then is whether you can capture more consumption revenue than seat revenue. How do you think about the relationship between seats and consumption in Creative Cloud?

[Answer] On Creative Cloud specifically, we definitely view this as both seat expansion as well as marketing automation. And that’s part of the reason, as you know, for this customer grouping that we talk about, which is Creative Professionals and Marketing Professionals. And in the enterprise, that’s playing out exactly that way: it is actually still continuing to play out with seat expansion in the enterprise.

Adobe has continued to post healthy margins despite investing in AI capabilities because management has put a lot of effort into controlling training and inferencing costs, and using AI to drive internal productivity

[Question] You’ve been speaking to mid-40s margin profile, you’re still operating a bit above that this quarter. It looks like gross margins are actually up a touch versus last year. Why aren’t you seeing degradation from AI adoption, given some of the metrics you’re providing?

[Answer] I think there are 2 vectors of productivity that the company is driving to underpin margin delivery. The first one is how we drive GPU training fleets to support training: the utilization, the algorithms we use to efficiently get at model construction, as well as continually loading that GPU fleet to make sure there’s high utilization over time. The second piece is inferencing: constantly tuning the algorithms and the cost per inference. We watch this maniacally, how we fill these fleets of GPUs, to make sure that with reserved instances, which come in at very different price points than on-demand, we constantly balance and optimize the cost structure that underpins the usage of that compute power. And then, obviously, from an internal working standpoint: adoption of these technologies, how we drive productivity gains in the company, how we augment individual employees from a productivity standpoint, as well as ways of working inside of the company, to continue to drive more and more productivity out of the world’s best employees.

Adobe’s management is seeing users of Adobe’s AI solutions having better retention

The thing that we have seen is a direct correlation between increased use of AI and retention, and we feel very good about that.

Adyen (OTC: ADYEY)

Adyen had been applying AI on payments well before AI became a hot topic; Adyen’s AI-powered Adyen Uplift technology, launched in 2024, improves conversion, strengthens fraud prevention, and reduces payment costs; Adyen Uplift has a full-funnel approach that is superior to legacy systems’ approach; Adyen Uplift uses Adyen’s access to trillions of dollars in global transaction data from 1 billion shoppers to provide the necessary recommendations; Adyen Uplift is modular in design and has 4 components, Optimize, Protect, Tokenize, and Authenticate; Optimize uses Adyen’s IPR (Intelligent Payment Routing) to maximise payment authorisations and reduce transaction costs; each component of Adyen Uplift can be used separately, but work best when used together; merchants have control to test and adjust performance settings within Adyen Uplift; nearly all users of Adyen Uplift are using Optimize, while 68% of Adyen Uplift users are using Protect; in markets such as Australia and the USA where debit cards can route payments through global or domestic networks, IPR uses machine learning to analyze real-time signals to determine the optimal route for each transaction because domestic networks can offer lower fees, but at lower performance; IPR can reduce cost while maintaining or even improving approval rates; adoption of IPR was up 8x in 2025 H1 compared to 2024 H2; US customers of IPR saw average cost reduction of 20% on debit transactions and 89 basis point improvement in authorisations; Australian customers of IPR generated average cost savings of 47%; Adyen Uplift is fully embraced across the Digital pillar; Adyen is only partly charging for Adyen Uplift’s services currently

We’ve been applying machine learning to optimize payment flows well before AI rose to the top of the industry agenda…

…Adyen Uplift was developed around three recurring needs: improving conversion, strengthening fraud prevention, and reducing payment costs…

…While legacy systems often address these issues in isolation, Adyen Uplift takes a full-funnel approach. It uses risk-based intelligence and automation to optimize decisions across the entire payment flow. With access to trillions of dollars in global transaction data from over a billion shoppers across online and in-store channels, we can detect high-risk behavior and reliably recognize trusted shoppers. This combination provides the depth of insight needed to deliver tailored recommendations that customers can test and validate in real-time…

…Adyen Uplift is modular by design so enterprise customers can adopt the capabilities most relevant to their business. Optimize is the decision engine that maximizes payment authorizations and reduces transaction costs. It uses IPR to find the optimal balance between conversion and cost for any transaction with multiple route possibilities. Protect delivers advanced fraud detection, while Tokenize ensures payment credentials remain valid and secure. Authenticate helps businesses meet local compliance requirements without adding unnecessary friction to the shopper experience. Each module can stand alone, but the product suite delivers the most value when its components work together. What seems optimal at one step of the payments flow often isn’t when viewed in full context…

…Merchants now have more control to test and adjust performance settings dynamically. Each recommendation includes clear activation instructions, the ability to test before adoption, and a projected outcome, helping them to assess potential impact and move with confidence. Examples include enabling a local payment method, fine-tuning authentication logic, or activating IPR for US debit payments…

…Optimize is available to all customers, with nearly all utilizing the module. Additionally, 68% of enterprise merchants in our 2025 cohort have adopted the Protect module from day one…

…Intelligent Payment Routing (IPR) within Adyen Uplift is a prime example. This product dynamically selects the optimal route for each transaction based on conversion and cost. We invested in direct connections with local debit networks early on. This enabled us to build a solution that not only ensures compliance but consistently enhances performance. In markets like the U.S. and Australia, dual-branded debit cards can be routed through either global or domestic networks. While local rails often offer lower fees, their performance can vary. IPR uses machine learning to analyze real-time signals, such as scheme performance, issuer behavior, and cost structures, to determine the optimal route for each transaction. The result is a product that reduces cost while maintaining or even improving approval rates.

Adoption grew 8x in H1 2025 compared to the pilot group announced in H2 2024, with major U.S. brands such as Adobe, Microsoft, 24 Hour Fitness, and Indeed using the solution. In the U.S., customers saw an average cost reduction of 20% on debit transactions and a +89 basis point improvement in authorization rates. In Australia, the launch of local routing over Eftpos supported 55 merchants, with average cost savings of 47%…
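The cost-versus-approval trade-off that IPR weighs can be sketched in a toy example. This is my own illustration with made-up numbers, not Adyen’s actual model, which uses machine learning on live signals such as scheme performance and issuer behaviour; the sketch only shows why the cheaper domestic rail can win on small tickets while the higher-approval global network wins on large ones:

```python
# Toy sketch of intelligent payment routing (illustrative only).
# Each eligible network has an estimated approval rate and a fixed
# per-transaction fee; we route to whichever network maximises the
# merchant's expected captured value net of fees.

def route_transaction(amount, networks):
    """Pick the network maximising approval_rate * (amount - fee).

    `networks` maps a network name to (approval_rate, fee); in a real
    system both numbers would come from upstream models, not constants.
    """
    def expected_net(stats):
        approval_rate, fee = stats
        return approval_rate * (amount - fee)

    return max(networks, key=lambda name: expected_net(networks[name]))

networks = {
    "global":   (0.92, 0.50),   # higher approval rate, higher fee
    "domestic": (0.915, 0.10),  # slightly lower approval rate, lower fee
}

# On a small ticket the fee saving dominates; on a large ticket the
# approval-rate edge dominates.
print(route_transaction(50.00, networks))    # -> domestic
print(route_transaction(5000.00, networks))  # -> global
```

The same two networks flip depending on the ticket size, which is the point of routing per transaction rather than per merchant.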

…Adyen Uplift is now fully embraced across Digital, becoming a core part of how customers optimize for performance, reduce cost, and navigate growing complexity…

…Uplift is a product that we launched in the second half of last year. We are partly charging for it. So it depends a bit on the module that you’re exactly using and some of the parts are free. Of course, ultimately, what we’re building for is that we charge for the products that we offer to our customers. So it’s currently a mix.  

Adyen’s management sees significant potential in agentic commerce and thinks Adyen is well-positioned for the shift; management thinks agentic commerce brings new demands, in particular, a new lens for looking at fraud prevention, because traditional signals used in fraud prevention are absent in agentic transactions; management sees Adyen’s tokenization capabilities as being an important enabler of agentic commerce in being able to improve authorization, reduce fraud, and enable intelligent, context-specific execution; management sees Adyen as being at the leading edge of tokenization in the context of enabling agentic commerce; Adyen’s global risk system, built on nearly €1.3 trillion in annual volume, enables consistent fraud detection in agent-initiated flows; Adyen’s MCP (model context protocol) server enables structured agent-to-business communication; management thinks Adyen’s platform will allow whatever emerges from agentic commerce to work seamlessly with existing global payment methods

One area where we see early momentum and significant long-term potential is agentic commerce: the shift from enhanced search to autonomous, agent-led purchasing. While still emerging, the rapid adoption of large language models signals rising interest and underlying demand. We’re well positioned to support this shift, helping merchants and consumers navigate the next chapter of ecommerce.

Agentic commerce brings new demands: secure information exchange, sandboxed payment permissions, dynamic authorization, and real-time context-awareness. Crucially, it requires rethinking fraud prevention. Traditional signals are often absent when agents transact on behalf of users, making it essential to rely on scalable infrastructure and intelligent risk models that operate without direct human input. Our platform is built for this. Our tokenization suite enables secure, seamless credential sharing between agents, merchants, and shoppers. Agents can initiate payments using standardized tokens that improve authorization, reduce fraud, and enable intelligent, context-specific execution. We’re at the forefront of this space, pushing the boundaries of what tokenization can do. Our recent announcement with JCB highlights how we’re advancing global credential security — Adyen is the first to offer their advanced tokenization to reduce fraud and improve authorization.

Our authentication engine supports adaptive trust models, applying the right protocol based on transaction risk, regulation, and issuer logic. Our global risk system, trained on nearly €1.3 trillion in annual volume, adds consistent fraud detection, even in agent-initiated flows, flagging misuse, and maintaining trust at scale. And with our Model Context Protocol (MCP) server, we’re enabling structured agent-to-business communication, equipping AI agents to securely interpret and act on commerce data…

… Our infrastructure ensures that whatever emerges in this space can work seamlessly with the global payment methods, regions, and consumer journeys our customers rely on today, and in the future.
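The tokenization idea underpinning Adyen’s agentic-commerce positioning can be illustrated with a toy sketch. This is my own illustration with hypothetical names, not Adyen’s API: the raw card number stays in a vault, and the agent and merchant only ever hold an opaque token scoped to one merchant:

```python
# Toy sketch of merchant-scoped payment tokenization (hypothetical names,
# not Adyen's API). The raw card number never reaches the agent or the
# merchant; a token stolen from one merchant is useless at another.

import secrets

class TokenVault:
    """Maps tokens to (card, merchant) pairs; only the vault can resolve them."""

    def __init__(self):
        self._by_token = {}

    def tokenize(self, card_number, merchant):
        token = "tok_" + secrets.token_hex(8)
        self._by_token[token] = (card_number, merchant)
        return token

    def resolve(self, token, merchant):
        card, bound_merchant = self._by_token[token]
        # Scoping the token to one merchant limits the fraud blast radius.
        if merchant != bound_merchant:
            raise PermissionError("token not valid for this merchant")
        return card

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111", merchant="acme-store")
# An agent buying on the user's behalf passes only the token, never the card:
assert vault.resolve(token, merchant="acme-store") == "4111 1111 1111 1111"
```

Real network tokens carry far more machinery (cryptograms, lifecycle updates, issuer participation), but the scoping idea is the part that matters for agent-initiated payments.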

MongoDB (NASDAQ: MDB)

MongoDB is adding thousands of AI-native customers

We’re adding thousands of AI native customers.

MongoDB Atlas consumption growth in 2025 Q2 (FY2026 Q2) benefitted from a strong start to consumption in May 2025, as well as broad-based strength; Atlas consumption growth in 2025 Q2 (FY2026 Q2) was consistent with 2024 Q2 (FY2025 Q2); Atlas’s growth has been driven partly by an uptick of capabilities such as Search and Vector Search

We had an impressive Atlas growth quarter, which benefited in part from the strong start to consumption in May that we referenced on our last call as well as broad-based strength, especially in larger customers in the U.S…

…In Q2, Atlas consumption growth was strong and relatively consistent with last year’s growth rates. This drove the acceleration in revenue as well as the growth in absolute revenue dollars year-to-date for the first half of fiscal ’26…

…What we’re also seeing is that there’s a great uptick of some of the other capabilities we offer like search and vector search that are also adding to that growth of those workloads.

Many of MongoDB’s recently-added customers are building AI applications and this bolsters management’s confidence that MongoDB is an important part of the AI infrastructure stack; management sees MongoDB emerging as a standard for AI applications

Many of our recently added customers are building AI applications, underscoring how our value proposition is resonating for AI and why MongoDB is emerging as a key component of the AI infrastructure stack…

…MongoDB is emerging as a standard for AI applications.

MongoDB has integrated capabilities such as search, vector search, embeddings and stream processing into its database product; the integrations mean MongoDB has so much more capabilities than competing databases such as Postgres; management thinks AI startups tend to go with Postgres first because the founders are familiar with Postgres and they do not think carefully about their database choices; what the AI startups often realise after choosing Postgres is they run into scaling challenges and then turn to MongoDB; management wants to do more developer education regarding Postgres versus MongoDB

MongoDB has redefined what’s core for the database by natively including capabilities like search, vector search, embeddings and stream processing. Comparing MongoDB to another database like Postgres is not an apples-to-apples comparison. Take a global e-commerce application that manages inventory and order data while enabling product discovery through sophisticated search across millions of SKUs. The choice for this application is not between MongoDB or Postgres, it is between MongoDB or Postgres plus other offerings like Pinecone, Elastic and Cohere for embeddings…
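The document-versus-relational contrast in that e-commerce example can be sketched schematically. This is my own toy code, not anything from MongoDB; the point is only that a nested order lives in one self-contained document, while a normalised schema spreads the same data across tables that must be re-joined on every read:

```python
# One order as a single JSON-style document: the nested line items live
# with the order and come back in one lookup.
orders = {
    "o1": {
        "customer": "dana",
        "items": [
            {"sku": "A12", "qty": 2, "price": 9.99},
            {"sku": "B34", "qty": 1, "price": 24.50},
        ],
    }
}

def order_total_document(order_id):
    return sum(i["qty"] * i["price"] for i in orders[order_id]["items"])

# The same data normalised into relational-style tables: every read has
# to stitch the rows back together by key.
order_rows = [("o1", "dana")]
item_rows = [
    ("o1", "A12", 2, 9.99),
    ("o1", "B34", 1, 24.50),
]

def order_total_relational(order_id):
    return sum(qty * price for oid, _, qty, price in item_rows if oid == order_id)

assert order_total_document("o1") == order_total_relational("o1")
```

In-memory Python hides the cost, but in a database the relational version is a join (or several) per read, which is the overhead the document model avoids for this access pattern.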

…[Question] Why do we hear so much about Postgres adoption for AI start-ups? You talked about the success you guys are having, but if Postgres has the disadvantages that you’ve talked about multiple times (scalability, JSON support), how come we hear so much about it, at least in the early stages of AI?

[Answer] What’s become clear is that a lot of these startup founders don’t think that hard about their database choice; they kind of go with what they know. And what we are seeing is that as some of these startups are scaling, they’re running into real scaling challenges with Postgres. We’ve talked about this in the past: when you use JSONB on Postgres, a 2-kilobyte document or bigger starts really creating performance problems, because Postgres has to do something called off-row storage, which creates enormous performance overheads. And so developers need a platform that can handle structured, semi-structured and unstructured data; they need, obviously, a platform that performs well; and they need a platform that can scale as they grow. And what we’re hearing clearly from the startup community is that Postgres, in many cases, is not scaling for them, and they’re now coming to us…

…We realize we need to do more developer education and do more work. And so we’re investing a lot in the startup community. We’re running a big event in October in San Francisco with a big hackathon, and we’re inviting lots of customers to participate. But that’s just the start of a meaningful investment we’re making in the Bay Area and the AI startup community to rethink their decisions around just going with what they know.

MongoDB’s management is seeing enterprises adopt AI, but the process is still early; most AI use cases for enterprises are related to employee productivity tools and packaged solutions from ISVs (independent software vendors); enterprises are still very early in building custom AI applications; enterprises often fail when attempting to scale vibe-coded software built on relational databases; management is seeing enterprises start deploying AI agents, but it’s still very early; management is hearing from customers that AI is currently providing productivity gains, but it’s not transforming their businesses; management thinks the real value of AI will come when enterprises are able to build custom AI solutions; enterprises are sometimes hesitant to deploy AI for customer-facing applications because it’s not possible currently to guarantee the quality of output of AI models; management thinks there will not be an inflection point in enterprises suddenly adopting AI at a big scale, instead adoption will simply take time to grow

In the enterprise segment, adoption is real but early. Much of the activity today centers on employee productivity tools and packaged ISV solutions. Enterprises are still in the very early stages of building their own custom AI applications that will transform their business. We consistently hear from customers that when teams try to scale from vibe-coded prototypes built on relational back ends to enterprise-grade deployments, these platforms quickly hit limits in flexibility, scalability and performance…

…Where it is being deployed is really on end-user productivity, whether it’s developers with codegen tools or business users using tools to summarize documents, extract data, or things like deflecting tickets from people to systems with conversational AI. I think you are starting to see the first steps in people deploying agent-based systems, and I can talk a little bit about that, but that’s still very, very early. We’re seeing small ISVs, some of them taking off, who are really driving most of the impact.

But the real enduring value will come. When you talk to a customer today, most of them when you ask them is AI really transforming a business, they will say no. Yes, we’re seeing some productivity gains here and there, but it’s not really transforming my business. I think the real enduring value will come when they build custom AI solutions that can truly transform the business, whether it’s to drive new revenue opportunities or dramatically reduce their existing cost structure…

…I had 2 meetings today with 2 different leaders of 2 different financial institutions here in New York, and they both talked about what they’re doing in AI. They both admitted that they’ve kind of started with low stakes use cases, but their appetite to start doing more is increasing as they get more and more comfortable with the technology, and they’re quite excited to leverage MongoDB as part of that journey. But again, I think that’s kind of a microcosm into the enterprise market where I think they’re still quite early in their AI journey…

…AI systems are probabilistic in nature, not deterministic in nature. So you can’t always guarantee the output. You can hope that you’ve trained the models well, and you can hope that you’ve given them the right information, but you can’t always guarantee the output. So as I mentioned, I had meetings with 2 financial services customers earlier today, and both of them are still hesitant to roll out end-user-facing AI applications for those specific reasons…

…[Question] Some of your comments touched on the AI slowdown, and there’s the recent MIT report about 95% of AI implementations not getting any kind of return. What do you think the inflection point will be?

[Answer] It’s going to take time for people to get comfortable with the technology. It’s going to take time as people start with low-stakes use cases and gravitate to higher-stakes use cases. So I don’t think there’s going to be some seminal inflection point. I think it’s just going to take time. But I think that time is coming.

A leading electric vehicle company chose MongoDB Atlas and Vector Search for its autonomous driving platform; MongoDB Atlas Vector Search had superior performance over Postgres; the electric vehicle company is using MongoDB Atlas to handle over 1 billion vectors and expects 10x growth in data usage in the next 12 months

A leading electric vehicle company chose Atlas and vector search to power its autonomous driving platform. After testing vector search against Postgres’s pgvector for their in-vehicle voice assistant, they selected MongoDB for superior performance at scale and stronger ROI. They now rely on Atlas to handle over 1 billion vectors and expect 10x growth in data usage by next year.

AI-native startup DevRev used MongoDB Atlas to build its AgentOS product; AgentOS handles billions of requests per month; MongoDB Atlas helped DevRev speed up product development at lower cost and helped DevRev scale globally; DevRev is using MongoDB Atlas Vector Search

DevRev, a well-funded AI-native platform with proven founders disrupting the help desk market, built AgentOS, a complete agentic platform that autonomously handles billions of monthly requests on Atlas. DevRev accelerated development velocity, lowered costs, and scaled globally with low latency by using Atlas. AgentOS also leverages Atlas Vector Search for semantic search, enriching its knowledge graph and LLMs with domain-specific content.

MongoDB’s management is very excited about Relational Migrator; Relational Migrator has a new product leader with strong skills around using AI to drive automation in the product; Relational Migrator also has a new go-to-market leader; management does not expect Relational Migrator to contribute much to MongoDB’s business in 2025 (FY2026)

[Question] I know you’ve been investing in Relational Migrator. You’re working with companies like Cognition to accelerate the code migration opportunity. And you’ve seen professional services ramp up a little bit. But where have you started to see sort of the time to migration or replatform improve a bit?

[Answer] We’re super excited about what we call app modernization, or legacy app modernization. You’ll hear a lot more about this at Investor Day in September, Tyler. But what I will say to you is that the value proposition is very clear. Customers are very, very motivated to try and modernize these legacy systems for a wide variety of reasons, and we are seeing a lot of progress. We’ve brought in a new product leader, who brings a lot of depth and scale, especially around AI, to help us build the tooling to leverage AI to drive more automation in terms of how we analyze and refactor the code. And we brought in a new leader last quarter to help drive the delivery and the go-to-market efforts around app modernization. So we’re definitely beefing up resources…

…It won’t be as pronounced in terms of this year, but we’re very, very excited about the opportunity.

MongoDB’s management is seeing OLTP (online transaction processing) be the strategic high ground for AI especially in inferencing; many database companies are struggling to develop OLTP platforms and so had to make acquisitions; management thinks MongoDB is positioned really well for the AI opportunity given its strengths as an OLTP platform

What we are seeing is that the strategic high ground for AI, especially when it comes to inference, is OLTP. We talked about this on the last call, where some companies acquired early-stage OLTP start-ups even though those companies had spoken about their organic efforts to build an OLTP platform. I think what it spoke to was the fact that building an OLTP platform that’s ready, mission-critical, and able to serve the most demanding requirements of enterprises is not trivial. I think they basically threw in the towel and decided to do these acquisitions…

…If customers are now going to be choosing which OLTP platform they want for AI, then just given our architecture, just given the fact that we have a durable architectural advantage in terms of JSON support, which addresses messy, complicated, highly interdependent and constantly changing data structures, and the fact that we have integrated search and vector search, I think that really helps position us going after AI.

MongoDB’s management thinks real JSON is becoming more important now with AI; management is seeing the hyperscalers hold off on investing in JSON-related capabilities; management thinks JSON is the best way to handle messy and evolving data structures in the real world, and this positions MongoDB well for AI because it is a JSON database

[Question] I’m thinking about Lakebase from Databricks and then DocumentDB in the Linux Foundation. Can you just comment on both those things?

[Answer] Around the Linux Foundation, I think what this really shows is that real JSON is much more important now with AI than ever before, and the clones and bolt-ons that have traded off features, performance and developer experience have just not met customer expectations. And candidly, what I see is that the hyperscalers are investing less and really handing off to the open-source community to take on the bulk of the work in terms of product development. Our hyperscaler partnerships remain strong…

…We’re a JSON database. JSON is the best way to express and model the complicated, messy, highly interdependent and constantly evolving data structures that you have to deal with in the real world. So that’s point number one. It’s much easier to do that in MongoDB than on some kludge of a setup on top of a relational database.

MongoDB’s management thinks a unique differentiator of MongoDB for AI startups is MongoDB’s database allows sophisticated retrieval of information to be done quickly; another unique differentiator of MongoDB is the presence of Voyage’s embedding models; embeddings act as a bridge between a company’s private data and the AI model, and reduces hallucinations

I would say the AI cohort was not a material driver of the growth. That being said, what we are seeing is a lot of customers very, very interested in our architecture…

…Second is that we integrate search and vector search. You can do very sophisticated things with what people call hybrid search and retrieval; you can do very sophisticated things in finding information quickly, which is a very unique differentiator for us. What this means is that rather than stitching together multiple systems, you can do this all on MongoDB, so there is less complexity and lower cost.

The third thing is that we’ve now embedded Voyage models on our platform. If you control the embedding layer, you sit at the gateway of AI. What the embedding models do is really act as a bridge between a company’s private data and the LLM. So that becomes really important, because the better the quality of the embedding model, the better the quality of the signal from your own data. That reduces things like hallucinations or just bad outputs. And so, as people start caring more and more about higher-stakes use cases, they really want to ensure those outputs are high quality. And the fact that it’s part of our platform, and we can enable you to do auto-embeddings, makes it an incredibly compelling feature.
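The “bridge” role of embeddings described above can be sketched with toy vectors. This is my own illustration; a real system would use a learned embedding model such as Voyage’s rather than hand-written 3-d vectors, but the retrieval mechanics are the same: query and documents land in a shared vector space, and the nearest documents become the grounding context handed to the LLM:

```python
import math

# Toy retrieval sketch: hand-made 3-d vectors stand in for a real
# embedding model's output. The document closest to the query vector
# is the private context used to ground the model's answer.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

corpus = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "privacy notice": [0.0, 0.1, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k documents most similar to the query vector."""
    ranked = sorted(corpus, key=lambda name: cosine(query_vec, corpus[name]), reverse=True)
    return ranked[:k]

# A query embedded near the "refund" direction pulls back the refund policy:
print(retrieve([0.8, 0.2, 0.0]))  # -> ['refund policy']
```

The “quality of the signal” point maps directly onto this sketch: a better embedding model places queries nearer the truly relevant documents, so the LLM is grounded on the right context more often.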

MongoDB’s management thinks AI agents will be using a company’s systems much more intensely than humans, so it’s important that a company’s systems can massively scale up and down; the need for massive scaling up and down of systems positions MongoDB well; management thinks MongoDB is positioned to win in a world where AI agents dominate because of (1) the strengths of the JSON database, (2) support of vector search, (3) support of memory

Agents require — if you think about — if you’re using agents, agents will use your systems much more intensely than humans will because they can do things much more quickly. So you need platforms that can massively scale up and down, which is, again, a good sign and support indicator for MongoDB…

…[Question] If we were to fast forward 5, 10 years and we start to see a real paradigm shift where instead of agents built on kind of the traditional GUI mobile interface that we’ve been in for the past 30 years, we actually entered kind of a multi-agentic world where maybe the interaction vector may move away from what we’ve been used to into more natural language. Can you talk about why MongoDB still has a strong role and some of the investments that you might be making to position yourself well for the world, understanding that’s at the very least several years away?

[Answer] We believe that agents essentially do 3 things. One, they perceive or understand the state of things. So you need essentially a way to understand the state of what’s happening in your business, then you need to decide what to do or plan. So basically, you have to come up with the plan saying, “I want to take this action or these sets of actions.” And then you have to act. You actually have to go execute those actions, right?

So why is MongoDB good for agents? One, as I said before, the JSON document database is the best at modeling the real world, with all its messiness and complexity. The real world does not fit easily in rows and columns, and that’s why our document database, I think, is the best way to do that. Two, we obviously support search and vector search, so you can do very sophisticated hybrid search. So that becomes super important. And then there’s memory: if agents didn’t have memory, they would act like goldfish. They could only react to the last piece of information that they saw.

So memory lets agents connect the dots across time and situations. You have different kinds of memory: short-term context, past experiences, knowledge, skills, et cetera, which agents need to be able to share quickly. You need to be able to orchestrate those agents because you may have multiple agents doing a certain task. You need to register and have governance policies around those agents. We think the underlying platform needs to support those things, and while there’s a lot more work that needs to be done, the underlying architecture that we have in MongoDB is well suited to address those needs.
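
The kinds of memory described above, short-term context for the current task versus durable knowledge shared across tasks, can be sketched as a tiny data structure. The class and method names below are invented for illustration; a real agent platform would persist this in a database rather than in process memory.

```python
# Minimal sketch of agent memory: a bounded short-term buffer plus a
# durable long-term store the agent can recall from across tasks.
class AgentMemory:
    def __init__(self, short_term_limit=3):
        self.short_term = []   # recent events for the current task
        self.long_term = {}    # durable facts, keyed by topic
        self.limit = short_term_limit

    def observe(self, event):
        # Keep only the most recent events in working memory.
        self.short_term.append(event)
        self.short_term = self.short_term[-self.limit:]

    def remember(self, topic, fact):
        # Store a durable fact under a topic.
        self.long_term.setdefault(topic, []).append(fact)

    def recall(self, topic):
        # Without this, the agent reacts only to the last thing it saw,
        # the "goldfish" behaviour described in the excerpt.
        return self.long_term.get(topic, [])

memory = AgentMemory()
memory.remember("customer_42", "prefers email over phone")
memory.observe("user asked about order status")
print(memory.recall("customer_42"))  # ['prefers email over phone']
```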

Nu Holdings (NYSE: NU)

Nu Holdings’ management has seen significant improvement in its ability to do credit underwriting for credit cards, driven by (1) AI-powered improvements to Nu Holdings’ credit models, and (2) new data acquired by the company; Nu Holdings is now the leader in open finance consent, which helps in Nu Holdings collecting data; the AI-related improvements in credit models comes from Nu Holdings’ 2024 acquisition of Hyperplane; the credit models that were improved by Hyperplane are largely focused on the mass market at the moment, but management expects the AI-enabled architecture from Hyperplane to be applied to more models in the future; management expects to see meaningful changes to Nu Holdings’ models across many different use cases in the future by applying Hyperplane’s technologies 

We have been seeing fairly material improvements in our ability to do credit underwriting and to continue to expand the credit card portfolio. It has to do with the adoption of new models and technologies in how we do credit underwriting: everything from better traditional machine learning models to neural networks and predictive AI technologies, but more and more also the adoption of new data that we acquired…

…So the more customers stay with us, the more data we accumulate; we are now the leaders in open finance consent. The combination of better modeling techniques with more data has allowed us to consistently increase credit underwriting, credit limits and utilization…

…[Question] The Hyperplane expansion in the credit limit that you talked about, is there any particular segment of customer base where it is more targeted towards higher income or mass market or your super core segments?

[Answer] So far it has been mostly focused on the mass market, but we expect that a lot of this new AI-enabled architecture will now be applied to a number of different models…

…We expect a number of new models coming in for a number of different segments for the different countries and for different applications, such as collections, fraud, cross-sell. So we’re very excited about this, and it’s early days of applying this new technology to a lot of the decisioning that we have across Nubank. But we expect to see meaningful changes across the board.

NVIDIA (NASDAQ: NVDA)

NVIDIA’s Data Center revenue again had very strong growth in 2025 Q2 (FY2026 Q2), driven by the Blackwell family of chips

Data center revenue grew 56% year-over-year. Data center revenue also grew sequentially despite the $4 billion decline in H20 revenue. NVIDIA’s Blackwell platform reached record levels, growing sequentially by 17%…

…The new Blackwell Ultra platform has also had a strong quarter, generating tens of billions in revenue.

NVIDIA’s management sees $3 trillion to $4 trillion of AI infrastructure spend by 2030; management sees $600 billion in data center capital expenditures in 2025; management expects AI infrastructure investments to continue growing, driven by (1) agentic AI’s requirement for orders of magnitude more training and inference compute, (2) sovereign AI, (3) enterprise AI adoption, and (4) robotics; NVIDIA’s management sees the market for AI inference expanding rapidly; the capital expenditures of the top 4 CSPs (cloud services providers) have doubled over the last few years to $600 billion; management expects enterprises beyond the cloud hyperscalers to contribute to the expected $3 trillion to $4 trillion of AI infrastructure spend by 2030; management sees NVIDIA’s chips accounting for the majority of spend in AI data centers

We see $3 trillion to $4 trillion in AI infrastructure spend in the — by the end of the decade…

…Capital expenditures from the cloud to enterprise customers are on track to reach $600 billion in data center infrastructure and compute this calendar year alone, nearly doubling in 2 years. We expect annual AI infrastructure investments to continue growing, driven by several factors: reasoning agentic AI requiring orders of magnitude more training and inference compute, global build-outs for sovereign AI, enterprise AI adoption, and the arrival of physical AI and robotics…

…The market for AI inference is expanding rapidly with reasoning and agentic AI gaining traction across industries…

…The last couple of years, you have seen that CapEx in just the top 4 CSPs has doubled and grown to about $600 billion…

…The CapEx of just the top 4 hyperscalers has doubled in 2 years. As the AI revolution went into full steam, as the AI race is now on, the CapEx spend has doubled to $600 billion per year. There’s 5 years between now and the end of the decade, and $600 billion only represents the top 4 hyperscalers. We still have the rest of the enterprise companies building on-prem. You have cloud service providers building around the world…

…Out of a gigawatt AI factory, which can cost anywhere from $50 billion to $60 billion, plus or minus 10%, we represent about $35 billion of that: $35 billion out of $50 billion per gigawatt data center.

The Blackwell family of chips is seeing widespread adoption and its users include high-profile model builders; the transition from the GB200 to the GB300 has been seamless, with the current run rate for the GB300 rack at 1,000 racks per week, with acceleration in output expected throughout 2025 Q3 (FY2026 Q3); the GB300 has a 10x higher inference performance on reasoning models compared to H100; GB300 has a 10x improvement in token per watt energy efficiency compared to the previous Hopper family of chips; management thinks Blackwell is the new standard for AI inference performance; the GB300 platform has a 50x increase in energy efficiency per token compared to Hopper; management believes a company investing in GB200 can earn 10x the amount in revenue; the performance of the Blackwell family of chips has already improved by 2x since its launch because of NVIDIA’s software innovations, including a groundbreaking numerical approach to LLM (large language model) pretraining; the new numerical approach means the GB300 can achieve 7x faster training than the H100; the AI industry’s major companies have adopted the new numerical approach

The GB200 NVL system is seeing widespread adoption with deployments at CSPs and consumer Internet companies. Lighthouse model builders, including OpenAI, Meta and Mistral, are using the GB200 NVL72 at data center scale for both training next-generation models and serving inference models in production…

…The transition to the new GB300 rack-based architecture has been seamless for major cloud service providers due to its shared architecture, software and physical footprint with the GB200, enabling them to build and deploy GB300 racks with ease. Factory builds in late July and early August were successfully converted to support the GB300 ramp, and today, full production is underway. The current run rate is back at full speed, producing approximately 1,000 racks per week. This output is expected to accelerate even further throughout the third quarter as additional capacity comes online.

We expect widespread market availability in the second half of the year as CoreWeave prepares to bring their GB300 instance to market as they are already seeing 10x more inference performance on reasoning models compared to H100. Compared to the previous Hopper generation, GB300 NVL72 AI factories promise a 10x improvement in token per watt energy efficiency, which translates to revenues as data centers are power limited…

…Blackwell has set the benchmark as it is the new standard for AI inference performance…

…New NVFP4 4-bit precision and NVLink 72 on the GB300 platform deliver a 50x increase in energy efficiency per token compared to Hopper, enabling companies to monetize their compute at unprecedented scale. For instance, a $3 million investment in GB200 infrastructure can generate $30 million in token revenue, a 10x return…
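
The 10x return quoted here is simple arithmetic; the sketch below works it through, with a purely hypothetical per-token price added to show the scale of tokens such revenue implies. Only the $3 million and $30 million figures come from the call; the price is an invented assumption.

```python
# The 10x-return claim, worked through as back-of-envelope arithmetic.
investment = 3_000_000        # GB200 infrastructure cost in dollars (from the call)
token_revenue = 30_000_000    # token revenue generated in dollars (from the call)
multiple = token_revenue / investment
print(f"{multiple:.0f}x return")

# Purely hypothetical price, used only to illustrate the token volume
# that $30 million of revenue would correspond to.
price_per_million_tokens = 2.0
tokens = token_revenue / price_per_million_tokens * 1_000_000
print(f"{tokens:,.0f} tokens served")
```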

…NVIDIA software innovation, combined with the strength of our developer ecosystem, has already improved Blackwell’s performance by more than 2x since its launch. Advances in CUDA, TensorRT-LLM and Dynamo are unlocking maximum efficiency. CUDA library contributions from the open source community, along with NVIDIA’s open libraries and frameworks are now integrated into millions of workflows. This powerful flywheel of collaborative innovation between NVIDIA and global community contribution strengthens NVIDIA’s performance leadership. NVIDIA is a top contributor to OpenAI models, data and software.

Blackwell has introduced a groundbreaking numerical approach to large language model pretraining. Using NVFP4 computations, the GB300 can now achieve 7x faster training than the H100, which uses FP8. This innovation delivers the accuracy of 16-bit precision with the speed and efficiency of 4-bit, setting a new standard for AI factory efficiency and scalability. The AI industry is quickly adopting this revolutionary technology with major players such as AWS, Google Cloud, Microsoft Azure and OpenAI as well as Cohere, Mistral, Kimi AI, Perplexity, Reflection and Runway already embracing it. NVIDIA’s performance leadership was further validated in the latest MLPerf Training benchmarks, where the GB200 delivered a clean sweep. Be on the lookout for the upcoming MLPerf Inference results in September, which will include benchmarks based on the Blackwell Ultra.
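
As a rough intuition for what 4-bit numerics mean, the sketch below snaps a block of values to the few magnitudes an FP4 (E2M1-style) float can represent, using one shared scale per block. This is a simplification for illustration only, not NVIDIA's NVFP4 implementation; the level table is the standard E2M1 set of representable magnitudes.

```python
# Simplified block-scaled 4-bit quantization: each block shares a scale
# factor, and each value snaps to the nearest representable magnitude.
E2M1_LEVELS = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]  # FP4 E2M1 magnitudes

def quantize_block(values):
    # Assumes the block is not all zeros. Map the block's largest
    # magnitude onto the largest representable level (6.0).
    scale = max(abs(v) for v in values) / 6.0
    quantized = []
    for v in values:
        # Snap |v|/scale to the nearest representable level, keep the sign.
        mag = min(E2M1_LEVELS, key=lambda level: abs(abs(v) / scale - level))
        quantized.append(mag * scale * (1 if v >= 0 else -1))
    return quantized, scale

block = [0.02, -0.11, 0.35, 0.07]
q, s = quantize_block(block)
print(q)
```

The training-speed claim in the excerpt comes from doing matrix math directly on these 4-bit codes: each weight occupies 4 bits plus a shared per-block scale, so memory traffic and arithmetic per value fall sharply relative to FP8 or FP16.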

NVIDIA’s next generation of chips, the Rubin family, is in fab now and remains on schedule for volume production in 2026; 6 different chips go into a Rubin AI supercomputer

The chips of the Rubin platform are in fab, the Vera CPU, Rubin GPU, CX9 SuperNIC, NVLink 144 scale up switch, Spectrum-X scale out and scale across switch, and the silicon photonics processor. Rubin remains on schedule for volume production next year. Rubin will be our third-generation NVLink rack scale AI supercomputer with a mature and full-scale supply chain…

…It takes 6 different types of chips just to build a Rubin AI supercomputer.

The US government recently started reviewing licenses for sales of NVIDIA’s H20 chips to China customers; some of NVIDIA’s China customers have received licenses for H20 chips, but NVIDIA has yet to make any shipments; management sees the US government as expecting a 15% revenue-share from the sales of H20 chips to China customers, but the US government has yet to publish regulations on this; management has not included H20 sales in its 2025 Q3 (FY2026 Q3) guidance; management expects revenue of $2 billion to $5 billion in 2025 Q3 from H20 chips if they can be shipped once geopolitical uncertainty subsides; NVIDIA has capacity to fulfill more orders for H20 beyond the $5 billion expectation; management continues to advocate for the sale of Blackwell chips to China as they believe the sales will benefit the US economy; management sees the sales of Blackwell chips to China as being for commercial uses only; China revenue declined sequentially; management thinks China represents a $50 billion revenue opportunity for NVIDIA in 2025, with growth of 50% annually, if the company is able to sell chips there; management sees China as the home of AI researchers with about 50% of AI researchers being in the country; management sees China as the home of the leading open-sourced AI models, and that it’s important for American AI companies to be able to serve China because of the country’s lead in open source

In late July, the U.S. government began reviewing licenses for sales of H20 to China customers. While a select number of our China-based customers have received licenses over the past few weeks, we have not shipped any H20 based on those licenses. USG officials have expressed an expectation that the USG will receive 15% of the revenue generated from licensed H20 sales, but to date, the USG has not published a regulation codifying such requirement.

We have not included H20 in our Q3 outlook as we continue to work through geopolitical issues. If geopolitical issues subside, we should ship $2 billion to $5 billion in H20 revenue in Q3. And if we had more orders, we can build more.

We continue to advocate for the U.S. government to approve Blackwell for China. Our products are designed and sold for beneficial commercial use, and every license sale we make will benefit the U.S. economy, the U.S. leadership. In highly competitive markets, we want to win the support of every developer. America’s AI technology stack can be the world’s standard if we race and compete globally…

…China declined on a sequential basis to low single-digit percentage of data center revenue…

…The China market, I’ve estimated to be about $50 billion of opportunity for us this year if we were able to address it with competitive products. And if it’s $50 billion this year, you would expect it to grow, say, 50% per year…

…It is the second largest computing market in the world, and it is also the home of AI researchers. About 50% of the world’s AI researchers are in China.

The vast majority of the leading open source models are created in China. And so it’s fairly important, I think, for the American technology companies to be able to address that market. And open source, as you know, is created in one country, but it’s used all over the world. The open source models that have come out of China are really excellent. DeepSeek, of course, gained global notoriety. Qwen is excellent. Kimi’s excellent. There’s a whole bunch of new models that are coming out. They’re multimodal. They’re great language models. And it’s really fueled the adoption of AI in enterprises around the world because enterprises want to build their own custom proprietary software stacks. And so open source model’s really important for enterprise. It’s really important for SaaS who also would like to build proprietary systems. It has been really incredible for robotics around the world. And so open source is really important, and it’s important that the American companies are able to address it. This is — it’s going to be a very large market. We’re talking to the administration about the importance of American companies to be able to address the Chinese market.

NVIDIA saw an increase in shipments of Hopper H100 and H200 chips in 2025 Q2 (FY2026 Q2), which indicates the breadth of AI workloads that run on NVIDIA’s hardware

In the quarter, there was an increase in Hopper H100 and H200 shipments. We also sold approximately $650 million of H20 in Q2 to an unrestricted customer outside of China. The sequential increase in Hopper demand indicates the breadth of data center workloads that run on accelerated computing and the power of CUDA libraries and full stack optimizations, which continuously enhance the performance and economic value of our platform.

NVIDIA’s RTX Pro servers are now in full production with the world’s system makers; nearly 90 companies are already adopting the RTX Pro servers, including Hitachi for digital twins, Eli Lilly for drug discovery, Hyundai for factory design, and Disney for immersive storytelling; management believes RTX Pro can become a multi-billion-dollar business

NVIDIA RTX PRO servers are in full production with the world’s system makers. These are air-cooled, PCIe-based systems that integrate seamlessly into standard IT environments and run traditional enterprise IT applications as well as the most advanced agentic and physical AI applications. Nearly 90 companies, including many global leaders, are already adopting RTX PRO servers. Hitachi uses them for real-time simulation and digital twins, Lilly for drug discovery, Hyundai for factory design and AV validation, and Disney for immersive storytelling. As enterprises modernize data centers, RTX PRO servers are poised to become a multibillion-dollar product line.

NVIDIA’s management sees sovereign AI continuing to grow; NVIDIA is involved with Europe’s landmark AI initiatives; the European Union has plans to invest €20 billion to build 20 AI data centers; management sees NVIDIA being on track to earn $20 billion in sovereign AI revenue in 2025 (FY2026), up more than 100% from a year ago

Sovereign AI is on the rise, as a nation’s ability to develop its own AI using domestic infrastructure, data and talent presents a significant opportunity for NVIDIA. NVIDIA is at the forefront of landmark initiatives across the U.K. and Europe. The European Union plans to invest EUR 20 billion to establish 20 AI factories across France, Germany, Italy and Spain, including 5 gigafactories, to increase its AI compute infrastructure tenfold. In the U.K., the Isambard-AI supercomputer powered by NVIDIA was unveiled as the country’s most powerful AI system, delivering 21 exaflops of AI performance to accelerate breakthroughs in fields such as drug discovery and climate modeling. We are on track to achieve over $20 billion in sovereign AI revenue this year, more than double that of last year.

NVIDIA’s networking revenue had very strong sequential as well as year-on-year growth in 2025 Q2 (FY2026 Q2), driven by strong demand across Spectrum-X Ethernet, InfiniBand and NVLink; management thinks Spectrum-X Ethernet has the highest throughput and lowest latency network for Ethernet AI workloads; Spectrum-X grew double-digits sequentially and year-on-year in 2025 Q2 and has more than $10 billion in annualised revenue; management recently introduced Spectrum-XGS Ethernet technology that can double GPU-to-GPU communication speed; CoreWeave will be an initial adopter of Spectrum-XGS Ethernet technology; InfiniBand’s revenue was up nearly 100% sequentially, driven by XDR technology; XDR technology has nearly 100% higher bandwidth than the previous generation; management sees NVLink as the world’s fastest data switch; NVLink Fusion, which allows semi-custom AI infrastructure, has received widespread positive reception; NVLink Fusion will be used by Japan’s upcoming FugakuNEXT supercomputer; the difference between NVLink 8 and NVLink 72 is that NVLink 8 makes each node a computer, whereas NVLink 72 makes each rack a computer; NVIDIA has 3 networking technologies that address scale up (NVLink), scale out (InfiniBand), and scale across (Spectrum Ethernet); management sees NVLink 72 as being excellent at amplifying memory bandwidth

Networking delivered record revenue of $7.3 billion, a 46% sequential and 98% year-on-year increase, with strong demand across Spectrum-X Ethernet, InfiniBand and NVLink, as the escalating demands of AI compute clusters necessitate high-efficiency, low-latency networking.

Our Spectrum-X enhanced Ethernet solutions provide the highest throughput and lowest latency network for Ethernet AI workloads. Spectrum-X Ethernet delivered double-digit sequential and year-over-year growth, with annualized revenue exceeding $10 billion. At Hot Chips, we introduced Spectrum-XGS Ethernet, a technology designed to unify disparate data centers into giga-scale AI super factories. CoreWeave is an initial adopter of the solution, which is projected to double GPU-to-GPU communication speed.

InfiniBand revenue nearly doubled sequentially, fueled by the adoption of XDR technology, which provides double the bandwidth of its predecessor, especially valuable for model builders.

The world’s fastest switch, NVLink, with 14x the bandwidth of PCIe Gen 5 delivered strong growth as customers deployed Grace Blackwell NVLink rack scale systems. The positive reception to NVLink Fusion, which allows semi-custom AI infrastructure, has been widespread. Japan’s upcoming FugakuNEXT will integrate Fujitsu’s CPUs with our architecture via NVLink Fusion. It will run a range of workloads, including AI, supercomputing and quantum computing. FugakuNEXT joins a rapidly expanding list of leading quantum supercomputing and research centers running on NVIDIA’s CUDA-Q quantum platform, including [ ULIC ], AIST, [ NNF ] and NERSC, supported by over 300 ecosystem partners, including AWS, Google Quantum AI, Quantinuum, QuEra and PsiQuantum…

…This last year, we transitioned from NVLink 8, which is a node scale computing, each node is a computer, to now NVLink 72, where each rack is a computer…

…We now offer 3 networking technologies. One is for scale up. One is for scale out and one for scale across. Scale up is so that we could build the largest possible virtual GPU, the virtual compute node. NVLink is revolutionary. NVLink 72 is what made it possible for Blackwell to deliver such an extraordinary generational jump over Hopper’s NVLink 8. At a time when we have long thinking models, agentic AI reasoning systems, the NVLink basically amplifies the memory bandwidth, which is really critical for reasoning systems. And so NVLink 72 is fantastic.

We then scale out with networking, which we have 2. We have InfiniBand, which is unquestionably the lowest latency, the lowest jitter, the best scale-out network. It does require more expertise in managing those networks…

…For those who would like to use Ethernet because their whole data center is built with Ethernet, we have a new type of Ethernet called Spectrum Ethernet. Spectrum Ethernet is not off the shelf. It has a whole bunch of new technologies designed for low latency and low jitter and congestion control. And it has the ability to come closer, much, much closer to InfiniBand than anything that’s out there. And that is — we call that Spectrum-X Ethernet.

NVIDIA’s new robotics computing platform, Jetson Thor is now available, and it delivers an order of magnitude higher AI performance and energy efficiency than its predecessor; NVIDIA’s full stack robotics platform is growing rapidly with more than 2 million developers and 1,000-plus hardware-software applications; leading enterprises involved with robotics, including Amazon Robotics and Boston Dynamics, have adopted Jetson Thor

Jetson Thor, our new robotics computing platform, is now available. Thor delivers an order of magnitude greater AI performance and energy efficiency than NVIDIA AGX Orin. It runs the latest generative and reasoning AI models at the edge in real time, enabling state-of-the-art robotics.

Adoption of NVIDIA’s full stack robotics platform is growing at a rapid rate, with over 2 million developers and 1,000-plus hardware and software applications, and sensor partners taking our platform to market. Leading enterprises across industries have adopted Thor, including Agility Robotics, Amazon Robotics, Boston Dynamics, Caterpillar, Figure, Hexagon, Medtronic and Meta.

Robotic applications require exponentially more compute on the device and in infrastructure, representing a significant long-term demand driver for our data center platform. NVIDIA Omniverse with Cosmos is our data center physical AI digital twin platform built for development of robot and robotic systems. This quarter, we announced a major expansion of our partnership with Siemens to enable AI automatic factories. Leading European robotics companies, including Agile Robots, NEURA Robotics and Universal Robots are building their latest innovations with the Omniverse platform.

Singapore was 22% of NVIDIA’s 2025 Q2 (FY2026 Q2) revenue; Singapore is an important billing location for NVIDIA because its US customers centralise their invoicing there

Singapore revenue represented 22% of second quarter’s billed revenue as customers have centralized their invoicing in Singapore. Over 99% of data center compute revenue billed to Singapore was for U.S.-based customers.

NVIDIA shipped GeForce RTX 5060 desktop GPUs in 2025 Q2 (FY2026 Q2); the RTX 5060 desktop GPU has double the performance of the previous generation; management will soon bring Blackwell to GeForce NOW; management thinks RTX GPUs bring the best on-device AI performance; NVIDIA has partnered with OpenAI to optimise their open-source GPT models for inference on RTX-powered Windows devices

This quarter, we shipped GeForce RTX 5060 desktop GPU. It brings double the performance along with advanced ray tracing, neural rendering and AI-powered DLSS 4 gameplay to millions of gamers worldwide. Blackwell is coming to GeForce NOW in September… 

…For AI enthusiasts, on-device AI performs best on RTX GPUs. We partnered with OpenAI to optimize their open source GPT models for high-quality, fast and efficient inference on millions of RTX-enabled Windows devices. With the RTX platform stack, Windows developers can create AI applications designed to run on the world’s largest AI PC user base.

AI workloads on NVIDIA’s chips have now transitioned strongly to inference; NVIDIA’s management is seeing a huge jump in inference demand; major NVIDIA customers, such as OpenAI, Microsoft, and Google, are seeing huge leaps in AI token generation; Microsoft processed 100 trillion tokens in 2025 Q1, up 5x year-on-year; inference-serving startups have tripled their token generation rate and revenues

AI workloads have transitioned strongly to inference…

…We are witnessing a sharp jump in inference demand. OpenAI, Microsoft and Google are seeing a step-function leap in token generation. Microsoft processed over 100 trillion tokens in Q1, a fivefold increase on a year-over-year basis…

…Inference-serving startups are now serving models using the B200, tripling their token generation rate and corresponding revenues for high-value reasoning models such as DeepSeek-R1, as reported by Artificial Analysis.

NVIDIA’s automotive revenue had strong growth in 2025 Q2, driven by self-driving technologies; NVIDIA has started shipping Thor SoC (system on a chip); management sees the self-driving automotive market shifting towards a vision language model architecture, generative AI, and higher levels of autonomy; NVIDIA’s full stack Drive AV software platform is now in production and management thinks it can produce billions in new revenue opportunities for NVIDIA

Automotive revenue, which includes only in-car compute revenue, was $586 million, up 69% year-on-year, primarily driven by self-driving solutions. We have begun shipments of the NVIDIA Thor SoC, the successor to Orin. Thor’s arrival coincides with the industry’s accelerating shift to vision language model architecture, generative AI and higher levels of autonomy. Thor is the most successful robotics and AV computer we’ve ever created. Our full stack Drive AV software platform is now in production, opening up billions in new revenue opportunities for NVIDIA while improving vehicle safety and autonomy.

NVIDIA’s management sees agentic AI requiring 100-1,000x the amount of computation compared to 1-shot AI models; agentic AI is driving tremendous growth in the amount of computation; management thinks agentic AI has reduced hallucination significantly; management thinks agentic AI has helped deliver breakthroughs in robotics 

Where chatbots used to be one-shot, you give it a prompt and it would generate the answer, now the AI does research. It thinks and makes a plan, and it might use tools. And so it’s called long thinking; and the longer it thinks, oftentimes, it produces better answers. And the amount of computation necessary for one shot versus reasoning agentic AI models could be 100x, 1,000x and potentially even more, given the amount of research and, basically, reading and comprehension that it goes off to do. And so the amount of computation resulting from agentic AI has grown tremendously…

…Because of agentic AI, the amount of hallucination has dropped significantly. You can now use tools and perform tasks. Enterprises have been opened up. As a result of agentic AI and vision language models, we now are seeing a breakthrough in physical AI, in robotics, autonomous systems.
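
The 100x-1,000x multiplier described in these excerpts is easy to see with invented but plausible token counts. All figures below are illustrative assumptions made up for the sketch, not measured data from NVIDIA or anyone else.

```python
# Back-of-envelope illustration of why agentic AI needs far more compute
# than a one-shot answer: every reasoning step and tool call generates
# and consumes tokens of its own. All counts are invented assumptions.
one_shot_tokens = 1_000          # prompt plus a single direct answer

reasoning_steps = 50             # think/plan iterations during "long thinking"
tokens_per_step = 4_000          # reading, chain of thought, intermediate output
tool_calls = 20                  # searches, code runs, retrievals
tokens_per_tool_call = 2_000     # tool input plus returned results

agentic_tokens = (reasoning_steps * tokens_per_step
                  + tool_calls * tokens_per_tool_call)
print(agentic_tokens // one_shot_tokens)  # 240, i.e. ~240x a one-shot answer
```

Pushing any of these assumed counts higher, as deeper research or longer tool chains would, moves the multiplier toward the 1,000x end of the range quoted above.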

NVIDIA’s management sees NVIDIA’s chips as having plenty of advantages over ASICs (application-specific integrated circuits); management thinks very few ASICs go into production because the problem of delivering an accelerated computing platform, which is a full-stack design, is really complicated; management thinks building a data center with NVIDIA brings the best utility compared to ASICs; management sees NVIDIA’s platform as the most energy efficient, with the best performance per watt; management thinks a world where data centers are limited by power is one where performance per watt is incredibly important

NVIDIA builds very different things than ASICs. So let’s talk about ASICs first. A lot of projects are started. Many start-up companies are created. Very few products go into production. And the reason for that is it’s really hard. Accelerated computing is unlike general-purpose computing. You don’t write software and just compile it into a processor. Accelerated computing is a full-stack co-design problem. And AI factories in the last several years have become so much more complex because the scale of the problems has grown so significantly…

…The models are changing incredibly fast, from generation based on autoregression, to generation based on diffusion, to mixed models, to multi-modality. The number of different models coming out that are either derivatives of transformers or evolutions of transformers is just daunting…

…The diversity of our platform, both in the ability to evolve into any architecture, the fact that we’re everywhere, and also, we accelerate the entire pipeline, everything from data processing to pretraining to post training with reinforcement learning, all the way out to inference. And so when you build a data center with NVIDIA platform in it, the utility of it is best. The lifetime usefulness is much, much longer…

…People talk about the chip itself. There’s one ASIC, the GPU, that many people talk about. But in order to build Blackwell the platform and Rubin the platform, we had to build CPUs that connect fast, extremely energy-efficient memory for the large KV caching necessary for agentic AI, to the GPU, to a SuperNIC, to a scale-up switch, we call NVLink, completely revolutionary, we’re in our fifth generation now, to a scale-out switch, whether it’s Quantum or Spectrum-X Ethernet, to now scale-across switches so that we could prepare for these AI super factories with multiple gigawatts of computing all connected together…

…We’re in every cloud for a good reason. Not only are we the most energy efficient, our perf per watt is the best of any computing platform. And in a world of power-limited data centers, perf per watt drives directly to revenues.
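Management’s point that “perf per watt drives directly to revenues” is a simple consequence of the power budget being the binding constraint. A minimal sketch of that arithmetic (the numbers here are hypothetical, not NVIDIA figures):

```python
# Illustrative only: in a power-limited data center, total output is
# power_budget * perf_per_watt, so any perf-per-watt edge translates
# one-for-one into extra output (and revenue) from the same facility.

def datacenter_throughput(power_budget_watts: float, perf_per_watt: float) -> float:
    """Total performance achievable under a fixed power budget."""
    return power_budget_watts * perf_per_watt

# A hypothetical 1-gigawatt facility: a 20% perf-per-watt advantage
# yields 20% more output from the same power envelope.
budget = 1e9  # 1 GW
baseline = datacenter_throughput(budget, perf_per_watt=10.0)
improved = datacenter_throughput(budget, perf_per_watt=12.0)
print(f"Uplift: {improved / baseline - 1:.0%}")  # → Uplift: 20%
```

Since power, not chip supply, caps the facility, a chip that is slower but more efficient per watt can out-earn a faster, hungrier one.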

The US currently represents 60% of the world’s compute

The United States represents about 60% of the world’s compute.

NVIDIA’s management thinks AI will accelerate global GDP growth

You would think that artificial intelligence would reflect GDP scale and growth, and would, of course, be accelerating GDP growth.

NVIDIA’s management is seeing year-to-date AI startup funding at already $180 billion, compared with $100 billion for the whole of 2024; AI startups’ revenues have increased by 10x to $20 billion in 2025; management thinks it’s reasonable that AI startups’ revenues could 10x again in 2026

Funding for AI-native start-ups was $100 billion last year. This year, the year is not even over yet, and it’s $180 billion funded. If you look at the top AI-native start-ups, the revenue they generated last year was $2 billion. This year, it’s $20 billion. Next year being 10x higher than this year is not inconceivable.

NVIDIA’s AI products are sold out

The buzz is everything is sold out. H100s are sold out. H200s are sold out. Large CSPs are coming out and renting capacity from other CSPs.

Okta (NASDAQ: OKTA)

Okta’s management’s approach to securing nonhuman identities (NHIs), which include AI agents, is to give them the same level of visibility, access control, governance and remediation as human identities; management believes no other company can deliver the level of sophistication Okta can to secure AI agents; the Auth0 for AI Agents product from Okta’s Auth0 platform enables developers to build AI agents that are secure by design; management thinks AI agents will significantly amplify the identity-security problems related to machine identities that are currently faced by enterprises; management is hearing from the leaders of the largest companies in the world that they will not be able to get projects involving AI agents to work if their identity-security problems are not addressed; management is building a new product that will model the identity of an AI agent so users can have even more control in managing the security of the AI agent; the new product is still in its very early days because management is seeing very few companies putting AI agents into production despite many companies testing out these agents; management wants to eventually have Okta be the system of record for AI agents so the AI agents can choose what technologies they want to work with

Take our approach to securing nonhuman identities, or NHIs. Okta’s unified platform helps ensure they receive the same level of visibility, access control, governance and remediation as human identities. This includes the ability to detect and discover NHIs wherever they exist, provision and register them properly, authorize and protect them with appropriate policies and govern and monitor their behavior continuously. That’s the power of an identity security fabric enabled with Okta’s unparalleled breadth of modern identity security products. No other company can deliver that level of sophistication.

With our Auth0 platform, we’re enabling developers to build agents that are secure by design and identity security fabric-ready from day 1. Auth0 for AI Agents, formerly known as Auth for GenAI, delivers user authentication that works seamlessly with AI workflows, token vaults that securely manage credentials, async authorization that lets agents work autonomously while maintaining user control and fine grained authorization that permits AI agents to only access authorized data…
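The token-vault and fine-grained-authorization ideas above can be sketched in a few lines. This is a conceptual illustration of the pattern, not Auth0’s actual API; all class and method names here are hypothetical:

```python
# Sketch of the token-vault pattern: the agent never holds raw credentials.
# It asks a vault for a short-lived token scoped to specific permissions,
# and every access is checked against that scope. Names are hypothetical.
import secrets
import time

class TokenVault:
    def __init__(self):
        self._credentials = {}  # service -> long-lived secret, stored once
        self._issued = {}       # token -> (agent, service, scopes, expiry)

    def register(self, service: str, secret: str) -> None:
        self._credentials[service] = secret

    def issue(self, agent: str, service: str, scopes: set[str], ttl: float = 300.0) -> str:
        """Mint a short-lived, narrowly scoped token for an agent."""
        if service not in self._credentials:
            raise KeyError(f"unknown service: {service}")
        token = secrets.token_urlsafe(16)
        self._issued[token] = (agent, service, frozenset(scopes), time.time() + ttl)
        return token

    def authorize(self, token: str, service: str, scope: str) -> bool:
        """Fine-grained check: right service, right scope, not expired."""
        entry = self._issued.get(token)
        if entry is None:
            return False
        _, svc, scopes, expiry = entry
        return svc == service and scope in scopes and time.time() < expiry

vault = TokenVault()
vault.register("crm", secret="s3cr3t")
tok = vault.issue("support-agent", "crm", scopes={"read:cases"})
print(vault.authorize(tok, "crm", "read:cases"))    # → True (within scope)
print(vault.authorize(tok, "crm", "delete:cases"))  # → False (out of scope)
```

The point of the pattern is that a compromised agent leaks only an expiring, narrowly scoped token, never the underlying service credential.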

…Our perspective is quite simple. It’s that you have many problems today in your enterprise that are clear and present and you can get a lot of security benefit by addressing these problems. These are the problems that we talk about a lot. These are service accounts. These are machine identities. These are putting the right vaulting and governance workflows around all of these things. These are like the bread and butter of our identity platform across Governance and Privileged Access and Identity Threat Protection with Okta AI and the bread and butter of what we’re talking about. These are clear and present things today. In addition to that, every company is going to make a huge investment in AI agents. And what that’s going to do, first and foremost, is it’s going to make that problem I just described 5x worse because every agent wants to connect to 10 service accounts and is going to have its own tokens…

…The last week, I’ve had conversations with CIOs of massive companies that everyone’s heard of that say, there’s no way we’re going to be able to do this AI stuff if we don’t get our identity foundation in order…

…There are investments we are making in innovation we’re building that is going to take it even a step further, which is actually modeling the identity of an agent and giving more power to the customer to manage and secure these things, because it’s a native thing inside of Okta, which is also very exciting.

But that’s very early because the share of companies that are actually playing with AI agents is 100%. The ones that are actually putting them in production at scale is very small. So the timing is right here to solve this problem they all have today, the service accounts and token vaulting, et cetera.

And then over time, be the system of record for the AI agents themselves, and give them choice and flexibility on whether they want to use Salesforce agents or ServiceNow agents or build their own agents, and give them the fundamentals across all of that, which are security, control and governance.

Okta’s management recently introduced a new open standard, Cross App Access, that helps with securing AI; Cross App Access enables AI agents to safely connect with other technologies; management is seeing strong interest from Okta’s partners and ISVs (independent software vendors) for Cross App Access; management has been working on Cross App Access for 3 years; Cross App Access is an industry-wide effort that started with other SaaS companies wanting to have the ability to connect their products with their customers’ other products; management sees the emergence of AI as aggravating the problem of product-to-product connections

Securing AI is the next frontier, and our introduction of a new open standard called Cross App Access is a key part of the solution. This is an important innovation that helps control what AI agents can access, allowing us to help make our customers and ISVs more secure and providing better end-user experience. In short, Cross App Access allows for support of AI agents within the identity security fabric and the flexibility to safely connect to other technologies. Already, there is strong interest in Cross App Access from partners and ISVs, including AWS, Boomi, Box, Ryder and Zoom, and we had over 1,100 attendees at our Identity Summit on the topic earlier this month…

…Cross App Access is an industry-wide effort. It’s actually 3 years old. We’ve been working on this for 3 years. And it came out of Mike from Atlassian and Eric from Zoom and many other SaaS leaders wanting a way to standardize how, when they sold their products into companies, those products were then hooked up to everything else in the company. So Zoom wants to connect to your calendar, wants to connect to a note-taking app. Atlassian wants to connect to all of your other software development tools. So we invented this protocol and this concept and have published this open standard to solve a very important problem: how do you give your IT teams and your security teams visibility into all these application connections that happen between apps? Now guess what? That’s a problem that’s existed for a long time. And guess what’s happening with AI. AI is supercharging this problem. Now every agent, guess what it wants to do. It wants to connect to 15 applications. And guess what you need. You need an open protocol so that all of the applications letting those agents connect can publish and share that information with the security team, so they can have visibility and control and audit that. So that’s why Cross App Access is so important.
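The visibility problem described above, where every agent-to-app connection should be published somewhere the security team can query, can be illustrated with a small registry. This is a sketch of the concept, not the actual Cross App Access protocol, and the names are hypothetical:

```python
# Sketch of the visibility idea behind Cross App Access: each time an agent
# is granted access to an application, the grant is published to a central
# registry that security teams can query and audit. Conceptual only.
from collections import defaultdict

class ConnectionRegistry:
    def __init__(self):
        self._by_agent = defaultdict(set)

    def publish(self, agent: str, app: str) -> None:
        """Record that `agent` was granted a connection to `app`."""
        self._by_agent[agent].add(app)

    def connections(self, agent: str) -> set[str]:
        """Audit view: everything a given agent can reach."""
        return set(self._by_agent[agent])

    def agents_touching(self, app: str) -> set[str]:
        """Audit view: every agent with access to a given app."""
        return {a for a, apps in self._by_agent.items() if app in apps}

registry = ConnectionRegistry()
for app in ("calendar", "notes", "crm"):
    registry.publish("zoom-assistant", app)
registry.publish("dev-agent", "crm")

print(sorted(registry.connections("zoom-assistant")))  # ['calendar', 'crm', 'notes']
print(sorted(registry.agents_touching("crm")))         # ['dev-agent', 'zoom-assistant']
```

Without a shared standard, each application records its own grants in its own format and no one can answer “which agents can reach this app?” across the whole enterprise; a common publishing protocol is what makes the audit queries possible.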

Okta’s management is not seeing any difference in terms of the Okta products that AI native companies are choosing compared to other types of customers; AI native companies are aware that they are very attractive targets for hackers, so they are really investing in identity security; management thinks Okta can help with AI native companies’ identity security needs

[Question] When we look at the AI native cohort, are there any interesting adoption trends that you’re seeing there in terms of what products they’re taking, how they’re using the platform?

[Answer] It doesn’t seem dramatically different than other cohorts in terms of the adopting workforce solutions or Auth0. It looks pretty much the same, except they’re growing very fast. I guess that’s a difference, especially, actually, the revenue metrics. It’s growing very fast, and we think we’re well positioned in that cohort. And I think similar to every company, they’re trying to figure out how they can be secure internally as they’re growing very fast. They know from a workforce identity and identity security perspective for their internal operations, they’re sitting on a lot of very valuable data and definitely hackers want to attack them like they want to attack every important company. So they’re really investing in identity security, and Okta helps them with that.

Okta’s management thinks that Okta’s 2 open standards, IPSIE and Cross App Access, will help the entire identity market become valuable; management thinks about the monetisation of the open standards from the perspective of the open standards making machine and AI agent identities more widely accepted and thus making Okta’s products more important for customers

These are 2 open standards we’re pushing out there with the ecosystem. And the effect of both of these things for Okta is going to be basically identity providers are going to be more valuable tools to the customers. So they’re going to have better control, fine-grained control, into resources, better policies, more value. So the whole identity market gets more valuable and bigger…

…The clear and present issue today, which is service accounts, nonhuman identities. We monetize that through Okta Privileged Access and Identity Security Posture Management. So Identity Security Posture Management detects the nonhuman identities and the risks in a proactive way that’s comprehensive across all platforms. And Okta Privileged Access and Okta Identity Governance can vault the credentials and rotate the credentials and have the right governance workflows…

…In a world of AI agents, our belief is strong that you are going to manage AI agents with your identity system. And so that’s how we’re going to monetize that. You’re going to — when you put a bunch of AI agents inside Okta, that’s going to be more valuable from an identity security perspective and we’re going to be able to have — we’re going to be able to charge for that with our customers…

…But it all is kind of predicated on a vibrant, healthy, growing AI agent ecosystem, and I think there are a lot of different thoughts on how exactly that plays out: who’s the vendor going to be, who’s the platform, SaaS vendors versus custom development, whatever. I think whatever happens, you’re going to need to manage this stuff.

Salesforce (NYSE: CRM)

Salesforce has won 12,500 AgentForce deals since it was launched 3 quarters ago, of which 6,000 are paid; 40% of new AgentForce bookings in 2025 Q2 (FY2026 Q2) came from existing Salesforce customers; AgentForce had a 60% sequential increase in customers going from pilot to production in 2025 Q2; AgentForce can now support the public sector and has FedRAMP High certification, so Salesforce can now sell more to the US government than before; management thinks Salesforce’s consumption model is showing strong early success; management recently announced new flexible payment options for AgentForce, with Flex Credits accounting for 80% of AgentForce new bookings in 2025 Q2 (FY2026 Q2); DIRECTV is one of Salesforce’s biggest Flex Credits customers; Falabella refilled the Flex Credits tank 3 times in 3 to 4 months

In the 3 quarters since we launched Agent Force, we have now won more than 6,000 paid deals and more than 12,500 overall…

…40% of our Agent Force new bookings this quarter came from existing customers extending their investment with Salesforce. And it’s demonstrating the value that they’re getting and how the flywheel is really working. We’ve seen a 60% increase quarter-over-quarter in customers who’ve gone from pilot to production and they’re expanding use cases and scaling consumption…

…Now with Agent Force for public sector and FedRAMP High certification, we’re able to sell more to the government than ever before because we’re bringing the power of the agentic enterprise directly to the government…

…Our consumption model is showing strong early success…

…Last month, we announced new flexible payment options for Agent Force, including pay-as-you-go, to lower the barrier to adoption and encourage experimentation. And following their launch last quarter, Flex Credits now account for 80% of Agent Force Q2 new bookings…

…Marc alluded to DIRECTV. Incredible business value. This is one of the biggest flex credit customers that we have globally…

…There is a customer that in just 3 or 4 months, they refilled the tank 3 times. I gave you the example of Falabella.

Salesforce’s management sees all of Salesforce’s customers becoming agentic enterprises; management sees AI agents represent a complete transformation for Salesforce and its customers; management sees the end goal of agentic AI as humans and AI agents working together with trusted data; management is adding native agentic capabilities into all of Salesforce’s products; Salesforce is pairing every salesperson with an AI agent and is using AI agents in Sales Cloud to call every single person back; in customer service, agents are handling millions of conversations, with AI agents handling 1.5 million conversations in 9 months within Salesforce’s help site; in field service, AI agents are helping technicians orchestrate scheduling and logistics, and helping technicians solve problems; the new version of Salesforce’s Tableau has AI agents that surface insights and recommendations instantly; Salesforce’s marketing product will soon have AI agents that can turn every one-way email to customers into 2-way conversations; Salesforce employees are using Slack as the interface for communicating with AI agents built with AgentForce; management thinks Salesforce will lead the way in this agentic enterprise wave because it has (1) the software infrastructure, and (2) the metadata platform; management will soon unveil all of Salesforce’s agentic products at Dreamforce; management is seeing very healthy growth in the pipeline for agentic transformation among enterprises

One thing is extremely clear to me, every single one of our customers is becoming an agentic enterprise…

…This isn’t simply automating some existing business processes. For Salesforce, it’s certainly a complete transformation. And for our customers, the agentic enterprise is a complete reinvention in many cases of who they are and what their potential is. It’s a shift from traditional hierarchies to reshaping the entire company, from busy work to orchestrating workflows, from siloed teams to seamless collaboration, from clicking and routing to natural conversations…

…But ultimately, it’s about this. It’s about humans and agents working together with every decision grounded in trusted data…

…Across our portfolio, we are adding these native agentic capabilities into every single one of our products…

…Our Sales Cloud for years has been an app that thousands or millions of salespeople use to manage their sales every single day. But now riding alongside every salesperson is an agentic salesperson. And that agentic salesperson is calling every single person back. And how that relates to Salesforce, well, let me tell you that maybe somewhere between 20 million and 100 million people have contacted Salesforce in the last 26 years and haven’t been called back. It’s just because we didn’t have enough people. But now with our new agentic sales, everybody is getting called back…

…In service, we’ve been talking about that now for months, and you can see our agents are handling millions of conversations while humans are delivering the empathy and expertise. Well, it’s a bigger story than that, where you know that we have delivered in the last 9 months about 1.5 million conversations just for our own company on help.salesforce.com…

…In field service, agents orchestrate scheduling and logistics so technicians can focus on solutions. I saw it myself at my home. I have this incredible device from Eaton, one of our large customers using our field service product. And it actually connects my Airstream trailer to my house. And when the technician comes out to work on it, well, they’re able to use the agentic capability to learn as much as possible about the product that I’m using and how to fix it and how to repair it, while also managing all the field service and service operations through the traditional system of record in the field service capability…

…We’ve been showing now for a few months, starting at our Tableau conference, the new version of Tableau, where agents surface insights and make recommendations instantly and where agents and humans are working together to make smarter, faster decisions…

…We’re demonstrating to our customers and about to release our new e-mail platform that turns every one-way conversation into a 2-way conversation. And agents are going to turn these one-way e-mails into 2-way conversations…

…If you’ve seen anyone from Salesforce recently, have them show you how we’re using Slack as our interface to our own agentic enterprise, where we have dozens of agents with people and apps and LLMs, all in one conversational agentic workspace. It’s pretty cool. And these agents are operating across apps, departments, silos, all running off of our data cloud, all running off of AgentForce…

…Salesforce is going to lead the way. There’s no question about that. We’ve built the software infrastructure for the agentic enterprise, we have our metadata platform unifying our apps, our data and agents into one powerful agentic operating system. We are rebuilding every single one of our products to be agentic. We’re delivering almost every single one of those products at Dreamforce. And at Dreamforce, you’re going to see all of these products…

…I see the pipeline into H2. Pipeline is growing in the high teens. And for big deals, it’s actually approaching 20% growth. That’s a really good sign. We haven’t seen that kind of pipeline in a long time. The agentic enterprise is really the next incredible investment cycle.

Data Cloud is a critical foundation for Salesforce’s agentic ambition because it provides the data and metadata for accurate output by AI agents; management believes Data Cloud enables Salesforce to have the most accurate AI agents in the industry, with about 90%-ish accuracy; management thinks Data Cloud will be the most strategic and important business for Salesforce; Data Cloud is now a $7 billion business; Data Cloud had 140% year-on-year growth in customers, and usage numbers are growing rapidly; more than half of Fortune 500 companies are on Data Cloud; FedEx is using Data Cloud to save a lot of costs and grow the percentage of customers who signed a contract and proceeded to start shipping by double-digits; Salesforce’s Data Cloud and AI ARR (annual recurring revenue) reached $1.2 billion in 2025 Q2, or FY2026 Q2, up 120% year-on-year (was $1 billion in 2025 Q1); Salesforce closed 60 deals in 2025 Q2 (FY2026 Q2) exceeding $1 million that included Data Cloud and AI; management sees Informatica, together with Data Cloud and MuleSoft, as the 3 components for every company’s AI foundation

Data Cloud is the heart and soul of the success of these agents because it is providing the data and the metadata that you need and the context to get the accuracy. We probably have the most accurate agents in the industry, and the way that we’re achieving that is through our data cloud. It’s this Data Cloud as well as Tableau and MuleSoft and soon Informatica, all working together to really help our customers clean and harmonize their data and provide it in a way that can be consumed by our Agent Force platform to provide this level of accuracy.

I think the data business is probably the most strategic and most important business for Salesforce going forward. And already, it’s a $7 billion business. And Data Cloud is having a great year. It had 140% year-over-year growth in customers and 326% growth in rows accessed via zero-copy integration. The usage numbers are really just off the charts. Over half of the Fortune 500 are already on Data Cloud, but it’s really just the very, very beginning…

…FedEx, and you’re going to see them at Dreamforce, their Chief Operating Officer, Richard Smith, is coming to be part of my keynote. Well, let me tell you that they’ve got unified data across all their platforms now with Data Cloud, and the numbers that they’re telling us that they’re saving, well, I’m not going to take away Richard’s punchline from the Dreamforce keynote, it’s like numbers I’ve never heard in terms of the amount that can be saved by technology. And now if a business customer [isn’t] actively shipping, our own marketing cloud campaign is automatically triggered and sales reps are alerted, and it’s all happening through our Data Cloud. And this idea that FedEx has seen a double-digit increase in the percentage of customers who signed the contract and proceeded to start shipping, it has dramatically surprised them what has been possible in such a short period of time…

…Data Cloud and AI ARR continues to scale, reaching $1.2 billion in Q2, growing 120% year-on-year…

…Data and AI products were in 60 deals greater than $1 million…

…Because AI, as we all know, these large language models only have a certain level of accuracy and it’s not 100%. It’s probably in the 90s when it really gets well-architected with our data cloud and with all the different kinds of capabilities and really advanced techniques that we’ve come up with to make our AI as accurate as it can be…

…We think that every customer is going to need an Informatica, every customer is going to need a MuleSoft and every customer is going to need a Data Cloud. And together, we think that’s called the AI foundation. And that AI foundation is the Data Cloud plus MuleSoft plus Informatica. And if you’re going to roll out Agent Force, you’re going to need an AI foundation made up of those 3 things.

DIRECTV used AgentForce to (1) save billing reps 300 hours of inquiry-handling and (2) execute 50,000 actions in a week with Employee AI Agent; enGen expects to save millions of dollars annually by cutting call times with AgentForce; PenFed expects to save millions of dollars annually by using AgentForce for loan underwriting; Under Armour used AgentForce to double its case deflection rate and increase its customer satisfaction rate by double digits, all in less than 60 days; Reddit used AgentForce to reduce average resolution times from 8.9 minutes to 1.4 minutes; Telepass used AgentForce to power 275,000 agentic conversations over 5 months, and has become one of the fastest-growing AgentForce customers; Pandora has scaled from 1 agent to 3 agents with AgentForce in a single quarter; Indeed has doubled the number of actions taken by its customer-facing agents and has added another agent for internal productivity; Williams Sonoma has deployed AgentForce for only a few weeks, but has expanded from the initial use case of customer support for 1 brand, to customer support for 8 brands and other use cases; the US Army is planning to use AgentForce to support its Human Resource Command; Salesforce has expanded 24/7 instant support to 6 new languages and agents now cover 94% of its global case volume; Salesforce recently launched many new agents for internal use cases; management is aware of the recent MIT study showing that 94% of AI projects in enterprises have failed, but Salesforce’s customers are getting great results; Falabella is using Salesforce’s AI agents to track its order locations and has seen its NPS (net promoter score) increase, its call volume drop by 25%, and 70% of its conversations shift to WhatsApp

DIRECTV saved billing reps nearly 300 hours of inquiry handling with Agent Force. And its Employee AI Agent executed 50,000 actions in a week…

…enGen, an incredible company, projecting millions in annual savings by cutting call times.

PenFed, which we’ve talked about before, is already projecting millions in annual savings by using Agent Force in its loan underwriting…

…Under Armour and Kevin Plank, well, he more than doubled his case deflection rate and boosted customer satisfaction by double digits. And they did it in under 60 days…

…A lot of our employees are excited about Reddit because they’ve reduced average resolution times from 8.9 minutes to 1.4 minutes…

…Telepass, well, they’ve powered more than 275,000 agentic conversations over 5 months. And the way they put it was, “We can’t believe the speed and growth of these conversations just in the last few weeks.” In conversations at the management level, they’ve told us they’ve become one of our fastest-growing AgentForce customers…

…Pandora, the amazing jewelry retailer, Alex’s entire team scaled from 1 agent to 3 in a single quarter…

…Indeed has more than doubled the number of actions taken by their customer-facing agents and added another agent in Slack to drive internal productivity…

…Williams Sonoma, and we’ve only been live for a few weeks, started with Agent Force powering customer support for just one of their brands. I think you know they have quite a few amazing brands like Pottery Barn and West Elm and others. Well, now it’s rolled out across 8 of their brands, as well as agents for other use cases, including a sous chef agent that is helping customers choose cookware and guiding them step-by-step through recipes. They are finding incredible new ways to use the Agent Force platform. And they’re doing it side by side across their entire Salesforce deployment…

…The Army is already planning to launch a digital front door for its Human Resource Command, providing 24/7 powered service and support to all soldiers and personnel and millions of veterans…

…In Q2, we expanded 24/7 instant support to 6 new languages, which combined with English now cover over 94% of our global case volume. Earlier this year, we launched our IT and HR agents in Slack to support our employees. And in July, we launched dozens more specialized agents in Slack…

…Over the weekend, I read that MIT study that’s becoming very popular, which really goes to show that a lot of companies have thought they were on the right path with generative AI, building their own models, doing it themselves, hooking it all up. And now they’re claiming about 94% of those projects have failed. But we’ve been saying that was going to happen for the last several years, as you know. But that’s not what our customers are saying. Our customers are saying that they’re getting phenomenal results and that they have humans and agents working together to create a new level of customer success, or we say it at Salesforce as an agentic enterprise…

…Falabella is the largest retailer in Latin America. Their main use case, they have several, but their main use case is: Where is my order? And they solved that question for customers across the web, in-app and WhatsApp. The pilot took 2 months from idea to production. They access their OMS system. They leverage the CRM data in Salesforce, knowledge articles that we put in Data Cloud. They connect Data Cloud to GCP. And the value is extraordinary. The NPS has increased by 10 points to 70%. Most of the digital interactions, 70% of them, have shifted to WhatsApp, and the call volume has dropped by 25%.

Salesforce will soon launch its agentic IT service platform; many Salesforce customers have been asking for IT services from Salesforce; the agentic IT service platform will be integrated with Slack; the agentic IT service platform will see every IT request become a conversation; management thinks the agentic IT service platform will be a huge growth driver for Salesforce; management thinks traditional ITSM (IT service management) products have served only the very high-end market, but Salesforce’s agentic IT service platform can serve a much wider demographic of customers; Salesforce itself is the first customer of its agentic IT service platform 

The world of ITSM and IT service. It’s an application area that we just haven’t gone to before. But I’m very excited that next month, and you’re going to see this at Dreamforce as well, we’re launching our own agentic IT service platform. A lot of our existing customers have been asking for this. We’re bringing a whole new level of capability. It’s agent-first and it’s Slack-first, that is, right inside Slack, you’re going to be using our agentic IT service capability. It’s natively embedded where employees already work, with zero learning curve…

…With agentic IT service, well, every request is becoming a conversation where agents work hand-in-hand with IT teams proactively fixing their problems. It’s going to be an incredible growth driver for the company…

…It’s a very democratic platform. A lot of the ITSM products have only served the very highest end of the market with maybe 1,000 customers here or 1,000 customers there. But the thing about Slack is that it’s used by about 1 million customers worldwide. And I think all of them are going to be able to benefit from this IT service platform. No one else is delivering this level of agentic capability and digital labor at scale. Now we know how to do this because our own first customer for this, well, it’s us. We are Customer 0.

Salesforce’s management thinks being agent-first will expand Salesforce’s margins in the long run; Salesforce has cut its customer support workforce by 40% because of the efficiency of AI agents

We believe that being agent-first is a key driver of our own long-term margin expansion…

…[Question] We’ve heard software companies say that they have held their head count flat in their support organizations. We haven’t heard anyone saying that they reduced head count by close to 40% there like you have.

Salesforce’s management thinks AI is an extension of SaaS, and not an eliminator of SaaS, because there are still problems that AI cannot solve

There’s a lot that we can resolve automatically through these agents with the customers, but there’s also a lot that cannot be resolved. And that has to be escalated to the humans. And so it’s humans and agents working together to satisfy customer success. And this is what has been extremely important…

…So it’s not about the fundamental, I would say, elimination of SaaS. What I would say, it’s the fundamental extension of SaaS…

…Nothing lasts forever, okay? But I just look at how I’m running my own business and the business of our customers, I don’t understand what the replacement is. So I just look at this incredible next-generation transformational capability, and I’m going to lay it all out at Dreamforce. And by the way, my keynote, I kind of threw away all my slides and I said, let’s just have 12 CEOs of the largest companies on the planet just show you exactly what they’re doing with this technology, because it’s crystal clear what the value proposition is. But to hear some of this nonsense that’s out there in social media or in other places, people say the craziest things, but it’s not grounded in any customer truth.

Salesforce’s management sees Salesforce as being the only company that can bring together deterministic workflows and agentic reasoning

We are the only platform, the only software infrastructure that can bring the deterministic workflows, the data and the agentic reasoning and actioning on the same platform.

Salesforce’s management thinks AGI (artificial general intelligence) will not be coming any time soon

The idea that there is, I’ll just say, again, an AGI, that seems like a fantastical term. I know it’s coming in the next week or 2 evidently. But this idea that there’s some kind of AGI that’s about to take over the whole world. Well, let me just help everybody understand that’s not exactly what’s about to happen.

Salesforce’s management thinks Salesforce is going to see incredible growth in the next 2 years because of AI

We think we’re going to see some incredible growth over the next 6 to 8 quarters…

…My focus is accelerating bookings. I’m very happy with the execution of my team. I’m very positive about what is coming ahead, not just in H2, but also what is coming in the next fiscal year. We’re already thinking about the next fiscal year. We wouldn’t be investing at the rate that we are investing with very — a lot of intentionality in the areas that are growing, in the areas that have higher margin if we didn’t see a great opportunity.

Sea Ltd (NYSE: SE)

Sea’s management is using AI to improve Shopee’s advertising business; sellers who used Shopee’s advertising products rose 20% in 2025 Q2, and sellers who used Shopee’s advertising products grew their ad spend by more than 40% from a year ago

Since early last year, our dedicated ad-tech team has worked hard to improve algorithms, enhance traffic allocation efficiency, and deploy AI technologies to better serve our ad-paying sellers. And we have seen very encouraging results. During the second quarter, the number of sellers using our ad products rose by around 20%, and ad-paying sellers’ average quarterly ad-spend grew by more than 40% year-on-year. Our tech enhancements have allowed us to more effectively optimize Shopee’s GMV and advertising revenue at the same time. We saw an 8% uplift in Shopee purchase conversion rates and improved our ad take rate by almost 70 basis points this quarter, year-on-year.
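As a rough illustration of what the cited take-rate gain means in dollar terms (all figures below are hypothetical, not Shopee’s actual numbers): the ad take rate is advertising revenue divided by GMV, so a ~70-basis-point improvement scales directly with GMV. A minimal sketch:

```python
# What a ~70-basis-point ad take-rate improvement means in dollar terms.
# Take rate = advertising revenue / GMV. All figures are hypothetical,
# not Shopee's actual numbers.
def ad_revenue_from_take_rate(gmv, take_rate):
    return gmv * take_rate

gmv = 25_000_000_000                       # hypothetical quarterly GMV
rate_last_year = 0.0200                    # 2.00% (hypothetical)
rate_this_year = rate_last_year + 0.0070   # +70 basis points, as cited

delta = (ad_revenue_from_take_rate(gmv, rate_this_year)
         - ad_revenue_from_take_rate(gmv, rate_last_year))
print(delta)  # ≈ 175,000,000 of extra ad revenue on the same GMV
```

The point of the sketch is that a take-rate gain is pure monetisation leverage: it lifts ad revenue even if GMV is flat.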

Sea’s management has provided AI tools for Shopee sellers to produce high-quality video content; livestreaming and short-form video orders in Southeast Asia accounted for more than 20% of Shopee’s physical goods order volume from the region; there are now 7 million YouTube videos with Shopee product links embedded, up more than 60% sequentially (was 4 million in 2025 Q1)

Our AI tools empower Shopee sellers to produce high-quality video content, helping them improve user conversion and make more money without having to invest in their own studio set-up. In Southeast Asia, orders from livestreaming and short-form videos accounted for more than 20% of our total physical goods order volume in the second quarter. Our collaboration with YouTube has also continued its strong momentum. As of June, more than seven million YouTube videos featured Shopee product links across our Southeast Asian markets, an increase of more than 60% quarter-on-quarter. 

Sea’s management sees Monee as having 3 unique advantages, namely, (1) integration with Shopee, (2) a large user base who are growing their credit records with Monee, and (3) use of AI to improve credit models

…Three unique advantages that Monee has. First, deep and seamless integration with our Shopee ecosystem. Second, a very large base of users who are growing their credit track records with us over the years. Third, our increasing use of AI to improve our credit models. Together, these advantages uniquely enhance our underwriting capabilities in each market, enabling us to very effectively push for growth across our three credit product lines: on-Shopee SPayLater, off-Shopee SPayLater, and cash loan products.

Sea’s management has been using AI a lot in general recommendations, leading to improvement in conversion rates as the system can better understand user intention

We also use AI a lot in our general recommendations, and this improved our conversion rate quite a lot by understanding user intention better, by understanding the buyer’s query better.

Sea’s management is using AI to generate images for product descriptions

We also spent a lot of effort on the AIGC initiatives that we can generate a lot more attractive pictures for the product descriptions.

Shopee’s customer service chatbot is 80% managed by an AI agent; the use of AI in Shopee’s chatbot helps sellers both reduce cost and increase the potential for upselling when interacting with consumers

On the customer interaction side, we — our customer service chatbot is 80% managed by AI agent. We’re also helping the seller to interact with the buyers through the CS chat by agent as well, not only reducing the cost for the sellers, but also improve the upselling potential for the sellers while talking to the buyers.

Sea’s management is actively using AI to improve Sea’s internal operations

The second type is to improve our internal operations. For example, obviously, the product development side, but also many of our daily operations like, for example, if you look at the way we run our marketing campaigns, a lot of my campaign are very automated right now through AI tools. Many of the process to process the payment are AI-enabled through the agent, et cetera.

Sea’s management is very excited about the use of AI in the gaming industry; management thinks the gaming industry will be among the first batch of industries to benefit from advancements in AI; management has seen AI improve productivity in game development by generating art work; management thinks AI agents can improve the gaming experience for players who prefer to play solo games; management wants to explore the use of AI to generate content and have personalised gaming experiences instead of the current format where the gaming experience is preset

We are very, very excited about the AI perspective in the game industry. And personally, I believe game industry will be among the first batch of industries largely benefited by the AI advancements and the technologies.

And so far, like we have seen a lot of kind of upside on the — actually on the development and the production side. And say, for example, like for — to develop any new content new map, we need to generate a lot of original arts. And now a lot of like very, very basic arts can be generated by AI. So it’s — the quality is very, very decent in terms of the efficiency, the volumes are generated and the varieties are generated is I mean, you can imagine it’s much, much better than what human can do. So this has largely improved our productivity, and it’s really, really exciting.

And like on the — as you mentioned from the gamers like engagement perspective, like — so there is a very, very clear opportunity we have seen in the use cases like we do believe like, say, for example, Free Fire is a very, very social game. It’s designed for team play. So it’s like there’s much, much more fun if you play with other players, and there’s a much more combination of the strategy, the technique you can use than you play as a solo gamer. But we observed in Free Fire, we still have a very, very sizable gamers like only play solo games. I mean they enjoyed, but I think they haven’t really fully experienced the amazing part of the game. And maybe because of they’re shy, they don’t know how to reach out to other players. So as we think like the AI-enabled bots, it’s kind of like their — it’s an AI game agent like as their teammates as peers for them to play the game together kind of play a brother’s roles, sister’s roles and coach roles in the game and give them a little bit flavor of how this interaction will kind of feel and taste in the game play and as an encouragement for them to reach out to play as a team rather than individuals. I think that largely helped on the retention.

And furthermore, I think we are very actively experiencing and trying to figure out how to kind of leverage the generative AI to let gamers and to generate the content rather than, okay, so now all today’s game experience are preset and how the experience will look like. And I think with the AI tools, actually, this experience can be much more immersive and much more interactive and much more individualized.

Tencent (OTC: TCEHY)

Tencent’s management added AI-powered citation to content on Weixin; management is using LLMs (large language models) to help merchants with customer inquiries and personalized product recommendations; Yuanbao, Tencent’s AI chatbot, can now be added as a Weixin contact for users to interact with; management is enhancing the Yuanbao app and is pushing for growth in DAUs (daily active users)

On the AI front, we added AI-powered citation to content so that users reading official accounts articles or video accounts comments can activate contextual AI commentary on related information. We upgraded Mini Shops customer service with large language model capabilities to provide merchants with more intelligent responses to customer inquiries and personalized product recommendations. We enabled Yuanbao as a Weixin contact to interpret and summarize video accounts content. Meanwhile, we are rapidly enhancing the functionalities of our AI native app Yuanbao, and we’ll share more details about how we are growing the DAU later this year.

Tencent’s management is seeing AI becoming an increasingly important driver of growth in Tencent’s Domestic Games and International Games businesses; management is applying more AI tools to increase the speed and scale of content production in Tencent’s games; AI allows Tencent to provide more human-like virtual teammates to solo-gamers and more realistic non-player characters in games; management is using AI in marketing activities for its games for more efficient targeting

Reviewing the progress of our game business domestically and internationally in recent months, AI has become an increasingly important driver of its growth in terms of game content, game engagement and game monetization. We’re increasingly applying AI tools to boost the speed and scale of content production across our major games. AI allows us to provide more human-like virtual teammates in our competitive PvP games and to power more realistic nonplayer characters in our story-driven PvE games. And we’re using AI in our game marketing activities to more efficiently target marketing spending towards the users most likely to activate and remain in each game.

The Marketing Services segment’s revenue was up 20% year-on-year in 2025 Q2 because of AI upgrades in its advertising platform, and more closed-loop advertising involving Weixin’s ecosystem; the AI upgrades included better AI capabilities in ad creation, placement, recommendation and performance analysis; the AI upgrades led to higher click-through rates, conversions and ROI for advertisers; Video Accounts’ Marketing Services revenue grew 50% year-on-year in 2025 Q2; Mini Programs’ Marketing Service revenue grew 50% year-on-year in 2025 Q2; Weixin Search revenue grew 60% year-on-year in 2025 Q2, driven by the use of Tencent’s LLM (large language model) to deepen understanding of merchandise and of user consumption intent; most of the advertising revenue growth in 2025 Q2 came from higher revenue per impression partly because of AI-driven increases in the click-through rate

For Marketing Services, revenue grew 20% year-on-year to RMB 36 billion in the quarter, benefiting from AI-powered adtech upgrades and from increased closed-loop advertising arising from Weixin’s transactional ecosystem. We expanded AI capabilities in areas including ad creation, placement, recommendation and performance analysis, which had the effect of boosting click-through rates, conversions and ROI for advertisers. Specifically, we upgraded our ad platform architecture by deploying a scaled-up foundation model, which analyzes advertisement click-through rates and transactions across multiple apps and services as well as user interactions across text, image and video to determine user interest and optimize ad performance in real time.

By property, Video Accounts marketing services revenue rose approximately 50% year-on-year due to more traffic and more transactional activity within Video Accounts. Mini Programs marketing services revenue also increased about 50% year-on-year. Activity within Mini Games and Mini Dramas created a flywheel effect, which drives more developers to use our closed-loop marketing solutions to promote their services. And Weixin Search revenue grew around 60% year-on-year due to more consumer and advertiser interest in Mini Program search results and to enhanced ad relevance as we leverage our large language model to deepen understanding of merchandise and of user consumption intent…

…In the second quarter, the majority of the advertising revenue growth of 20% year-on-year arose from higher revenue per impression. And that, in turn, was primarily due to a higher click-through rate arising from deploying AI, although also to higher revenue per click arising from more closed-loop activity with mini shops and mini games.

Within the Fintech and Business services segment, Business Services revenue grew in the teens year-on-year in 2025 Q2; Cloud Services revenue growth accelerated in 2025 Q2 from increased revenue from providing GPUs and API tokens for customers’ AI needs; management is focused on growing Business Services at an accelerated rate without being hampered by fluctuations in GPU supply

Business services revenue grew at a teens rate year-on-year. Cloud services revenue growth accelerated versus recent quarters, benefiting from increased revenue from providing GPUs and API tokens for customers’ AI needs. Fees collected on Mini Shops transactions continue to grow at a rapid rate and business services gross margin rose year-on-year due to improved efficiency and positive mix shifts…

…We’ve put our cloud business onto a more sustainable base as well as improve the cost competitiveness of the supply chain for our cloud business, we can — we are refocusing on growing revenue at an accelerated rate versus the prior rate without depending too much on the vagaries of the GPU supply situation. So if we do have sufficient GPUs that we can rent out more in the cloud, then we’ll do so. But our cloud strategy is not dependent on the GPUs. We’re also growing in CPU, in storage, in database, in CDN and so forth. So that’s on the cloud side.

Tencent’s management has enhanced the data quality and diversity of Hunyuan, Tencent’s proprietary foundation model; Hunyuan 3D model has become the No.1 3D generative model on Hugging Face; game developers, 3D-printing companies and designers are increasingly using Hunyuan 3D; management wants to continue improving Hunyuan, and sees many dimensions for doing so; when Hunyuan improves, all of Tencent’s AI services also improve

For HunYuan, we enhanced our data quality and diversity through data augmentation and synthesis and implemented more effective pretraining and post-training scaling. HunYuan 3D model has become the top ranked 3D generative model on Hugging Face due to its geometric precision, texture fidelity and prompt 3D alignment capabilities. Game developers, 3D printing enterprises and design professionals are increasingly using the HunYuan 3D model for their digital asset generation needs…

…In terms of the model, I would say there’s actually a lot to be done, right? And I would say sort of in the broad bucket, there is the large language model itself, and we want to keep improving the LLM itself. And that actually involves improvement along a number of different dimensions, including making sort of the data sort of higher quality and more comprehensive. That includes making the pretraining more efficient and more effective and improving the pretraining model that includes improving the post-training and reinforcement learning processes in basically extracting the capability of the pretrained model and that includes improving our infrastructure so that we can actually train more efficiently as well as inference more efficiently, right?…

…When we have an improved LLM, it’s actually sort of the foundation for all our AI services. And in particular, it would improve our search and productivity-related services…

…We also want to improve the multimodal capability of our model so that we can actually provide more customized functions for the users in Yuanbao, right? Within Yuanbao, it’s not — people are not just using it for search and productivity-related activities. They are using it for all kinds of different multimodal activities. They may want to speak, they may want to turn text into pictures, turning pictures into text and there are a lot of multimodal conversions within Yuanbao, which we actually need to have very strong capability for…

…I think the third broad category is actually coding and agents, right? So that if we can sort of keep improving, then basically, we can provide much better coding environment for both ourselves as well as our enterprise customers. And at the same time, that would enable a better agent and instruction follow capability for our agent. I think that’s particularly important for Weixin going forward and as we build an agent for Weixin that can be personalized assistant to the Weixin users in a personalized way.

Tencent’s management thinks Tencent’s advertising revenue can grow at a healthy rate for a long time; the drivers of future growth for the advertising revenue come from (1) higher click-through rate, where AI delivers better targeting and thus more clicks, (2) more traffic, including traffic within Tencent’s AI-native experiences, (3) higher revenue per click, as generative AI used for creating the ads results in more ad demand, (4) closed-loop e-commerce transactions driving higher advertising demand, and (5) higher advertising load; management does not expect any meaningful impact to Tencent’s advertising business from the new advertising law for gaming company sales and marketing because the advertising business has ample diversification, and the AI-related improvements management is making are a far more important variable; management could accelerate advertising monetisation if the cost of deploying AI throughout Tencent suddenly spikes

On the advertising and the potential, we continue to believe that we enjoy a long and lengthening runway for continuing to grow our advertising revenue at a reasonably healthy rate. And that length of the runway reflects upside in a number of the key variables that determine our marketing services revenue, including the click-through rate where AI delivers better targeting and thus more clicks, including traffic where we see growth in video accounts traffic and search traffic over time, in traffic within our AI native experiences, including revenue per click as generative AI used for creating the ads results in more ad demand as well as e-commerce closed-loop transactions resulting in more ad demand. And then finally, in ad load, where, as you know, for short video, our ad load is currently in the low to mid-single digits versus our peers who are in the low to mid-teens…
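The levers management lists multiply together, so modest gains in each compound. A toy decomposition of ad revenue, with purely hypothetical numbers (none of these are Tencent figures):

```python
# Toy model of ad revenue as a product of the levers Tencent's management
# lists: traffic, ad load, click-through rate, and revenue per click.
# All numbers below are hypothetical illustrations, not Tencent figures.
def ad_revenue(traffic, ad_load, ctr, revenue_per_click):
    """Revenue = impressions * clicks-per-impression * revenue-per-click."""
    impressions = traffic * ad_load
    clicks = impressions * ctr
    return clicks * revenue_per_click

base = ad_revenue(traffic=1_000_000, ad_load=0.04,
                  ctr=0.02, revenue_per_click=5.0)
# A 10% gain in each of the four levers compounds to ~46% revenue growth
improved = ad_revenue(traffic=1_100_000, ad_load=0.044,
                      ctr=0.022, revenue_per_click=5.5)
print(improved / base - 1)  # ≈ 0.4641, i.e. 1.1**4 - 1
```

This is why management can talk about a “long and lengthening runway”: each lever is independently below its ceiling (ad load especially, at low-to-mid single digits versus peers in the low-to-mid teens), and improvements stack multiplicatively rather than additively.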

…[Question] About the impact on the new advertising law for gaming company sales and marketing. Under the new ad regulation effective in July, sales and marketing spending in excess of 15% of revenue will need to pay an additional 25% tax. So how do you expect this to affect our advertising income, especially for mini games, which heavily rely on traffic acquisitions, i.e., the sales and marketing could easily surpass this 15% revenue threshold?

[Answer] We don’t expect a meaningful impact. Our advertising business has become quite broad-based over time. And if you look at the second quarter, there was an adverse impact from the food delivery companies and some of the e-commerce companies ramping up in food delivery, reducing their advertising spend as they invested more in subsidies. But despite that, our advertising revenue grew 20% year-on-year. So in our view, there’s always going to be individual blips up and down in terms of individual categories. But what we’re doing in terms of deploying AI within advertising is a much more important variable…

…Now of course, if the cost of deploying AI, including GPU depreciation was suddenly to step up and become very burdensome, we could accelerate the advertising monetization, but we don’t see the need to do that right now.

There are 4 broad categories of AI features across Tencent’s ecosystem, namely (1) the AI-native app Yuanbao, (2) AI-enabled search, (3) features within games, and (4) features within productivity tools; management thinks it’s still early in observing user behaviour 

In terms of the AI features, right, I think there is sort of broadly speaking, a number of these features. One is obviously our Yuanbao, which is an AI native app. And then I would say it’s related to search, AI-enabled search. So that lands on our browser that also lands on WeChat search. And then there’s a whole host of different features within even games, right, when we have AI-enabled players or in our productivity tool, for example, summary of meetings in our Tencent Meeting and assistance within our Tencent docs, right, to help people to write. I would say we’re still at an early stage in observing the user behavior.

Tencent’s management has so far not seen any major negative impact on Search from the use of AI to produce search results

The one sort of negative impact that you are pointing to is when there is AI-assisted search, whether it would just show the content rather than leading people to the pages. We have not seen a very big impact on that. I think overall, people tend to be more satisfied in getting the answer directly. And if they want to explore the topic more, they would click on the different links and articles. So I think overall, it’s actually not that much of an impact.

Tencent’s management is currently providing a lot of AI features for free and they are managing the AI-related costs of these features in a granular way such as using smaller models when applicable and improving the efficiency of inference with software; management wants to eventually monetise these AI features, but they think it is really hard for the user-paid model – popular in the US now for monetising AI models – to work in China; management currently prefers monetisation through advertising; management is seeing AI being monetised in Tencent by contributing to the growth of the overall business

[Question] You guys continue to offer increasingly more AI features to consumer free of charge, the delivery of these AI features is a lot more expensive than mobile Internet services, which will potentially hurt Tencent’s cost structure. Will management consider to start directly monetizing these consumer-facing AI features in the next 1 or 2 years?

[Answer] We are actually managing the cost in a relatively granular way, right? I think there are a lot of places in which if we can use smaller models, we’ll be using smaller models and the cost will be sort of much lower than using the flagship model. And so in a lot of these use cases, the cost is manageable if we can use smaller models. And at the same time, if we continue to improve the efficiency of inference through software upgrades.

And as it relates to whether we would be monetizing eventually — I think eventually, there should be some monetization. I think in China, in reality, it’s actually very hard to use the user paid model, which now populates the U.S. AI tools. And I think over time, we’ll try to figure out whether there will be some ad-supported way of monetizing. But at the same time, I want to point out that AI is already contributing to the growth and monetization of our existing businesses in different ways, right? So somehow we could also fund part of this “subsidy” for AI usage by the users through the growth in our other businesses.

Tencent’s management does not have a definitive answer on the import of US chips for AI but Tencent has sufficient chips for model training; management thinks Tencent has many options for chip-providers for AI inference; management is using software to drive inference efficiencies

With respect to the acquisition of chips, especially the U.S. chips, right, the answer is that we don’t really have a definitive answer on the import situation yet. I think there’s a lot of discussion between the 2 governments, right, and waiting to see what exactly come out of that.

But from our own perspective, we do have enough chips for training and continuous upgrade of our existing models. And we also have many options for inference chips. And we are also executing a lot of software improvement and upgrade in order to drive efficiency gain in inference so that we can actually put more workload on the same number of chips.

Tencent’s management sees higher depreciation expenses in the future because of AI-related investments but Tencent’s business is also growing because of the use of AI; the increase in expenses and revenue may not always match up, but both are definitely growing

I would say the depreciation cost related to AI will definitely continue to go up. But at the same time, we also see that we continue to reap the benefits of AI. And the issue is that these 2 may not match each other completely, but I think both of them will be moving in the same general direction.

Tencent’s management is tracking Tencent’s progress in AI in a number of ways, namely, (1) how AI is helping Tencent’s existing businesses, (2) performance and quality of Hunyuan, (3) usage of the Yuanbao app, (4) progress in AI products within the entire Tencent ecosystem 

We do track our AI development progress very closely. And I think there are a number of indicators that we use right in tracking the progress.

And the first one is that we focus on tracking how AI is actually helping our existing businesses such as ads, such as games, such as FinTech. And I think that’s one area. And when we see that AI is actually being applied in driving the efficiency gain as well as the growth of these businesses, then that’s good. 

Secondly, we focus on tracking the performance and quality of our large language model, HunYuan. And I think there’s a lot of metrics that we actually have to use in order to track the capability as well as the quality of the model.

The third one is we do track how our AI app is actually growing. How many users are using our AI app. And that would include users of our Yuanbao and users of our browser and user of our AI-powered search.

And finally, I would say we do track what’s the progress in the design of other AI-related innovative products within our entire ecosystem. And that would include, for example, the AI agent for WeChat that would include agents within our productivity tools. And these are the metrics that I think we will use in terms of tracking the progress of our AI development.

Veeva Systems (NYSE: VEEV)

Veeva’s management has made great progress with Veeva AI, an initiative launched in April 2025 that will see the company build industry-specific AI agents within its applications; the first AI agents under Veeva AI, for Vault CRM and commercial content, are on track for a December 2025 launch; management plans to release new AI agents and improve existing AI agents 3 times a year; management plans to deliver a host of new AI agents in 2026 and will launch Clinical data agents in 2027; management sees Veeva Business Consulting as an important part of Veeva AI because AI enables new ways of working for Veeva’s customers; Veeva is already working on its first AI-related Business Consulting project; management thinks Veeva AI will increase the value of integration between clinical data management and clinical operations for customers; management thinks Veeva will lead in industry-specific AI agents in Life Sciences because of the deep data that resides in Veeva’s software products; management will allow customers to create their own AI agents with Veeva AI; management thinks Veeva AI will create billions of dollars of value in the Life Sciences industry and Veeva will be able to capture its fair share of the value creation; management does not expect any material revenue contribution from Veeva AI in 2026 or 2027; management thinks it’s still early for customers to go all-in on AI with Veeva because the company has not released any AI agents yet; management will enable Veeva’s AI agents to communicate with AI agents from other software platforms because that is of great benefit to customers

We are making great progress on Veeva AI which adds agentic AI to the Vault Platform and industry-specific AI agents in all Veeva applications. With agentic AI in the Vault Platform, we have an integrated platform that manages data, content, and agents together in a secure and maintainable way. Customers can use and extend our application agents and create custom agents of their own. This is a very fundamental change in the Vault Platform…

…Our first agents are on track for December release in CRM and commercial content. We will release new agents and improve existing agents with our releases three times a year. In 2026 we plan to deliver agents for clinical operations, regulatory, safety, quality, medical, and commercial. Clinical data agents are planned for 2027.

Veeva Business Consulting is a critical part of Veeva AI, helping customers with change management because AI enables new ways of working. We are already working on our first Business Consulting project for AI in the commercial content area…

…We continue to see customers looking for an integrated clinical platform across clinical data management and clinical operations. The value of integration is compelling and will only increase with Veeva AI…

… Veeva Vault platform, we started that in 2010, actually, late 2010. It was around this. They had content and it had data and they could do both. And that was very unique and users work with content and data and so we were able to make integrated suites in clinical and quality and regulatory and safety. And that’s what we’ve been doing for the last 15 years and working very hard at it and making these deep industry applications, the business rules around all the data and the content. Now this is the next phase where we’re going to have agents. We still have our data, we have our content. We have our agents and the users are going to interact with all and the agents also interact with the content of the data. So it’s a fundamental new thing. And what we — we’ve led really and are leading in this industry cloud area, industry-specific cloud applications. I think we’re going to lead in industry-specific agents and certainly inside life sciences…

…Customers can create their own custom agents, but mainly it’s our industry-specific agents that they’ll get when they buy Veeva AI. With MCP, the Model Context Protocol, and agent-to-agent protocols, interoperability is really easy, and also Vault-to-Vault interoperability. In terms of monetizing that, we will create billions of dollars of value for the industry. No doubt about that. No doubt about that. Sometimes that means making humans much more efficient, sometimes reducing the need for certain people doing certain types of tasks. So there’s a tremendous amount of value to be captured by the industry, and we’ll get our fair share of that for sure…

…I don’t expect any material revenue contribution for ’26 or ’27, for example, but I expect it’s a significant increase in our market size. And that will play out over many years…

…I think it’s early for customers to be going all in on AI with Veeva AI because we haven’t even released any agents yet, so we’ve got to work with our first early adopters and work that out…

…We’re architecting it in such a way that if you have an agent inside of Veeva, it can talk to an agent that might be inside of SAP or Workday or Salesforce, and vice versa. I think that’s going to be one of the unheralded benefits: people don’t realize how much of a benefit it is when you have agents that can talk to agents across systems because they’re all following a common protocol. It’s much less brittle than wiring things up with MuleSoft and transferring data back and forth. I’m really excited about that potential. It enables system-to-system communication, but it also helps the user. I might be in Microsoft Office, and I might say, “File this document in the TMF.” Well, the Microsoft Office Copilot may have that agent, the TMF filing agent from Veeva, registered with it. So it asks, “Do any of the agents know how to do this?” The TMF agent from Veeva AI does. Okay, I’ll hand the document over to you, and away it goes.

Veeva’s management thinks Veeva has a structural advantage in AI in the Life Sciences industry because the company’s products are a system of record for customers, and the company has deep applications

[Question] Going back to the idea around the opportunity with AI, how are you thinking about Veeva’s platform approach, the network you’ve built, the scale you’ve built, giving you the right to win as you embed more AI functionality across the platform?

[Answer] We refer to that as a structural advantage. When you have an application that’s a system of record, be it the e-mail system or the supply chain system or the 50 or so applications Veeva has that are deep in life sciences, from the CRM system to the drug safety system to the clinical trial management system. When you have that system of record with the users in there, you have the right to win the deep industry-specific agents because it’s in the user’s workflow. Think about it: if you use Google for your e-mail and your calendar, you would love an agent from Google that works seamlessly with that, if you could get it. So we have a right to win there. You called it a right to win; I call it a structural advantage. We can knit that technology together so that it’s a seamless platform that handles the agents, the content, and the data. Another thing Veeva has is a platform that’s broad. We make about 50 applications with our platform, so we can touch a lot of things with it. We put it in the Vault platform once, and it can extend everywhere. So we have a structural advantage…

…[Question] Around AI and agents. Could you just sort of articulate what you view as the unique differentiator from an architecture perspective of Vault versus agent force or even the back end of IQVIA? Like what do you think puts you at an advantage?

[Answer] Our main advantage is that we have the deep applications. If we take a clinical example again, we have the clinical trial management application. That houses all the people that deal with clinical, all the data about clinical, all the business rules, all the content, and all the security about clinical trials. So with Veeva AI, when we build an application agent, it’s built inside of the Vault platform. It inherently knows all the security rules and how to deal with them, and it is running in the Vault application server, so it also has transaction control. It can update the data and the content. It can act on behalf of the user inside of a workflow in a transactionally sound way. So that’s a structural advantage if you have the application.

Veeva’s management thinks AI agents will be doing some of the things humans do, which will either free up productive time for humans, or reduce the need for humans

If you look at areas within safety and clinical, there are some areas where hundreds of millions of dollars of outsourced labor are used to do processing-type things. I think agentic AI can maybe remove the need for half of that. If you look at a clinical trial master file, an agent is going to be pretty good at putting a document where it should go and telling you if you have all the documentation you need for that trial based on the protocol. Is any document blurry, is any document illegible, et cetera? It’s going to be really darn good at that stuff. It will be different by area, but some of the things that humans can do, agentic AI is going to be able to do. That either frees up more time for humans to be more productive on what they need to do, or reduces the need for humans.

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, Adyen, Alphabet, Amazon, Meta Platforms, Microsoft, MongoDB, Nu Holdings, Okta, Salesforce, Sea Ltd, Tencent, and Veeva Systems. Holdings are subject to change at any time.

An Investing Legend’s Thoughts on Investing in Thrift Conversions

Notes from an investing legend’s book on how we can research and invest in thrift conversions 

Earlier this year, I wrote a number of articles on The Good Investors on investing in thrift conversions (see here, here, and here). An important part of my learning process on thrifts came from investing legend Peter Lynch, who is revered for his track record when managing the Fidelity Magellan Fund. From 1977 to 1990, Lynch generated an annualised return of 29%, nearly double that of the S&P 500 over the same period.

Although his investing book One Up on Wall Street is well-known and highly popular, Lynch actually wrote a few other lesser-known books on investing including Beating The Street. The latter is the source of what I learnt about investing in thrift conversions from Lynch. 

Because Beating The Street is not widely known, and because I find studying thrift conversions as potential investments to be a fascinating activity, I thought it would be useful to share my notes from Beating The Street on how Lynch thought about investing in thrift conversions. 

What’s shown between the two horizontal lines below, besides the section-headers, are direct quotes from Lynch’s book. Do note that the emphases are mine.


On investing in S&Ls (Savings & Loans institutions)

Prior to the 1980s, Golden West was one of the few S&Ls that was a public company. Then, in a rash of stock offerings in mid-decade, hundreds of the formerly private thrifts, operating as “mutual savings banks,” went public more or less simultaneously. I acquired many of these for the Magellan Fund. I was so selective in my purchases during this period that anything that had the word “first” or “trust” in it, I bought. Once, I confessed to the Barron’s panel that I’d invested in 135 of the 145 thrifts whose prospectuses had landed on my desk. The response from Abelson was typical: “What happened to the others?”

There are two explanations for my indiscriminate and sometimes fatal attraction for S&Ls. The first is that my fund was so big and they were so small that to get enough nourishment out of them I had to consume large quantities, like the whales who are forced to survive on plankton. The second is the unique way that S&Ls came public, which made them an automatic bargain from the start. (To learn how you, too, can get something for nothing, turn to page 215.)

On acquisition statistics for S&Ls

The experts at SNL Securities in Charlottesville, Virginia, who keep tabs on all the thrifts in existence, recently provided me with an update on what happened to the 464 S&Ls that came public after 1982. Ninety-nine of these were subsequently taken over by bigger banks and S&Ls, usually at a large profit to the shareholders. (The watershed example is the Morris County [New Jersey] Savings Bank. The initial offering price in 1983 was $10.75 a share, and Morris was bought out three years later for $65.) Sixty-five of the publicly traded S&Ls have failed, usually at a total loss to the shareholders. (I know this from personal experience because I owned several in this category.) That leaves 300 still in business.

On how to study an S&L

If you decide to pursue the subject of undervalued S&Ls – which to me is much more exciting than any trip to Hawaii – you’d be well advised to seek out the latest copy of The Thrift Digest at the local library or to borrow one from your broker. I borrowed mine from Fidelity. 

I spent so much time with my nose in this book before dinner, during dinner, and after dinner that Carolyn began to refer to it as the Old Testament. The Old Testament in hand, I devised my own S&L scorecard, listing 145 of the strongest institutions by state and jotting down the following key details. This, in a nutshell, is everything you need to know about an S&L:

Current price

Self-explanatory.

Initial offering price

When an S&L is selling below the price at which it came public, it’s a sign that the stock may be undervalued. Other factors, of course, must be considered.

Equity-to-assets ratio 

The most important number of all. Measures financial strength and “survivability.” The higher the E/A, the better. E/As have an incredible range, from as low as 1 or 2 (candidates for the scrap heap) to as high as 20 (four times stronger than J.P. Morgan). An E/A of 5.5 to 6 is average, but below 5, you’re in the danger zone of ailing thrifts. 

Before I invest in any S&L, I like to see that its E/A ratio is at least 7.5. This is not only for disaster protection, but also because an S&L with a high E/A ratio makes an attractive takeover candidate. This excess equity gives it excess lending capacity that a larger bank or S&L might want to put to use.

Dividend

Many S&Ls pay better-than-average dividends. When one of them meets all the other criteria and also has a high yield, it’s a plus.

Book Value

Most of the assets of a bank or an S&L are in its loans. Once you assure yourself that an S&L has avoided high-risk lending (see below), you can begin to have confidence that its book value, as reported in the financial statements, is an accurate reflection of the institution’s true worth. A lot of the most profitable Jimmy Stewarts are selling at well below book value today.

Price-Earnings ratio

As with any stock, the lower this number, the better. Some S&Ls with annual growth rates of 15 percent a year have p/e ratios of 7 or 8, based on the prior 12 months’ earnings. This is very promising, especially in light of the fact that the overall p/e of the S&P 500 was 23 when I did this research.

High-Risk Real-Estate Assets

These are the common problem areas, especially commercial loans and construction loans, that have been the ruination of so many S&Ls. When high-risk assets exceed 5-10 percent, I begin to get nervous. All else being equal, I prefer to invest in an S&L that has a small percentage of its assets in the high-risk category. Since it’s impossible for the casual investor to analyse a commercial lending portfolio from afar, the safest course is to avoid investing in S&Ls that made such loans.

Even without The Thrift Digest, it’s possible to do your own calculation of high-risk assets. Check the annual report for the dollar value of all construction and commercial real-estate lending, listed under “assets.” Then find the dollar value of all outstanding loans. Divide the latter into the former, and you’ll arrive at a good approximation of the high-risk percentage.
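The back-of-the-envelope calculation Lynch describes can be sketched in a few lines of Python. All of the figures below are hypothetical, purely for illustration:

```python
# Approximating an S&L's high-risk asset percentage, per Lynch's method:
# (construction + commercial real-estate lending) / all outstanding loans.
# Every number here is made up for illustration only.
construction_loans = 12_000_000    # construction lending, from "assets" ($)
commercial_re_loans = 18_000_000   # commercial real-estate lending ($)
total_loans = 400_000_000          # all outstanding loans ($)

high_risk_pct = (construction_loans + commercial_re_loans) / total_loans * 100
print(f"High-risk assets: {high_risk_pct:.1f}% of loans")  # 7.5%, inside Lynch's 5-10% worry zone
```

A thrift like this hypothetical one would make Lynch "begin to get nervous", since the result lands in the 5-10 percent range he flags.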

90-Day Non-performing assets

These are the loans that have already defaulted. What you want to see here is a very low number, preferably less than 2 percent of the S&L’s total assets. Also you’d like this number to be falling and not rising. An extra couple of percentage points’ worth of bad loans can wipe out an S&L’s entire equity.
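Taken together, the numeric thresholds Lynch gives so far (an equity-to-assets ratio of at least 7.5, high-risk assets below roughly 5 percent, and 90-day nonperforming assets below 2 percent of total assets) amount to a simple screen. A minimal sketch in Python follows; the function name and the sample figures are my own inventions:

```python
def passes_lynch_screen(equity_to_assets, high_risk_pct, nonperforming_pct):
    """Apply the three numeric thresholds Lynch describes for S&Ls.

    equity_to_assets: equity as a % of assets (he wants at least 7.5)
    high_risk_pct: commercial/construction lending as a % of assets (under ~5)
    nonperforming_pct: 90-day nonperforming assets as a % of assets (under 2)
    """
    return (equity_to_assets >= 7.5
            and high_risk_pct < 5.0
            and nonperforming_pct < 2.0)

# Hypothetical thrifts, for illustration only
print(passes_lynch_screen(9.0, 3.0, 1.0))   # True: strong, conservatively run thrift
print(passes_lynch_screen(5.5, 12.0, 2.5))  # False: average E/A, risky loan book
```

Of course, Lynch also weighs qualitative factors (dividends, book value, REO trends), so passing a screen like this is a starting point for research, not a buy signal.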

Real Estate Owned

This is property on which the S&L has already foreclosed. The REO category, as it’s called, is an index of yesterday’s problems, because whatever shows up here has been written off as a loss on the books. 

Since this financial “hit” has already been taken, a high percentage of real estate owned isn’t as worrisome as a high percentage of non-performing assets. But it’s worrisome when REO is on the rise. 

S&Ls aren’t in the real-estate business, and the last thing they want is to repossess more condos or office parks that are expensive to maintain and hard to sell. In fact, where there’s a lot of REO, you have to assume that the S&L is having trouble getting rid of it.

Why larger banks want to acquire S&Ls

An S&L with excess equity, excess lending capacity, and a loyal depositor base is a prize that commercial banks covet. Commercial banks can take in deposits only in their home states (this rule is changing, to some degree), but they can lend money anywhere. This is what makes taking over an S&L a very tempting proposition.

If I were the Bank of Boston, for instance, I’d be sending love notes to Home Port Bancorp of Nantucket, Massachusetts. Home Port has a 20 percent equity-to-assets ratio, making it perhaps the strongest financial institution in the modern world. It also has a captive island market with crusty New England depositors, who aren’t about to change their banking habits and run off to a new-fangled money-market fund. 

Maybe the Bank of Boston doesn’t want to make loans on Nantucket, but once it acquires Home Port’s equity and its deposit base, it can use the excess lending capacity to make loans in Boston, or anywhere else around the country.

During 1987-90, a terrible period for S&Ls, more than 100 were acquired by larger institutions that saw the same sort of potential the Bank of Boston ought to see in Home Port. Banks and thrifts will continue to consolidate at a rapid rate, and with good reason. Currently, the U.S. has more than 7,000 banks, thrifts, and other assorted deposit takers – which is about 6,500 too many.

How an S&L’s business model works

An S&L needs loyal depositors to keep money in their savings and checking accounts. It needs to make money on that money by lending it out – but not to borrowers who default. And it needs low operating expenses in order to maximise its profits. Bankers like to live on threes and sixes: borrow money at 3, lend money at 6, play golf at 3.

Examples of S&Ls that Lynch recommended

GLACIER BANCORP

I’d opened my Glacier Bancorp file. The stock was selling for $12 a share, a 60 percent gain over the year before. This was a 12-15 percent grower selling at 10 times earnings – not a spectacular bargain, but there wasn’t much risk in it either.

Glacier Bancorp used to be called the First Federal Savings and Loan of Kalispell, and I wish they’d kept the old name. It sounded antiquated and parochial, which to me is always reassuring. I’d rather have antiquated and parochial than trendy and sophisticated, which usually means a company is desperate to improve its image.

I like companies that stick to business and let the images take care of themselves. There is this unfortunate tendency among financial institutions to take the “bank” out of their names and replace it with “bancorp.” I know what a bank is, but “bancorp” makes me nervous.

Anyway, whoever answered the phone at Glacier Bancorp in Kalispell told me they were having a retirement party for one of the officers, but they’d inform chairman Charles Mercord that I called. They must have dragged him out of the party, because a few minutes later Mercord called me back.

Asking a president or a CEO about a company’s earnings is a ticklish proposition. You’re not going to get anywhere by blurting out, “What are you going to earn next year?” First you have to establish rapport. We chatted about the mountains. I said that the entire Lynch family had been to all the Western states to see the national parks, and that we loved Montana…

…Then I began to slip in more serious investment-type questions, such as “What’s the population out there?” and “What’s the elevation of the town?,” leading up to the more substantive “Are you adding any new branches or standing pat with what you’ve got?” I was trying to get a sense of the mood at Glacier.

“Anything unusual in the third quarter?” I continued. “You made thirty-eight cents, I see.” It’s best to pepper these inquiries with bits of information, so that your source thinks you’ve done your homework. 

The mood at Glacier Bancorp was upbeat. Non-performing loans were almost nonexistent. In all of 1991, this bancorp had had to write off only $16,000 in bad loans. It had raised its dividend for the 15th year in a row. It had just bought out two other thrifts with wonderful names: the First National Banks of Whitefish and Eureka, respectively.

This is how many of the stronger S&Ls are going to speed up growth in the next few years. They are acquiring the valuable deposits of troubled and defunct S&Ls. Glacier can fold the First National of Whitefish into its own system and make more loans with the additional Whitefish deposits. It can also do more administrative cost-cutting, since two S&Ls together can live more cheaply than one. 

“You’re building up a nice asset here,” I said, introducing the Whitefish subject. “I’m sure it’s a good move, accountingwise.” My only worry was that Glacier may have overpaid for its acquisition, a topic I approached obliquely. “I assume you had to pay way over book value for this,” I said, inviting Glacier’s president to admit the worst. But no, Glacier hadn’t overpaid.

We talked about Glacier’s 9.2 percent of commercial loans, the sole troubling statistic I’d gleaned from The Thrift Digest. If this had been a New England thrift, that high number would have scared me away, but Montana wasn’t Massachusetts. The Glacier president assured me that his S&L wasn’t loaning money to developers of empty office towers or unsalable vacation condos. Glacier’s commercial loans were mostly in multifamily housing, which was in great demand. Montana’s population was growing. Every year, thousands of escapees from California smog and taxes were taking up residence in the Big Sky, small government state.

SOVEREIGN BANCORP 

In the November 25, 1991, issue of Barron’s, I came across an article entitled “Hometown Lender to the Well-Heeled.” It described how Sovereign Bancorp serves a wealthy element in southeastern Pennsylvania from its headquarters in Reading. I liked the part about how a bell goes off in a Sovereign branch every time a mortgage loan is approved.

This was not the only time in my career I was introduced to a stock by a weekly magazine. I checked the annual and the quarterlies. In every important category, Sovereign got good marks. Nonperforming loans were 1 percent of assets. Commercial and construction lending was 4 percent. Sovereign had set aside sufficient reserves to cover 100 percent of its nonperformers.

Sovereign had acquired two New Jersey thrifts from the Resolution Trust Corporation, which boosted its deposits and eventually would boost its earnings. To review some of the details, I called Jay Sidhu, Sovereign’s Indian-born president. We chatted about Bombay and Madras, which I’d visited the year before on a charity trip.

When we got around to serious subjects, Mr. Sidhu said that management was determined to “grow” the business by at least 12 percent a year. Meanwhile, based on the latest analysts’ estimates for 1992, the stock was selling at a p/e ratio of 8. 

The only negative detail was that Sovereign had sold an additional 2.5 million shares in 1991. We’ve already discussed how it’s usually a good thing when a company buys back its shares, as long as it can afford to do so. Conversely, it’s a bad thing when a company increases the number of shares. This has the same result as a government printing more money: it cheapens the currency.

At least Sovereign wasn’t squandering the proceeds from its stock sale. It was using the proceeds to buy more troubled thrifts from the Resolution Trust.

Mr. Sidhu’s model for success, I was pleased to discover, was Golden West. Basically, he wanted to copy the penurious Sandlers by increasing loan originations and cutting expenses. With the payroll that Sovereign inherited from its recent acquisitions, the overhead was 2.25 percent, much higher than Golden West’s 1 percent, but Mr. Sidhu seemed devoted to bringing that down. The fact that he owned 4 percent of the stock gave him a considerable incentive to carry out this plan.

Instead of holding on to the mortgages as many thrifts do, Sovereign had decided to specialize in making loans and then selling them to packagers such as Fannie Mae or Freddie Mac. This strategy enabled Sovereign to get its money back quickly and plow it into new mortgages, profiting from the points and other upfront fees. The risk of owning the mortgages was transferred to others.

Even so, Sovereign was being very conservative in the kinds of loans it would approve. It was devoted to residential mortgages. It hadn’t made a single commercial loan since 1989. Its average residential loan didn’t exceed 69 percent of the value of the property on which the loan was made. The few bad loans were thoroughly investigated so that Sovereign could learn who or what went wrong and not repeat its mistakes.

As often happens in my conversations with companies, I learned something new from Sidhu. He described a sneaky method by which unscrupulous banks and S&Ls camouflage their problem loans. If a developer, say, asks to borrow $1 million for a commercial project, the bank offers him $1.2 million on the basis of an inflated appraisal. The extra $200,000 is held in reserve by the bank. If the developer defaults on the loan, the bank can use this extra money to cover the developer’s payments. That way, what has turned into a bad loan can still be carried on the books as a good loan—at least temporarily.

I don’t know how widespread this practice has become, but if Sidhu is right, it’s another reason to avoid investing in banks and S&Ls with large portfolios of commercial real estate.

Why thrift conversions are such good bargains

Imagine buying a house and then discovering that the former owners have cashed your check for the down payment and left the money in an envelope in a kitchen drawer, along with a note that reads: “Keep this, it belonged to you in the first place.” You’ve got the house and it hasn’t cost you a thing. 

This is the sort of pleasant surprise that awaits investors who buy shares in any S&L that goes public for the first time. And since 1,178 S&Ls have yet to take this step, there will be many more chances for investors to be surprised.

I learned about the hidden cash-in-the-drawer rebate early in my career at Magellan. This explains why I bought shares in almost every S&L and mutual savings bank (another name for the same sort of institution) that appeared on my Quotron.

Traditionally, the local S&L or mutual savings bank has no shareholders. It is owned cooperatively by all the depositors, in the same way that rural electric utilities are organized as co-ops and owned by all the customers. The net worth of a mutual savings bank, which may have been built up over 100 years, belongs to everyone who has a savings account or a checking account in one of the branches. 

As long as the mutual form of ownership is maintained, the thousands of depositors get nothing for their stake in the enterprise. That and $1.50 will get them a glass of mineral water.

When the mutual savings bank comes to Wall Street and sells stock in a public offering, a fascinating thing happens. First of all, the S&L directors who put the deal together and the buyers of the stock are on the same side of the table. The directors themselves will buy shares. You can find out how many in the offering circular that accompanies the deal. 

How do directors price a stock that they themselves are going to buy? Low. 

Depositors as well as directors will be given the opportunity to buy shares at the initial offering price. The interesting thing about this is that every dollar that’s raised in the offering, minus the underwriting fees, will end up back in the S&L’s vault. 

This is not what happens when other kinds of companies go public. In those cases, a sizable chunk of the money is carted away by the founders and original shareholders, who then become millionaires and buy palazzi in Italy or castles in Spain. But in this case, since the mutual savings bank is owned by the depositors, it would be inconvenient to divvy up the proceeds from a stock sale to thousands of sellers who also happen to be buyers. Instead, the money is returned to the institution, in total, to become part of the S&L’s equity. 

Say your local thrift had $10 million in book value before it went public. Then it sold $10 million worth of stock in the offering—1 million shares at $10 apiece. When this $10 million from the stock sale returns to the vault, the book value of this company has just doubled. A company with a $20 book value is now selling for $10 a share.
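The arithmetic in Lynch’s hypothetical can be laid out step by step; this short Python sketch simply restates his numbers:

```python
# Lynch's hypothetical thrift conversion: the IPO proceeds (ignoring
# underwriting fees here) flow back into the institution's own equity.
pre_ipo_book_value = 10_000_000      # book value before the offering ($)
shares_sold = 1_000_000
offer_price = 10.0                   # $ per share at the initial offering

proceeds = shares_sold * offer_price                  # $10M raised in the IPO
post_ipo_book_value = pre_ipo_book_value + proceeds   # book value doubles to $20M
book_value_per_share = post_ipo_book_value / shares_sold
print(book_value_per_share)  # 20.0, i.e. $20 of book value behind each $10 share
```

This is the "cash in the drawer" mechanic: because the sellers and the buyers are effectively the same people, the purchase price itself ends up inside the asset being purchased.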

This doesn’t guarantee that what you’re getting for free will necessarily turn out to be a good thing. You could be getting a Jimmy Stewart S&L, or it could be a lemon S&L with inept management that’s losing money and eventually will lose all its equity and go bankrupt. Even in this can’t-lose situation, you ought to investigate the S&L before you invest in it.

The next time you pass a mutual savings bank or an S&L that’s still cooperatively owned, think about stopping in and establishing an account. That way, you’ll be guaranteed a chance to buy shares at the initial offering price. Of course, you can always wait until after the offering to buy your shares on the open market, and you’ll still be getting a bargain. 

But don’t wait too long. Wall Street seems to be catching on to the cash-in-the-drawer trick, and the increase in stock prices of mutual savings banks and savings and loans that have converted to public ownership since 1991 is nothing short of remarkable. It’s been a bonanza almost anywhere you look, from one end of the country to the other.

In 1991, 16 mutual thrifts and savings banks came public. Two were taken over at more than four times the offering price, and of the remaining 14, the worst is up 87 percent in value. All the rest have doubled or better, and there are four triples, one 7-bagger, and one 10-bagger. Imagine making 10 times your money in 32 months by investing in Magna Bancorp, Inc., of Hattiesburg, Mississippi. 

In 1992, another 42 mutual thrifts came public. The only loser in this group has been First FS&LA of San Bernardino, and it’s down a modest 7.5 percent. All the rest have advanced—38 of them by 50 percent or more, and 23 by 100 percent or more. These gains have come in 20 months! 

Table 13-1. MUTUAL THRIFT AND SAVINGS BANK IPOs COMPLETED IN 1991†

Table 13-2. THE 10 BEST AND 10 WORST RESULTS: MUTUAL THRIFT AND SAVINGS BANK IPOs COMPLETED IN 1992 

Table 13-3. THE 10 BEST AND 10 WORST PERFORMING MUTUAL THRIFT AND SAVINGS BANK IPOs COMPLETED IN 1993 THROUGH 9/30/93

There are two quadruples in the group—Mutual Savings Bank of Bay City, Michigan, and United Postal Bancorp in St. Louis. A portfolio of the five top performers taken together has produced a 285 percent return. Even a person who was unlucky enough to have chosen the five worst-performing thrifts that came public in 1992 has made 31 percent on his money through September 1993. Investing in the five worst has beaten the S&P 500 and most of the equity mutual funds. 

Through the first nine months of 1993, another 34 mutual thrifts have come public, and in this shorter period the worst is up 5 percent, 26 are up 30 percent or better, 20 are up 40 percent or better, and 9 are up 50 percent or better. (All the above numbers were provided by the skillful crunchers at SNL Securities.) 

From Asheboro, North Carolina, to Ipswich, Massachusetts, on the East Coast; from Pasadena, California, to Everett, Washington, on the West; from Stillwater, Oklahoma, to Kankakee, Illinois, to Rosenberg, Texas, in the middle, neighborhood S&Ls have been the best investments that hundreds of thousands of people have ever made. This is the ultimate example of how individual investors can succeed by ignoring companies that are widely held by institutions and by investigating what’s close to home. What could be closer to home than the local thrift where you keep your safety deposit box and your checking account? 

An account in any one of these thrifts or savings banks entitles you to participate in the IPO if and when it happens, but you certainly aren’t required to do so. You can go to the meeting where the deal is explained to potential shareholders, see whether the insiders are buying the shares, read the prospectus to find out the book value, the p/e ratio, what the earnings are, the percentage of nonperforming assets, the quality of the loan portfolio, etc., and thus get all the information you need to make an informed decision. It’s an opportunity to take a close look at a local company—and it’s free. If you don’t like the deal, the organization, or the management, you simply don’t invest.

There are still 1,372 mutual savings banks that have not yet come public. Check to see whether any of these are located in your area. By opening a savings account in any of them, you’ll have the right to participate in the IPO when it happens. Sit back and await developments. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What The USA’s Largest Bank Thinks About The State Of The Country’s Economy In Q2 2025

Insights from JPMorgan Chase’s management on the health of American consumers and businesses in the second quarter of 2025.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Because of this status, JPMorgan is naturally able to feel the pulse of the country’s economy. The bank’s latest earnings conference call – for the second quarter of 2025 – was held last week and contained useful insights on the state of American consumers and businesses. The bottom line is this: the US economy remains resilient, but significant risks persist.

What’s shown between the two horizontal lines below are quotes from JPMorgan’s management team that I picked up from the call.


1. The US economy remained resilient in 2025 Q2 but significant risks persist

The U.S. economy remained resilient in the quarter. The finalization of tax reform and potential deregulation are positive for the economic outlook, however, significant risks persist – including from tariffs and trade uncertainty, worsening geopolitical conditions, high fiscal deficits and elevated asset prices.

2. Net charge-offs for the whole bank (effectively bad loans that JPMorgan can’t recover) rose from US$2.2 billion a year ago; Consumer & Community Banking’s net charge-offs were relatively flat compared to a year ago

Credit costs were $2.8 billion, with net charge-offs of $2.4 billion, and a net reserve build of $439 million…

…Now let’s go to our businesses, starting with CCB…

…Credit costs were $2.1 billion, reflecting net charge-offs of $2.1 billion, relatively flat year-on-year, in line with expectations.

3. JPMorgan’s credit card outstanding loans were up 9% year-on-year in 2025 Q2

Card outstandings were up 9% due to strong new card acquisition.

4. Auto originations were up year-on-year

In Auto, originations were up 5%, driven by higher lease volumes.

5. JPMorgan’s investment banking fees had good growth in 2025 Q2, with growth in debt underwriting fees but a decline in equity underwriting fees; management sees a robust pipeline for capital markets activities among companies and the outlook is upbeat, but they’re also aware that sentiment can change in a heartbeat

IB fees were up 7% year-on-year. We continue to rank #1 with wallet share of 8.9%. In advisory, fees were up 8%, benefiting from increased sponsor activity. Debt underwriting fees were up 12%, primarily driven by a few large deals. In equity underwriting, fees were down 6% year-on-year. Our pipeline remains robust, and the outlook along with the market tone and sentiment is notably more upbeat…

…You’ve seen how rapidly pipelines can grow and shrink. And so that lesson we’ve learned over and over, it may stay wide open for 1.5 years. Something may happen geopolitically that all of a sudden that pipeline slows a little bit. And so I’m always a little cautious to guess what that’s going to be.

6. Management continues to expect credit card net charge-offs for 2025 to be around 3.6%

On credit, we continue to expect the Card net charge-off rate to be approximately 3.6%.

7. The consumer looks fine to management given the low unemployment rate, although there is a little bit more stress in lower-income consumers compared to higher-income consumers

[Question] If you can expand that into the consumer, any areas of stress from a credit quality perspective that you’re beginning to get more concerned today versus 3 or 6 months ago?

[Answer] We look at it very closely. It obviously matters a lot for us as a company. But we continue to struggle to see signs of weakness. We just — the consumer basically seems to be fine. Now a few things are true. Like if you look at indicators of stress, not surprisingly, you see a little bit more stress in the lower income bands than you see in the higher income bands. But that’s always true. That’s pretty much definitionally true. And nothing there is out of line with our expectations. Our delinquency rates are also in line with expectations. You saw that we kept our net charge-off guidance unchanged. So all that looks kind of fine. And to be honest, as we’ve said before, fundamentally, while there are nuances around the edges, consumer credit is primarily about labor markets. And in a world with 4.1% unemployment rate, it’s just going to be hard, especially in our portfolio to see a lot of weakness.

8. JPMorgan experienced a jump in non-accrual loans within consumer lending, but that is because of forbearance related to wildfires in the Los Angeles area, and the actual loss expectation is de minimis

[Question] In terms of the NPAs, the nonaccruals in consumers seem to have a bit of a jump. Is there something technical there?

[Answer] There is something technical, which has to do with customers in the — Home Lending customers in the L.A. area, using our forbearance availability as a result of the wildfires. So that is resulting in an uptick in the nonperforming. But when you think about land value, and the insurance there, the actual loss expectation is de minimis, I would say.

9. Management thinks tariff-related risks have reduced a little; management has not seen any pressure on loans because of tariffs

When it comes to tariffs, I think the initial Liberation Day, now there’s more talk as more things getting done, a couple have been announced, a couple have been delayed, that reduces that risk a little bit. And hopefully, they’ll get done. So there’s still risk out there, but I am hopeful that some of these frameworks are completed soon, at least before August 1…

…What’s the tariff pressure, with pressure on loans or debt? The answer is no.



An Important Perspective on US Government Debt

The US government has a lot of debt, but what about its assets?

I’ve noticed that when there’s public discussion on US government finances, the prevailing stance is that the government is heavily in debt and it is a terrible situation for the country to be in. For example:

  • CNN quoted Maya MacGuineas, President of the Committee for a Responsible Federal Budget in January 2024: “Though our level of debt is dangerous for both our economy and for national security, America just cannot stop borrowing”
  • In June 2025, MarketWatch wrote: “America’s current debt level stands at roughly 121% of GDP… The debt burden is no longer just a distant concern. It is a present and pressing problem”
  • Ray Dalio, founder of Bridgewater, one of the largest – if not the largest – hedge funds in the world, commented in June 2025 on American government debt: “[The US government] has accumulated a big debt—approximately six times the amount that it is bringing in each year (about $30 trillion), which equals about $230,000 per household that you have to take care of”

The thing about debt is that there are two sides to the coin. A balance sheet for a company has both assets and liabilities, and the same goes for a country. So while the US government has plenty of debt, which is a liability, it also has assets.

And what do the US government’s assets look like? According to the Federal Reserve, the US government’s assets have a value of just US$5.6 trillion as of September 2024, far lower than its liabilities of US$45.5 trillion, most of which is US$28.3 trillion in government debt. This does not look good.

But, according to the Institute for Energy Research, the US government owns a huge mineral estate, consisting of natural resources such as oil, natural gas, and coal, which had a value of US$150 trillion as of January 2013. The value of these assets is not recorded in the Federal Reserve’s accounting of the US government’s balance sheet. The prices of oil, natural gas, and coal today are in the same ballpark as in January 2013, which means the US government’s US$150 trillion in mineral assets back then would have around the same value today. In other words, the US government’s assets are much higher than its liabilities.

One more point worth noting is that Federal Reserve data show American households have a total net worth – that would be household assets minus household liabilities – of US$170 trillion in the first quarter of this year. This net worth is again much higher than US government liabilities. The US$230,000 in debt per US household that Ray Dalio said the US government has saddled the country’s population with turns out to be much lower than US households’ net worth.
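The arithmetic behind this argument can be sketched in a few lines. This is a rough illustration only, using the estimates cited above; the figures come from different dates and sources, so the comparison is indicative, not precise:

```python
# Rough arithmetic behind the balance-sheet argument, using the figures
# cited in the text (all in trillions of US dollars). The US$150 trillion
# mineral-estate value is the January 2013 estimate cited above.
recorded_assets = 5.6     # Federal Reserve, as of September 2024
liabilities = 45.5        # includes US$28.3 trillion of government debt
mineral_estate = 150.0    # natural resources not on the official balance sheet

net_worth_recorded = recorded_assets - liabilities             # about -39.9
net_worth_with_minerals = net_worth_recorded + mineral_estate  # about +110.1

print(f"Recorded net worth: {net_worth_recorded:.1f} trillion")
print(f"Including mineral estate: {net_worth_with_minerals:.1f} trillion")
```

On the recorded numbers alone the government is deeply underwater; adding the mineral estate flips the net position to a large surplus, which is the crux of the article’s point.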

When it comes to the idea of the US government being heavily in debt, I think the reality is different. Yes, the US government has been borrowing like a drunken sailor, with a budget deficit that currently runs at around 7% of GDP – this is absolutely not sustainable in the long run. But right now the balance sheet of the US government is still really healthy when the true value of its assets is considered and this gives the government plenty of buffer time to right the ship. 

In public discussions of US government debt, I find that the asset side of the balance sheets of the US government and households is often missing – and this is an important perspective we should all be aware of.



Can The (Micro)Strategy Bitcoin Playbook Last Forever?

Strategy’s amazing financial engineering.

Strategy (recently renamed from MicroStrategy) is one of the top-performing companies in the US stock market in recent years. The stock price of the highly controversial “Bitcoin holding company” is up 210% in the last year alone and up a staggering 3,300% in the last five years.

One reason why Strategy has done so well is because it is one of the best at raising cheap capital. How does this work?

Self-fulfilling cycle

Strategy’s Bitcoin playbook is pretty simple and yet quite ingenious. The “Bitcoin holding company” basically takes advantage of its stock price trading at a premium to book value by selling new shares for cash. 

Imagine a company that has a book value of $1 million and has 1 million shares. Each share, hence, has a book value of $1. But let’s say that for some reason, someone is willing to buy the shares at $2 each. The company can take advantage of this and sell new shares to this buyer. Let’s say the company sells 1 million new shares for $2 million. After the share issuance, the company now has 2 million shares outstanding and $3 million in book value. The book value per share is also now magically $1.50. The process can become a self-fulfilling cycle where the company raising shares above book value actually leads to the book value per share increasing.
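The dilution arithmetic above can be sketched as follows, using the same hypothetical numbers:

```python
# Hypothetical example of issuing shares above book value, mirroring
# the numbers in the paragraph above.
book_value = 1_000_000   # $1m of book value
shares = 1_000_000       # 1m shares, so book value per share is $1.00

issue_price = 2.00       # a buyer is willing to pay 2x book value
new_shares = 1_000_000   # the company sells 1m new shares

book_value += new_shares * issue_price  # raises $2m of cash
shares += new_shares

bvps = book_value / shares
print(f"Book value per share after the issuance: ${bvps:.2f}")  # $1.50
```

Every issuance above book value repeats this step, which is why the cycle can feed on itself for as long as buyers keep paying a premium.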

This is exactly what Strategy has done. Its book value per share has risen by using this simple financial engineering trick. But Strategy then also uses proceeds from its share issuance to buy Bitcoin. If Bitcoin’s price rises, Strategy’s book value per share will increase yet again.

In 2023, Strategy raised US$2.0 billion from issuing shares. In 2024, the company raised an even larger sum of US$16.3 billion from ordinary share sales. As of its last quarterly earnings update for the first quarter of 2025, it has raised another US$5.7 billion through sales of common shares and preferred shares.

But Strategy has gone yet one step further. The company has also raised capital through debt markets to buy more Bitcoin, in effect leveraging up its balance sheet and increasing its exposure to Bitcoin. Strategy’s total debt has increased from US$2.2 billion in 2023 to US$7.2 billion in 2024, and US$8.1 billion in the first quarter of 2025.

What the bulls believe

Investors who are bullish on Strategy believe that this virtuous cycle can continue forever. They believe that Strategy’s premium to book value will exist for many years as there are sufficient buyers of the stock who believe in this self-fulfilling cycle. 

If true, Strategy will become a compounding machine simply by issuing new shares at a premium and juicing its book value per share. There are also the Bitcoin purchases, which add another growth driver for Strategy’s book value per share.

But as I mentioned earlier, there’s also leverage at play because Strategy has used debt to buy more Bitcoin than it can actually afford. Strategy’s book value will therefore swing more than Bitcoin’s price. If Bitcoin’s price rises, Strategy’s book value will go up faster; if it falls, the book value will fall faster too.

When will the party end?

I applaud Strategy’s playbook. But there are some risks that shareholders need to be wary of. The obvious one is Bitcoin’s price falling. When this happens, Strategy’s book value per share will fall faster because of the leveraged nature of the company’s balance sheet. As of 31 March 2025, Strategy had US$43.5 billion worth of Bitcoin but only US$32.2 billion in equity. If Bitcoin’s price falls by 50%, Strategy’s book value would drop to around US$10.5 billion, or roughly a two-thirds fall. For Strategy to enter negative book value territory, Bitcoin would need to fall by around 74% from its 31 March price.
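A quick sensitivity check makes the leverage concrete. This is a simplified sketch using the 31 March 2025 figures above; it ignores other assets, liabilities, and any debt covenants, so real outcomes would differ:

```python
# Leverage sensitivity using the 31 March 2025 figures cited above
# (in US$ billions): Bitcoin holdings versus shareholder equity.
btc_holdings = 43.5
equity = 32.2

def equity_after(btc_drop):
    """Book value after Bitcoin falls by the given fraction (0.5 = -50%)."""
    return equity - btc_holdings * btc_drop

print(equity_after(0.5))      # ~10.45: roughly a two-thirds fall in book value
print(equity / btc_holdings)  # ~0.74: the Bitcoin drop that wipes out equity
```

Because the Bitcoin pile is larger than the equity backing it, every percentage move in Bitcoin is amplified by a factor of roughly 43.5/32.2 ≈ 1.35 in book value.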

The other major risk is if stock market participants decide that Strategy’s stock price simply does not deserve to trade at a premium to book value. In other words, buyers of the stock only want to pay book value for the shares. This throws Strategy’s ability to raise capital cheaply out the window. It also means that Strategy’s shareholders who first invested at a premium to book value could face a potentially heavy loss.

As of Bitcoin’s price at the time of writing, Strategy’s book value is around US$38 billion. But based on the company’s current stock price, its market capitalisation is around US$108 billion, or roughly a 180% premium to its book value. Even if Bitcoin’s price remains stable, a reversion of Strategy’s stock price to book value would still mean a painful fall of around 64% in the stock price.
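The premium arithmetic works out as follows, using the approximate figures above (both numbers move constantly, so treat this as an order-of-magnitude sketch):

```python
# Premium-to-book arithmetic using the approximate figures above
# (in US$ billions, at the time of writing).
book_value = 38.0
market_cap = 108.0

premium = market_cap / book_value - 1        # ~1.84, i.e. roughly a 180% premium
fall_to_book = 1 - book_value / market_cap   # ~0.65: fall if the premium vanishes

print(f"Premium to book value: {premium:.0%}")
print(f"Stock price fall if re-rated to book value: {fall_to_book:.0%}")
```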

For now, momentum and the current environment suggest that market participants are unlikely to bid down Strategy’s stock price so drastically so soon. But things can change during “risk-off” environments, when market participants become more cautious.

A double whammy for Strategy shareholders can happen if both Bitcoin’s price falls and Strategy’s premium to book value narrows.

The bottom line

Whatever you think about Michael Saylor and his Bitcoin views, he has certainly mastered the dark arts of financial manoeuvring. In most assets, fundamentals drive price. Saylor has managed to flip the script, making price drive fundamentals.

But this comes with risks. If Strategy’s stock price collapses, the virtuous engine stops running. Saylor seems to be wary of these risks. While Strategy continues to issue shares to buy Bitcoin, Saylor is constantly selling his Strategy shares.

Despite the risks, market participants seem hungry for more of such companies. Besides Strategy, there are now a number of copycats around the world, such as Metaplanet in Japan, which has seen a meteoric rise in its share price this year. Its stock price is at an eye-popping 7 times book value.

For such companies, the party will end when there are no more greater fools to sell to (both for Bitcoin and for new shares of the company). Whether – or more likely, when – that happens is anybody’s guess. Just be careful not to be the last one holding the bag.



Passive Income

Knowing what you want to achieve, and what passive income is, is important

Passive income is not just income earned outside of your job. The real meaning of passive income is money that you earn with little to no effort. 

Some may think that money earned in the stock market or from properties is passive income. Yet, this may not be the case. Stock market investors can sometimes be so caught up in trading and looking at stock prices that investing becomes a huge part of their lives and can even be considered another job. Property investing can also turn out to be tedious if you manage your properties yourself.

I know of friends who want to earn passive income but instead end up spending so much time on their investments. Don’t get me wrong. I love spending time learning and investing but this isn’t really “passive”.

Here’s how you can earn real passive income.

Stop trading the stock market

First, stop short-term trading. My definition of trading is buying stocks in the very short-term based on price action and charts. This is not investing and can become a part-time or full-time job as it requires a lot of time and effort. The more trades you need to make, the more effort is required.

We should aim to invest in a way that reduces the number of trades and amount of work that we need to do. 

One way to do this is by investing long-term in set-and-forget investments. Investing in stocks that have the potential to grow earnings (and thus the share price) reliably over the long-term is one good strategy that lowers the time spent on investing.

You can also invest in passive index ETFs that track the performance of broad market indexes. Stock indexes have historically increased in value over a sufficiently long period of time and provide a good way to gain exposure to some of the biggest and most profitable companies.

You can also invest in dividend stocks that reliably pay a dividend. Investors from Singapore enjoy tax-free dividends when they invest in Singapore-listed dividend-paying stocks.

Outsource your investing

Another way to reduce time and effort spent on investing is to outsource your investing to an expert.

One way to do this is by employing financial experts who can advise you on stocks to buy or funds to purchase. You can also use robo advisors which can help you allocate your portfolio into a variety of investments.

While you will need to pay a fee for these services, having someone to invest on your behalf or advise you frees you from the hassle of doing everything yourself and saves you a ton of time.

Invest in other passive assets

You can also invest in other assets besides the stock market.

Assets such as long-term fixed deposits, government bonds, or even professionally-managed real estate may be a good way to grow your wealth without doing much work.

If you can find long-term investments that require little effort on your part but provide a stable passive return, they are potentially good assets for growing your wealth while keeping your investment effort low.

Know your goals

What do you really want to achieve? Do you want to grow your wealth as quickly as possible?

Then by all means go ahead and dig through annual reports, scour the market for undervalued stocks, sell weekly put options, or even manage your own Airbnb property for higher rental yields. There is nothing wrong with this; in fact, it is my preferred style of investing.

But this is not passive income.

If you really want passive income, invest in long-term assets, find a professional or robo advisor to manage your wealth, or build passive income by dollar-cost averaging into funds or long-term assets.

While this may not always give you the best fee-adjusted returns, this is a true passive income strategy and frees up your time for other things in life.

Ultimately, when it comes to investing, there is no one-size-fits-all strategy, and knowing what you want to achieve can help determine how you should approach investing your spare capital (and time).



Why Do Employees Love Stock-Based Compensation?

Stock-based compensation has its risks, but it can still be an attractive proposition for an employee

In the past, stock-based compensation was more common among fledgling startups that had to find ways to preserve the little cash they had.

But today, stock-based compensation is used by almost every major company in the world. Even big firms with lots of cash continue to use it. One reason is that employees want to get paid in stock.

How it works

To understand why this is, we need to look at the mechanics of how stock-based compensation works.

In a typical compensation package, an employee may be offered an annual contract with, say, 33% of the compensation coming in the form of restricted stock units and the rest in cash. This means that an employee who is on a $100,000 annual package will get $33,000 worth of shares per year.

But here’s the catch: the number of shares granted is fixed using the share price at the time of signing the employment contract. If the share price rises, the value of the shares the employee receives each year will be more than $33,000.

For instance, many Nvidia employees who were hired before the massive run-up in its stock price over the last few years are now receiving shares every quarter that are worth much more than when they joined the company.

Here’s how the math works. Let’s say you were hired by Nvidia five years ago. Back then, its shares were trading at a split-adjusted price of US$6.10 each. You were given a US$200,000 annual package for five years that consists of US$134,000 in cash and US$66,000 in stock. Using the stock price of US$6.10, the US$66,000 in stock-based compensation means you will receive 10,819 shares each year. The number of shares that you receive each year is fixed, even if the stock price goes up or down. Fast forward to today, and the 10,819 shares that you receive each year are now worth US$1.1 million.
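The grant arithmetic in this example works like this. It’s a hypothetical sketch: real grants usually vest quarterly with their own rounding rules, and the ~US$100 current share price used below is an assumed round number for illustration:

```python
# Hypothetical RSU arithmetic from the Nvidia example above. The share
# count is fixed at the grant-date price; the value of each year's
# shares then floats with the stock price.
grant_value = 66_000   # annual stock component of the package
grant_price = 6.10     # split-adjusted share price at hire, five years ago

shares_per_year = int(grant_value / grant_price)  # 10,819 shares, fixed
print(shares_per_year)

def annual_value(current_price):
    """Value of one year's worth of granted shares at today's price."""
    return shares_per_year * current_price

# At an assumed ~US$100 per share today, the same grant vests at ~US$1.1m a year.
print(annual_value(100))
```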

Typically, stock grants only last for a few years before they expire and new grants will be made at the current stock price. This is why some employees may want to leave the company after the stock price has run up a lot and they have collected all their shares from the initial grant.

Stock-based compensation lets an employee enjoy the potential upside from a company’s stock without having to put down their own capital to buy shares. For instance, you, the Nvidia employee who was hired five years ago, essentially “bought” US$330,000 (US$66,000 multiplied by 5) worth of Nvidia shares five years ago. That’s a huge bet for most people, but stock-based compensation allows an employee to enjoy the returns of this bet without actually having to buy shares.

Potential downsides

However, there are potential downsides for an employee who takes a pay package that has a significant component in stock-based compensation.

For public-listed companies, employees can sell the shares when they vest. But for private companies, the shares are illiquid and employees may not have an easy way to convert them to cash. In addition, employees who work for a small startup and receive shares in it run a high risk that the startup fails and the shares end up worthless.

I know of friends who are stuck with shares in companies they previously worked for. These companies may be struggling or have no clear path to an IPO or acquisition, leaving my friends holding shares without any real means to cash out.

The risk for employees of public-listed companies who receive stock-based compensation is that the share price falls. This has been the case for many US-listed technology companies since 2021, with many of their stock prices still down by 50% or more from their 2021 peak.

Imagine if you joined Okta in April 2021 and received a 4-year pay package of US$200,000 consisting of US$134,000 in cash and US$66,000 in shares. Back then the shares were trading at around US$244 each, so you would receive 270 shares per year. Today, 270 Okta shares are worth US$27,049 – a materially smaller sum compared to the grant value of US$66,000. You would have been better off taking US$200,000 in all-cash compensation.

Bottom line

All things considered, despite some drawbacks, stock-based compensation is still an attractive proposition for an employee as it allows them to make a huge “bet” at the grant date stock price of a company without laying out any capital at all.

If the stock surges like Nvidia’s, then the employee could be set for life. However, if the stock falls, employees still get the cash portion of the annual pay package.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Okta. Holdings are subject to change at any time.

What The USA’s Largest Bank Thinks About The State Of The Country’s Economy In Q1 2025

Insights from JPMorgan Chase’s management on the health of American consumers and businesses in the first quarter of 2025.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Because of this status, JPMorgan is naturally able to feel the pulse of the country’s economy. The bank’s latest earnings conference call – for the first quarter of 2025 – was held last week and contained useful insights on the state of American consumers and businesses. The bottom line is this: the US economy is facing turbulence, with a multitude of problems, but consumers and businesses still remain financially healthy.

What’s shown between the two horizontal lines below are quotes from JPMorgan’s management team that I picked up from the call.


1. The US economy is facing turbulence, with problems including tariffs, trade wars, inflation, and high asset prices

The economy is facing considerable turbulence (including geopolitics), with the potential positives of tax reform and deregulation and the potential negatives of tariffs and “trade wars,” ongoing sticky inflation, high fiscal deficits and still rather high asset prices and volatility. As always, we hope for the best but prepare the Firm for a wide range of scenarios.

2. Net charge-offs for the whole bank (effectively bad loans that JPMorgan can’t recover) rose from US$1.9 billion a year ago; management increased the probability weightings for downside scenarios in its CECL (current expected credit losses) framework for credit allowances in 2025 Q1 because of higher risks and uncertainties from the environment seen in the last few weeks; the increase in allowance is not driven by deterioration in credit performance; Consumer & Community Banking’s net charge-offs rose US$275 million year-on-year to US$2.2 billion

Credit costs were $3.3 billion, with net charge-offs of $2.3 billion and a net reserve build of $973 million…

…With this quarter’s reserve build, the Firm’s total allowance for credit losses is $27.6 billion. Let’s take a second to add a little bit of context to our thinking surrounding this number in light of the unique environment of the last several weeks. Our first quarter allowance is anchored on the relatively benign central case economic outlook, which was in effect at the end of the quarter. But in light of the significantly elevated risks and uncertainties at the time, we increased the probability weightings associated with the downside scenarios in our CECL framework. As a result, the weighted average unemployment rate embedded in our allowance is 5.8%, up from 5.5% last quarter, driving the $973 million increase in the allowance. So with that in mind, the consumer build of $441 million was driven by changes in the weighted average macroeconomic outlook. The wholesale build of $549 million was predominantly driven by credit quality changes on certain exposures and net lending activity, as well as changes in the outlook…

…The increase in the allowance is not to any meaningful degree driven by deterioration in the actual credit performance in the portfolio which remains largely in line with expectations…

…Credit costs were $2.6 billion, reflecting net charge-offs of $2.2 billion, up $275 million year-on-year, predominantly driven by the seasoning of recent vintages in Card, with delinquencies and losses in line with expectations.

3. Management is seeing recent downtrends in consumer and small business sentiment, but consumers and small businesses remain financially healthy; management is seeing consumers front-load spending ahead of tariffs; management is seeing small businesses face more challenges than large businesses because of tariffs-related uncertainty; management is seeing a drop in travel-spending among consumers, but it’s not indicative of broader patterns; management is seeing relatively weaker spending from lower-income consumers, but they are not in distress 

Consumers and small businesses remain financially healthy despite the recent down trends in consumer and small business sentiment. Based on our data, spend, cash buffers, payment to income ratios, and credit utilization are all in line with our expectations…

…On the consumer side, the thing to check is the spending. And to be honest, the main thing that we see there would appear to be a certain amount of front-loading of spending ahead of people expecting price increases from tariffs…

…In terms of our corporate clients, obviously, they’ve been reacting to the changes in tariff policy… Across the size of the clients, I think smaller clients, small business, and smaller corporates are probably a little bit more challenged. I think the larger corporates have a bit more experience dealing with these things and more resources to manage…

…We obviously saw the airlines discuss what they are seeing as headwinds for them, specifically in airline travel. And we’re seeing that too through the card spend. It’s not obvious to us that that’s necessarily an indicator for broader patterns…

…When we look at our card data and also our cash buffers and people checking accounts, of course, it is true that it is relatively weaker in the lower income segment. But when you take a step back and you ask, are we seeing signs of distress in the lower income segment? The answer is no. So sure, the margin cash buffers are lower, and you see some rotation of spend and spending is a little bit weaker than it was in the peak spending moments. But actually, some of the increases in spending that we’re seeing in April are actually coming from the lower income segment. So no evidence of distress, I would say.

4. JPMorgan’s credit card outstanding loans were up double digits year-on-year

Card outstandings were up 10% due to strong account acquisition.

5. Auto originations were up year-on-year

In auto, originations were $10.7 billion, up 20%, driven by higher lease volume.

6. JPMorgan’s investment banking fees had good growth in 2025 Q1, with growth in debt underwriting fees but a decline in equity underwriting fees, signalling higher appetite for refinancing activity from companies; management is seeing companies adopt a wait-and-see attitude when it comes to capital markets activities because of tariff-related uncertainty in the current environment

IB fees were up 12% year on year, and we ranked number one with wallet share of 9%. In advisory, fees were up 16%, benefiting from the closing of deals announced in 2024. Debt underwriting fees were up 16%, primarily driven by elevated refinancing activity, particularly in leveraged finance. And equity underwriting fees were down 9% year on year, reflecting challenging market conditions. In light of market conditions, we are adopting a cautious stance on the investment banking outlook. While client engagement and dialogue is quite elevated, both the conversion of the existing pipeline and origination of new activity will require a reduction in the current levels of uncertainty…

…In terms of our corporate clients, obviously, they’ve been reacting to the changes in tariff policy. And at the margin, that shifts their focus away from more strategic priorities with obvious implications for the investment banking pipeline outlook towards more short-term work, optimizing supply chains, and trying to figure out how they’re going to respond to the current environment. So as a result, I think we would characterize what we’re hearing from our corporate clients as a little bit of a wait-and-see attitude.

7. Management expects credit card net charge-offs for 2025 to be in line with previous guidance because of the mechanical way credit card charge-offs work, and not because management thinks credit card net charge-offs will really be healthy as the year progresses

On credit, we expect the card net charge-off rate to be in line with our previous guidance of approximately 3.6%…

…[Question] No change to the full year credit card net charge-off forecast. How do we square that with the rising recession risk?

[Answer] We should not have given you that forecast. We don’t know what the number is going to be. I would say that’s a short-term number. And based on what’s happening today, there’s a wide range of potential outcomes… There are some mechanical elements to the way card charge-offs work. That means that it’s pretty baked in pretty far ahead of time, a couple of quarters… It just doesn’t necessarily tell you that much about what might actually happen through the end of the year. Even if unemployment were to increase significantly, it probably wouldn’t flow through to charge-offs until later.
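The “mechanical elements” management refers to come from the delinquency pipeline: a card loan typically rolls through successive delinquency buckets (30, 60, 90, 120, 150 days past due) before being charged off, so the loans that will be charged off this quarter were already delinquent months ago. The toy roll-rate sketch below (illustrative numbers and roll rates only, not JPMorgan’s actual model) shows why a jump in new delinquencies only reaches charge-offs several months later:

```python
# Toy roll-rate model of a credit card portfolio (illustrative only).
# Loans roll through delinquency buckets month by month and are charged
# off after the last bucket, so a jump in NEW delinquencies only shows
# up in charge-offs once the cohort has rolled all the way through.

ROLL_RATE = 0.5  # assumed fraction of each bucket rolling to the next

def simulate_chargeoffs(new_delinquencies):
    """Return monthly charge-offs given monthly new 30-day delinquencies."""
    buckets = [0.0] * 5          # 30/60/90/120/150 days past due
    chargeoffs = []
    for inflow in new_delinquencies:
        chargeoffs.append(buckets[-1] * ROLL_RATE)  # 150dpd -> charge-off
        # roll every bucket forward one month
        buckets = [inflow] + [b * ROLL_RATE for b in buckets[:-1]]
    return chargeoffs

# New delinquencies double in month 3, yet charge-offs are zero for
# months, and the doubling only shows up in charge-offs around month 8.
inflows = [100, 100, 200, 200, 200, 200, 200, 200, 200, 200]
cos = simulate_chargeoffs(inflows)
```

Because charge-offs lag new delinquencies by the length of the pipeline, near-term charge-off guidance is largely “baked” even if the economy turns, which is exactly the caveat management is making.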

8. Management is now incorporating 3 interest rate cuts for 2025, up from the previous expectation of 1 cut

If you remember last quarter we said that we had one cut in the curve. I think, latest curve has something like three cuts.

9. JPMorgan’s economists think there’s a 50% chance of a recession

What I would say is, our excellent economist, Michael Feroli: I called him this morning specifically to ask him how they’re looking at their forecast today. And they think it’s about 50-50 for a recession. So I’ll just refer to that.

10. Management thinks inflation in the US will be sticky

We have sticky inflation. We had that before. I personally have told you I don’t think that’s going to go away, and that relates to that.

11. Management thinks the US dollar will remain the reserve currency globally

Obviously, the US dollar still is the reserve currency, and that isn’t going to change, though some people may feel slightly differently about it.

12. Management thinks that the current situation is different from past cycles

[Question] You’ve been through many cycles. And I think we’re all interested in understanding how you think this next cycle is likely to progress. And I’m wondering, is there anything that you’ve seen in the past that looks like this or that you would suggest if any slowdown coming forward, is it more likely to be similar to what kind of prior cycle you’ve seen?

[Answer] This is different, okay? This is different. This is the global economy. And please read my chairman’s letter. The most important thing to me is the Western world stays together economically when we get through all this and militarily to keep the world safe and free for democracy. That is the most important thing… We obviously have to follow the law of the land, but it’s a significant change we’ve never seen in our lives.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What The USA’s Largest Bank Thinks About The State Of The Country’s Economy In Q4 2024

Insights from JPMorgan Chase’s management on the health of American consumers and businesses in the fourth quarter of 2024.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Because of this status, JPMorgan is naturally able to feel the pulse of the country’s economy. The bank’s latest earnings conference call – for the fourth quarter of 2024 – was held earlier this week and contained useful insights on the state of American consumers and businesses. The bottom line is this: the US economy remains resilient, but two significant risks loom, namely, persistent inflation and dangerous geopolitical conditions.

What’s shown between the two horizontal lines below are quotes from JPMorgan’s management team that I picked up from the call.


1. The US economy remains resilient, with low unemployment and healthy consumer spending; businesses are now more optimistic about the economy

The U.S. economy has been resilient. Unemployment remains relatively low, and consumer spending stayed healthy, including during the holiday season. Businesses are more optimistic about the economy, and they are encouraged by expectations for a more pro-growth agenda and improved collaboration between government and business.

2. Management sees two significant risks, namely, persistent inflation, and the most dangerous geopolitical conditions since World War II; management thinks a high level of optimism is embedded in asset prices; management is focused on being prepared for a wide range of scenarios

Two significant risks remain. Ongoing and future spending requirements will likely be inflationary, and therefore, inflation may persist for some time. Additionally, geopolitical conditions remain the most dangerous and complicated since World War II…

…We think it’s important to acknowledge the tension in the risks and uncertainties in the environment and the degree of optimism embedded in asset prices and expectations. In that context, we remain upbeat about the strength of the franchise, but we are focused on being prepared for a wide range of scenarios.

3. Net charge-offs for the whole bank (effectively bad loans that JPMorgan can’t recover) rose to US$2.4 billion from US$2.2 billion a year ago; Consumer & Community Banking’s net charge-offs rose by US$0.4 billion from a year ago

Credit costs were $2.6 billion, reflecting net charge-offs of $2.4 billion and a net reserve of $267 million…

…In terms of credit performance this quarter, credit costs were $2.6 billion, reflecting net charge-offs of $2.1 billion, up $428 million year-on-year driven by card. The net reserve build was $557 million predominantly driven by higher card revolving balances.

4. JPMorgan’s credit card outstanding loans were up double digits; management expects card loans to grow in 2025, but at a slower pace than in 2024

Card outstandings were up 11% due to strong account acquisition and revolvers…

… We expect healthy card loan growth again this year but below the 12% pace we saw in 2024 as tailwinds from revolver normalization are largely behind us. 

5. Auto originations were up

In auto, originations were $10.6 billion, up 7%, reflecting higher lease volume on robust new vehicle inventory. 

6. JPMorgan’s investment banking fees had strong growth in 2024 Q4, with strong growth in debt underwriting and equity underwriting fees, signalling higher appetite for capital-markets activity from companies; management is optimistic about companies’ enthusiasm towards capital markets activities

IB fees were up 49% year-on-year, and we ranked #1 with wallet share of 9.3% for 2024. Advisory fees were up 41%, benefiting from large deals and share growth in a number of key sectors. Underwriting fees were up meaningfully with debt up 56% and equity up 54% primarily driven by favorable market conditions. In terms of the outlook for the overall Investment Banking wallet, in light of the positive momentum, we remain optimistic about our pipeline. 

7. Management is seeing companies pay down bank loans and is not seeing loan growth, but the lack of loan growth is not necessarily a negative thing, as it partly reflects companies’ wide access to capital markets

Global Corporate and Investment Banking loans were down 2% quarter-on-quarter driven by paydowns and lower short-term financing, primarily offset by originations. In Commercial Banking, middle market loans were also down 2% driven by paydowns, predominantly offset by new originations. And commercial real estate loans were flat as new originations were offset by paydowns…

…I think given the significant improvement in business sentiment and the general optimism out there, you might have expected to see some big loan growth. We are not really seeing that. I don’t particularly think that’s a negative. I think it’s probably explained by a combination of wide open capital markets and so many of the larger corporates accessing the capital markets and healthy balance sheets in small businesses and maybe some residual caution. And maybe there are some pockets in some industries where some aspects of the policy uncertainty that we might be facing are making them a little bit more cautious than they otherwise would be about what they’re executing in the near term. But we’ll see what the new year brings. The current optimism starts getting tested with reality one way or the other.

8. Management is incorporating interest rate cuts in 2025

We expect 2025 NII ex Markets to be approximately $90 billion. Going through the drivers, as usual, the outlook assumes that rates follow the forward curve. It’s worth noting that the NII decrease is driven by both the cut expected in 2025 and the impact of the 100 basis points of cuts in the back half of 2024. 

9. Management expects credit card net charge-offs in 2025 of 3.6%, up from 3.34% in 2024

On credit, we expect the 2025 card net charge-off rate to be in line with our previous guidance of approximately 3.6%.
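For reference, a card net charge-off rate is conventionally defined as annualized net charge-offs (gross charge-offs minus recoveries) divided by average loans outstanding. A quick sketch of the arithmetic, using made-up numbers rather than JPMorgan’s actual figures:

```python
def net_chargeoff_rate(gross_chargeoffs, recoveries, avg_loans,
                       periods_per_year=4):
    """Annualized net charge-off rate: (charge-offs - recoveries) / average loans."""
    return (gross_chargeoffs - recoveries) * periods_per_year / avg_loans

# Hypothetical quarter: $2.0bn charged off, $0.2bn recovered,
# $200bn of average card loans -> 3.6% annualized.
rate = net_chargeoff_rate(2.0, 0.2, 200.0)
print(f"{rate:.1%}")  # prints "3.6%"
```

Note that the rate can rise either because charge-offs grow or because average loans shrink; with card outstandings growing strongly, a move from 3.34% to 3.6% implies charge-off dollars growing faster than the loan book.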

10. Management has extra capital for JPMorgan as they think there’s a good chance the bank can deploy the capital at better prices in the future, but they’re not increasing the size of the extra capital

The way we’re thinking about it right now is that we feel very comfortable with the notion that it makes sense for us to have a nice store of extra capital in light of the current environment. We believe there is a good chance that there will be a moment where we get to deploy it at better levels essentially in whatever way than the current opportunities would suggest. And so that feels like a correct kind of strategic and financial decision for us. Having said that, having studied it quite extensively over the last 6 months and have all these debates you would expect, we’ve concluded that we do have enough. We have not [indiscernible]. And given that, we would like to not have the excess grow from here.

11. The mortgage market for housing looks poor given the high interest rates there

You know well the state of the mortgage market given rates. 

12. Management thinks that the biggest sources of risk to the credit market are unemployment and stagflation

Just the biggest driver of credit has been and always will be unemployment, both on the consumer side and it feeds into the corporate side. It feeds into mortgages, subprime, credit card. So really it’s your forecast of unemployment. You have to make your own, which will determine that over time. And so the second thing you said vulnerabilities. It’s unemployment, but the worst case would be stagflation. High rates with higher unemployment will drive higher credit losses literally across the board. I’m not — we’re not predicting that, but you just ask for the vulnerabilities. That’s the vulnerabilities.

