More Of The Latest Thoughts From American Technology Companies On AI

A vast collection of notable quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies.

Nearly a month ago, I published The Latest Thoughts From American Technology Companies On AI. In it, I shared commentary from the second-quarter 2023 earnings conference calls of technology companies that I follow or have a vested interest in, focusing on how the leaders of those companies think AI could impact their industries and the business world writ large.

A few more technology companies I’m watching hosted earnings conference calls for 2023’s second quarter after the article was published. The leaders of these companies also had insights on AI that I think would be useful to share. Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe is using its rich datasets to create foundation models in areas where the company has expertise; Firefly has generated >2 billion images in 6 months 

Our rich datasets enable us to create foundation models in categories where we have deep domain expertise. In the 6 months since launch, Firefly has captivated people around the world who have generated over 2 billion images.

Adobe will allow users to create custom AI models using their proprietary data as well as offer Firefly APIs so that users can embed Firefly into their workflows

Adobe will empower customers to create custom models using proprietary assets to generate branded content and offer access to Firefly APIs so customers can embed the power of Firefly into their own content creation and automation workflows.

Adobe is monetising its generative AI features through generative credits; the generative credits have limits to them, but the limits are set in a way where users can really try out Adobe’s generative AI functions and build the use of generative AI into a habit

We announced subscription offerings, including new generative AI credits with the goal of enabling broad access and user adoption. Generative credits are tokens that enable customers to turn text-based prompts into images, vectors and text effects, with other content types to follow. Free and trial plans include a small number of monthly fast generative credits that will expose a broad base of prospects to the power of Adobe’s generative AI, expanding our top of funnel. Paid Firefly, Express and Creative Cloud plans will include a further allocation of fast generative credits. After the plan-specific number of generative credits is reached, users will have an opportunity to buy additional fast generative credit subscription packs…

…First of all, it was a very thoughtful, deliberate decision to go with the generative credit model. And the limits, as you can imagine, were very, very considered in terms of how we set them. The limits are, of course, fairly low for free users. The goal there is to give them a flavor of it and then help them convert. And for paid users, especially for people in our Single Apps and All Apps plans, one of the things we really intended to do is try and drive real proliferation of the usage. We didn’t want there to be generation anxiety, to put it that way. We wanted them to use the product. We wanted the Generative Fill and Generative Expand. We wanted the vector creation. We want to build the habits of using it. And then what will happen over time as we introduce 3D, as we introduce video and design and vectors, and as we introduce these Acrobat capabilities that Shantanu was talking about, is that the generative credits that are used in any given month continue to go up because they’re getting more value out of it. And so that’s the key thing. We want people to just start using it very actively right now and build those habits.
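The metering mechanics described in the quote (a plan-level monthly allocation of fast credits, with optional top-up packs once it runs out) can be sketched in a few lines. This is purely an illustrative model, not Adobe's implementation; all names and the choice of how purchased packs carry over are assumptions.

```python
# Illustrative sketch (not Adobe's billing system) of plan-based generative
# credit metering: a monthly allocation, plus purchasable top-up packs.

class CreditMeter:
    def __init__(self, monthly_allocation: int):
        self.monthly_allocation = monthly_allocation
        self.purchased_extra = 0
        self.used = 0

    def remaining(self) -> int:
        return self.monthly_allocation + self.purchased_extra - self.used

    def generate(self, credits: int = 1) -> bool:
        """Consume credits for one generation; False means the user is out
        of credits and would be offered an additional credit pack."""
        if self.remaining() < credits:
            return False
        self.used += credits
        return True

    def buy_pack(self, credits: int) -> None:
        self.purchased_extra += credits

    def reset_month(self) -> None:
        # Assumption: the monthly allocation renews; unused pack credits persist.
        self.used = 0
```

A free plan would simply instantiate this with a small allocation, and a paid plan with a much larger one, matching the "low limits for free users, no generation anxiety for paid users" framing above.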

Brands around the world are using Adobe’s generative AI – through products such as Adobe GenStudio – to create personalised customer experiences at scale; management sees Adobe GenStudio as a huge new opportunity; Adobe itself is using GenStudio for marketing its own products successfully and it’s using its own success as a selling point

Brands around the globe are working with Adobe to accelerate personalization at scale through generative AI. With the announcement of Adobe GenStudio, we are revolutionizing the entire content supply chain by simplifying the creation-to-activation process with generative AI capabilities and intelligent automation. Marketers and creative teams will now be able to create and modify commercially safe content to increase the scale and speed at which experiences are delivered…

…Shantanu and David already talked about the Adobe GenStudio, and we’re really excited about that. This is a unique opportunity, as you said, for enterprises to really create personalized content and drive efficiencies as well through automation. And when you look at the entire chain of what enterprises go through, from content creation, production workflow and then activation through DX through all the apps we have on our platform, we have the unique opportunity to do that. We already have deployed it within Adobe for our own Photoshop campaign, and we’re working with a number of agencies and customers to do that. So this is a big net new opportunity for us with Adobe GenStudio…

…And if I could actually just add one quick thing on the GenStudio work that Anil’s team has been doing, we’ve actually been using that within the Digital Media business already to release some of the campaigns that we’ve released this quarter. So it’s one of these things where it’s great to see the impact it’s having on our business, and that becomes a selling point for other businesses, too.

Inferencing costs for generative AI are expensive, but Adobe’s management is still confident of producing really strong margins for FY2023

[Question] We’ve been told generative AI is really expensive to run. The inference and training costs are really high. 

[Answer] Our customers have generated over 2 billion images. And I know it’s not lost on people, all this was done while we’re delivering strong margins. But when we take a step back and think about these technologies, we have investments from a COGS standpoint, inferencing, content; from an R&D standpoint, training, creating foundation models. And David alluded to it in his prepared comments, the image model for Firefly family of models is out, but we’re going to bring other media types to market as well so we’re making substantive investments. When I go back to the framing of my prepared comments, we really have a fundamental operating philosophy that’s been alive at the company for a long time: growth and profitability. We’re going to prioritize, we’re going to innovate and we’re going to execute with rigor…

…As we think about going — the profile going forward, what I’ll come back to is when we initially set fiscal 2023 targets, implicit in those targets was a 44.5% operating margin. If you think about how we just guided Q4… implicit in that guide is an operating margin of around 45.5%.

So as you think about us leading this industry, leading the inflection that’s unfolding in front of us, that mid-40s number, we think, is the right ballpark to think about the margin structure of the company as we continue to drive this technology and leadership. 

Adobe’s management thinks about generative AI’s impact on the company’s growth through two lenses: (1) acquiring new users, and (2) growing the spend of existing customers; for growing the spend of existing customers, Adobe has recently increased the pricing of its products

Yes, Shantanu said that we look at the business implications of this through those two lenses: new user adoption, first and foremost; and then sort of the opportunity to continue to grow the existing book of business. On the new user side, we’ve said this for years: our focus continues to be on proliferation. We believe we have a massive number of users in front of us. We continue to have our primary focus being net user adds and subscribers. And so the goal here in proliferation is to get the right value to the right audience at the right price…

…The second thing is going to be on the book of business. And here, basically, the pricing changes, just as a reminder, have a rolling impact.

Adobe’s management took a differentiated approach with Firefly when building the company’s generative AI capabilities, with a focus on using licensed content for training where Adobe has the rights to use the content 

So from the very beginning of Firefly, we took a very different approach to how we were doing generative. We started by looking at and working off the Adobe Stock base, which are contents that are licensed and very clearly we have the rights to use. And we looked at other repositories of content where they didn’t have any restrictions on usage, and we’ve pulled that in. So everything that we’ve trained on has gone through some form of moderation and has been cleared by our own legal teams for use in training. And what that means is that the content that we generate is, by definition, content that isn’t then stepping on anyone else’s brand and/or leveraging content that wasn’t intended to be used in this way. So that’s the foundation of what we’ve done.

Adobe is sharing the economic spoils with the creators of the content it has been training its generative AI models on

We’ve been working with our Stock contributors. We’ve announced, and in fact, yesterday, we had our first payout of contributions to contributors that have been participating and adding stock for the AI training. And we’re able to leverage that base very effectively so that if we see that we need additional training content, we can put a call to action, call for content, out to them, and they’re able to bring content to Adobe in a fully licensed way. So for example, earlier this quarter, we decided that we needed 1 million new images of crowd scenes. And so we put a call to action out. We were able to gather that content in. But it’s fully licensed and fully moderated in terms of what comes in. So as a result, all of the content we generate is safe for commercial use.

Adobe’s management is seeing that enterprise customers place a lot of importance on working with generative AI content that is commercially safe

The second thing is that because of that, we’re able to go to market and also indemnify customers in terms of how they’re actually leveraging that content and using it for content that’s being generated. And so enterprise customers find that to be very important as we bring that in not just in the context of Firefly stand-alone but we integrated into our Creative Cloud applications and Express applications as well. 

Adobe’s management has been very focused on generating fair (in population diversity, for example) and safe content in generative AI and they think this is a good business decision

We’ve been very focused on fair generation. So we look intentionally for diversity of people that are generated, and we’re looking to make sure that the content we generate doesn’t create or cause any harm. And all of those things are really good business decisions and differentiate us from others. 

One of the ways Adobe’s management thinks generative AI could be useful in PDFs is for companies to be able to have conversations with their own company-wide knowledge base that is stored in PDFs – Adobe is already enabling this through APIs

Some of the things that people really want to know is how can I have a conversational interface with the PDF that I have, not just the PDF that I have opened right now but the PDFs that are all across my folders, then across my entire enterprise knowledge management system, and then across the entire universe. So much like we are doing in Creative, where you can start to upload your images to train your own models within an enterprise, well, it is often [ hard-pressed ]. The number of customers who want to talk to us now that we’ve sort of designed this to be commercially safe and say, “Hey, how do we create our own model,” whether you’re a Coke or whether you’re a Nike, think of them as having that. I think in the document space, the same interest will happen, which is we have all our knowledge within an enterprise associated with PDFs, “Adobe, help me understand how your AI can start to deliver services like that.” So I think that’s the way you should also look at the PDF opportunity that exists, just more people taking advantage of the trillions of PDFs that are out there in the world and being able to do things…

… So part of what we are also doing with PDFs is the fact that you can have all of this now accessible through APIs. It’s not just the context of the PDF, the semantic understanding of that to do specific workflows, we’re starting to enable all of that as well. 
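The "conversational interface over a folder of PDFs" pattern the quote describes is essentially retrieval-augmented generation: index document text, retrieve the most relevant chunks for a question, and pass them to a language model. The sketch below is a deliberately minimal illustration of that flow; every name in it is an assumption, not an Adobe API, and the word-overlap retrieval stands in for the embedding search a real system would use.

```python
# Minimal sketch of "converse with your documents" (retrieval-augmented
# generation). Illustrative only; names are assumptions, not Adobe's APIs.

def chunk(text: str, size: int = 200) -> list[str]:
    """Split extracted document text into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank chunks by naive word overlap with the question. A production
    system would use embeddings and a vector index instead."""
    q = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def answer(question: str, documents: list[str], llm) -> str:
    """Assemble retrieved context and hand it to an LLM callable."""
    chunks = [c for doc in documents for c in chunk(doc)]
    context = "\n".join(retrieve(question, chunks))
    return llm(f"Context:\n{context}\n\nQuestion: {question}")
```

The same skeleton scales from one open PDF to an enterprise knowledge base: only the corpus behind `documents` and the retrieval index change, which is why exposing it through APIs, as the quote goes on to describe, is the natural packaging.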

When it comes to generative AI products, Adobe’s goal for enterprises and partners is to provide (1) API access, (2) ability to train their own models, and (3) core workflows that gel well with Adobe’s existing products; management is thinking about extending the same metering concepts as Adobe’s generative credits to API calls too

Our goal right now, for enterprises and third-parties that we work with, is to provide a few things. The first is this ability, obviously, to have API access to everything that we are building in, so that they can build it into their workflows and their automation stack. The second thing is to give them the ability to extend or train their own models as well. So if — as we mentioned earlier, our core model, foundation model is a very clean model. It generates great content and you can rely on it commercially. We want our customers and partners to be able to extend that model with content that is relevant to them so that Firefly is able to generate content in their brand or in their style. So we’ll give them the ability to train their own model as well. And then last, but certainly not least, we’ll give them some core workflows that will work with our existing products, whether it’s Express or whether it’s Creative Cloud or GenStudio as well, so that they can then integrate everything they’re doing onto our core platform.

And then from a monetization perspective, you can imagine the metering concepts that we have for generative credits extending to API calls as well. And of course, those will all be custom negotiated deals with partners and enterprises.

Adobe is its own biggest user of the AI products it has developed for customers – management thinks this is a big change for Adobe because the extent of usage internally of its AI products is huge, and it has helped improve the quality of the company’s AI products

So I think the pace of innovation internally of what we have done is actually truly amazing. I mean relative to a lot of the companies that are out there, and the fact that we’ve gone from talking about this to very, very quickly making it commercially available, I don’t want to take for granted the amount of work that went into that. I think internally, it has really galvanized us because we are our own biggest user of these technologies. What we are doing associated with the campaigns and the GenStudio that we are using, as David alluded to, our Photoshop Everyone Can campaign or the Acrobat’s Got It campaign or how we will be further delivering campaigns for Express as well as for Firefly, all of this is built on this technology. And we use Express every day, much like we use Acrobat every day. So I think it’s really enabled us to ask whether we are really embracing all of this technology within the company. And that’s been a big change because with the Creative products, we’ve certainly had phenomenal usage within the company, but the extent to which the 30,000 employees can now use our combined offering, that is very, very different internally.

DocuSign (NASDAQ: DOCU)

DocuSign has a new AI-powered feature named Liveness Detection for ID verification, which has reduced the time needed for document signings by about 60%

Liveness Detection technology leverages AI-powered biometric checks to prevent identity spoofing, which results in more accurate verification without the signee being present. ID Verification is already helping our customers. Our data shows that it has reduced time to sign by about 60%.

DocuSign is already monetising AI features directly

Today, we’re already monetizing AI directly through our CLM+ product and indirectly through its use in our products such as search. 

DocuSign is partnering with AI Labs to build products in closer collaboration with customers

Our next step on that journey is with AI Labs. With AI Labs, we are co-innovating with our customers. We provide a sandbox where customers can share a select subset of agreements and try new features we’re testing. Our customers get early access to developing technology, and we receive early feedback that we will incorporate into our products. By working with our customers in the development phase, we’re further reinforcing the trusted position we’ve earned over the last 20 years.

DocuSign’s management is excited about how AI – especially generative AI – can help the company across the entire agreement workflow

We think AI will impact practically all of our products at every step of the agreement workflow. So I don’t know that there’s just one call out. But maybe to call out a couple that I’m most interested in, I certainly think that the broader, should we say, agreement analytics category is poised to be completely revamped with generative AI.

DocuSign has been an early investor in AI but had been held back by fundamental technology until the introduction of generative AI

We were an early investor in that category. We saw that coming together with CLM 4 or 5 years ago and made a couple of strategic investments and have been a leader in that space, but have been held back by fundamental technology. And I think now with generative AI, we can do a substantially better job more seamlessly, lighter weight with less professional services. And so I’m very excited to think about how it transforms the CLM category and enables us to deliver more intelligent agreements. I think you mentioned IDV [ID Verification]. I agree 100%. Fundamentally, that entire category is AI-enabled. The upload and ingestion of your ID, the recognition of it, and then that Liveness Detection where we’re detecting who you are and that you are present and matching that to the ID, that would simply not be possible without today’s AI technology, and it just dramatically reshapes the ability to trade off risk and convenience. So I think that’s a good one.

MongoDB (NASDAQ: MDB)

There are 3 important things to focus on when migrating off a relational database, and MongoDB’s management thinks that generative AI can help with one of them (the rewriting of the application code)

So with regards to Gen AI, I mean, we do see opportunities. Essentially, when you migrate off a relational database using Relational Migrator, there are really 3 things you have to focus on. One is mapping the schema from the old relational database to the MongoDB platform, moving the data appropriately, and then also rewriting some, if not all, of the application code. Historically, that last component has been the most manually intensive part of the migration. Obviously, with the advance of code-generation tools, there are opportunities to automate the rewriting of the application code. I think we’re still in the very early days. You’ll see us continue to add new functionality to Relational Migrator to help again reduce the switching costs of doing so. And that’s obviously an area that we’re going to focus on.
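Of the three steps named in the quote, the schema-mapping step typically means collapsing relational joins into embedded documents. A minimal sketch of that transformation is below; the tables, field names, and embedding choice are all hypothetical, chosen only to illustrate the one-to-many case.

```python
# Illustrative sketch of relational-to-document schema mapping: a one-to-many
# customers/orders join becomes customer documents with embedded order arrays.
# Table and field names are hypothetical.

def to_documents(customers: list[dict], orders: list[dict]) -> list[dict]:
    """Embed each customer's orders inside the customer document."""
    by_customer: dict = {}
    for o in orders:
        by_customer.setdefault(o["customer_id"], []).append(
            {"order_id": o["order_id"], "total": o["total"]}
        )
    return [
        {
            "_id": c["customer_id"],          # natural key becomes _id
            "name": c["name"],
            "orders": by_customer.get(c["customer_id"], []),
        }
        for c in customers
    ]
```

The data-movement step then just inserts these documents, while the third step, rewriting the application code that used to issue SQL joins, is the manually intensive part the quote says code-generation tools can begin to automate.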

MongoDB introduced Atlas Vector Search, its vector database which allows developers to build AI applications, and it is seeing significant interest; management hopes to bring Atlas Vector Search to general availability (GA) sometime next year, but some customers are already deploying it in production

We also announced Atlas Vector Search, which enables developers to store, index and query vector embeddings, instead of having to bolt on vector search functionality separately, adding yet another point solution and creating a more fragmented developer experience. Developers can aggregate and process the vectorized data they need to build AI applications while also using MongoDB to aggregate and process data and metadata. We are seeing significant interest in our vector search offering from large and sophisticated enterprise customers even though it’s still only in preview. As one example, a large global management consulting firm is using Atlas Vector Search for internal research applications that allow consultants to semantically search over 1.5 million expert interview transcripts…

…Obviously, Vector is still in public preview. So we hope to have a GA sometime next year, but we’re really excited about the early and high interest from enterprises. And obviously, some customers are already deploying it in production, even though it’s a public preview product.
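For a sense of what "store, index and query vector embeddings" looks like in practice, Atlas exposes vector search as a `$vectorSearch` aggregation stage. The sketch below builds such a pipeline without connecting to a cluster; the index name, field path, and collection in the usage comment are assumptions for illustration.

```python
# Sketch of an Atlas Vector Search query via the `$vectorSearch` aggregation
# stage. Index and field names here are illustrative assumptions.

def vector_search_pipeline(query_vector: list[float],
                           index: str = "embedding_index",
                           path: str = "embedding",
                           limit: int = 5) -> list[dict]:
    return [
        {
            "$vectorSearch": {
                "index": index,               # Atlas vector index on the field
                "path": path,                 # document field holding the embedding
                "queryVector": query_vector,
                "numCandidates": limit * 20,  # oversample candidates for recall
                "limit": limit,
            }
        },
        # Keep only the fields the application needs, plus the similarity score.
        {"$project": {"text": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]

# With pymongo against a live cluster, this might run as:
# results = db.transcripts.aggregate(vector_search_pipeline(embed(query)))
```

Because this is an ordinary aggregation pipeline, further stages (filters, joins against metadata) can be appended, which is the "one platform" point management returns to below.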

MongoDB’s management believes that AI will lead developers to write more software and these software will be exceptionally demanding and will thus require high-performance databases

Over time, AI functionality will make developers more productive through the use of code generation and code assist tools that enable them to build more applications faster. Developers will also be able to enrich applications with compelling AI experiences by enabling integration with either proprietary or open source large language models to deliver more impact. Now instead of data being used only by data scientists to drive insights, data can be used by developers to build smarter applications that truly transform a business. These AI applications will be exceptionally demanding, requiring a truly modern operational data platform like MongoDB.

MongoDB’s management believes MongoDB has a bright future in the world of AI because (1) the company’s document database is highly versatile, (2) AI applications need a high-performant, scalable database and (3) AI applications have the same requirements for transactional guarantees, security, privacy etc as other applications

In fact, we believe MongoDB has an even stronger competitive advantage in the world of AI. First, the document model’s inherent flexibility and versatility render it a natural fit for AI applications. Developers can easily manage and process various data types all in one place. Second, AI applications require high performance, parallel computations and the ability to scale data processing on an ever-growing base of data. MongoDB supports this with features like sharding and auto-scaling. Lastly, it is important to remember AI applications have the same demands as any other type of application: transactional guarantees, security and privacy requirements, text search, in-app analytics and more. Our developer data platform gives developers a unified solution to build smarter AI applications.

AI startups as well as industrial equipment suppliers are using MongoDB for their AI needs 

We are seeing these applications developed across a wide variety of customer types and use cases. For example, Observe.ai is an AI start-up that leverages a 40-billion-parameter LLM to provide customers with intelligence and coaching that maximize performance of their frontline support and sales teams. Observe.ai processes and runs models on millions of support touch points daily to generate insights for their customers. Most of this rich, unstructured data is stored in MongoDB. Observe.ai chose to build on MongoDB because we enable them to quickly innovate, scale to handle large and unpredictable workloads and meet the security requirements of their largest enterprise customers. On the other end of the spectrum is one of the leading industrial equipment suppliers in North America. This company relies on Atlas and Atlas Device Sync to deploy AI models at the edge, to their field teams’ mobile devices, to better manage and predict inventory in areas with poor physical network connectivity. They chose MongoDB because of our ability to efficiently handle large quantities of distributed data and to seamlessly integrate between the network edge and their back-end systems.

MongoDB’s management sees customers saying that they prefer being able to have one platform handle all their data use-cases (AI included) rather than stitching point solutions together

People want to use one compelling, unified developer experience to address a wide variety of use cases, of which AI is just one. And we’re definitely hearing from customers that being able to do that on one platform versus bolting on a bunch of point solutions is far the more preferable approach. And so we’re excited about the opportunity there.

MongoDB is working with Google on a number of AI projects

On the other thing on partners, I do want to say that we’re seeing a lot of work and activity with our partner channel on the AI front as well. We’re working with Google in the AI start-up program, and there’s a lot of excitement. Google had their next conference this week. We’re also working with Google to help train Codey, their code generation tool to help people accelerate the development of AI and other applications. And we’re seeing a lot of interest in our own AI innovators program. We’ve had lots of customers apply for that program. So we’re super excited about the interest that we’re generating.

MongoDB’s management thinks there’s a lot of hype around AI in the short term, but also thinks that AI is going to have a huge impact in the long-term, with nearly every application having some AI functionality embedded within over the next 3-5 years

I firmly believe that we, as an industry, tend to overestimate the impact of a new technology in the short term and underestimate the impact in the long term. So as you may know, there’s a lot of hype in the market right now, in the industry, around AI, and some of the early-stage companies in the space have valuations through the roof. In some cases, it’s hard to see how people can make money because the risk-reward doesn’t seem to be sized appropriately. So there’s a lot of hype in the space. But I do think that AI will be a big impact for the industry and for us long term. I believe that almost every application, both new and existing, will have some AI functionality embedded into the application over the next 3 to 5 years.

MongoDB’s management thinks that vector search (the key distinguishing feature of vector databases) is just a feature and not a product, and it will eventually be built into every database as a feature

Vector Search is really a reverse index. So it’s like an index that’s built into all databases. I believe, over time, Vector Search functionality will be built into all databases or data platforms. There are some point products that are focused solely on Vector Search. But essentially, such a point product still needs to be used with other technologies like MongoDB to store the metadata and the data, to be able to process and analyze all that information. So developers have spoken loudly that having a unified and elegant developer experience is a key differentiator. It removes friction in how they work. It’s much easier to build and innovate on one platform versus learning and supporting multiple technologies. And so my strong belief is that, ultimately, Vector Search will be embedded in many platforms, and our differentiation will be, like it always has been, a very compelling and elegant developer experience.
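The claim that vector search is "just a feature" is easier to see once the core operation is written out: store embeddings next to the documents they describe and rank by cosine similarity. The brute-force sketch below is illustrative only (real databases use approximate-nearest-neighbor indexes rather than a full scan), and the record shape is an assumption.

```python
# Brute-force vector search sketch: embeddings stored alongside document
# metadata, ranked by cosine similarity. Real databases replace the full
# scan with an approximate-nearest-neighbor index.

import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest(query: list[float], docs: list[dict], k: int = 2) -> list[dict]:
    """docs: records like {"text": ..., "embedding": [...]}, sitting next to
    their metadata in a single store, as the quote argues they should."""
    return sorted(docs,
                  key=lambda d: cosine(query, d["embedding"]),
                  reverse=True)[:k]
```

Because the index is just one more structure over data the database already holds, keeping it in the same platform as the metadata avoids the separate provision-configure-sync work the next quote describes.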

MongoDB’s management thinks that having vector search as a feature in a database does not help companies to save costs, but instead, improves the overall developer experience

Question: I know that we’re talking about the developers and how they — they’re voting here because they want the data in a unified platform, a unified database that preserves all that metadata, right? But I would think there’s probably also a benefit to having it all in a single platform as well just because you’re lowering the TCO [total cost of ownership] for your customers as well, right? 

Answer: Vectors are really a mathematical representation of different types of data, so there is not a ton of data, unlike application search, where there are profound benefits to storing everything on one platform versus having an operational database and a search database and some glue to keep the data in sync. That’s not as much the case with Vector because you’re talking about storing essentially an elegant index. And so it’s more about the user experience and the development workflow that really matters. And what we believe is that offering the same taxonomy, in the same way they know how to use MongoDB, to also enable Vector Search functionality is a much more compelling differentiation than a developer having to bolt on a separate vector solution and having to provision, configure and manage that solution along with all the other things they have to do.

MongoDB’s management believes developers will become more important in organisations than data scientists because generative AI will position AI in front of software

Some of the use cases are really interesting, but the fact is that we’re really well positioned because what generative AI does is really instantiate AI in software, which means developers play a bigger role rather than data scientists, and that’s where you’ll really see the business impact. And I think that impact will be large over the next 3 to 5 years.

Okta (NASDAQ: OKTA)

Okta has been using AI for years and management believes that AI will be transformative for the identity market

AI is a paradigm shift in technology that presents transformative opportunities for identity, from stronger security and faster application development to better user experiences and more productive employees. Okta has been utilizing AI for years with machine learning models for spotting attack patterns and defending customers against threats, and we’ll have more exciting AI news to share at Oktane.

Okta’s management believes that every company must have an AI strategy, which will lead to more identities to be protected; a great example is how OpenAI is using Okta; Okta’s relationship with OpenAI started a few years ago and OpenAI is now a big customer, accounting for a significant chunk of the US$100m in TCV (total contract value) Okta had with its top 25 transactions in the quarter

Just like how every company has to be a technology company, I believe every company must have an AI strategy. More companies will be founded on AI, more applications will be developed with AI and more identities will need to be protected with a modern identity solution like Okta. A great example of this is how Okta’s Customer Identity Cloud is being utilized for the massive number of daily log-ins and authentications by OpenAI, which expanded its partnership with Okta again in Q2…

…So OpenAI is super interesting. OpenAI is a Customer Identity Cloud customer, so when you log in to ChatGPT, you log in through Okta. And it’s interesting because a developer inside of OpenAI 3 years ago picked our Customer Identity Cloud because it had a great developer experience, from the website, and started using it. And at the time, it was the log-in for their APIs, and then ChatGPT took off. And now, as you mentioned, we’ve had really pretty sizable transactions with them over the last couple of quarters. And so it’s a great testament to our strategy on Customer Identity, having something that appeals to developers.

And you saw they did something pretty interesting. This is really a B2C app, right, ChatGPT, but they recently launched their enterprise offering, and they want to connect ChatGPT to enterprises. Okta is really good at this, too, because our Customer Identity Cloud connects our customers to consumers, but also connects our customers to workforces. So then you have to start supporting things like Single Sign-On and SAML and OpenID and authorization. And so OpenAI just continues to get the benefits of being able to focus on what they want to focus on, which is obviously their models and the LLMs and the capabilities, and we can focus on the identity plumbing that wires it together.

So the transaction was one of the top 25 transactions I mentioned. The total TCV of those transactions this quarter was $100 million. It was one of those top 25 transactions, but I haven’t done the math on how much of the $100 million it was. But it was on the larger side this quarter.

Okta’s management thinks that identity is a key building block in a number of digital trends, including AI

It’s always a good reminder that identity is a key building block for Zero Trust security, digital transformation, cloud adoption projects and now AI. These trends will continue in any macroeconomic environment as organizations look for ways to become more efficient while strengthening their security posture.

Salesforce (NYSE: CRM)

Salesforce is driving an AI transformation to become the #1 AI CRM (customer relationship management)

And last quarter, we told you we’re now driving our AI transformation. We’re pioneering AI for both our customers and ourselves, leading the industry through this incredible new innovation cycle, and I couldn’t be happier with Srini and David and the entire product and technology team for the incredible velocity of AI products that were released to customers this quarter and the huge impact that they’re making in the market, showing how Salesforce is transforming from being not only the #1 CRM, but the #1 AI CRM, and I just express my sincere gratitude to our entire [ TNP ] team.

Salesforce’s management will continue to invest in AI

We’re in a new AI era, a new innovation cycle that we will continue to invest into as we have over the last decade. As a result, we expect nonlinear quarterly margins in the back half of this year, driven by investment timing, specifically in AI-focused R&D.

Salesforce’s management believes the world is at the dawn of an AI revolution that will spark a new tech buying cycle and investment cycle

AI, data, CRM, trust, let me tell you, we are at the dawn of an AI revolution. And as I’ve said, it’s a new innovation cycle which is sparking a new tech buying cycle over the coming years. It’s also a new tech investment cycle…

…And when we talk about growth, I think it’s going to start with AI. I think that AI is about to really ignite a buying revolution. I think we’ve already started to see that with our customers and even some of these new companies like OpenAI. And we certainly see that in our customer base as well.

Salesforce has been investing in many AI startups through its $500 million generative AI fund

We’ve been involved in the earliest rounds of many of the top AI start-ups. Many of you have seen that; we are in there very early…

… Now through our $500 million generative AI fund, we’re seeing the development of ethical AI with amazing companies like Anthropic, [ Cohere ], Hugging Face and some others…

Salesforce has been working on AI early on

But I’ll tell you, this company has pioneered AI, and not just in predictive: a lot of you have followed the development and growth of Einstein. But also, you’ve seen that we’ve published some of the first papers on prompt engineering in the beginnings of generative AI, and we took our deep learning roots and really demonstrated the potential for generative AI, and now we see so many of these companies become so successful.

Every CEO Salesforce’s leadership has met thinks that AI is essential to improving their businesses

So every CEO I’ve met with this year across every industry believes that AI is essential to improving both their top and bottom line, but especially their productivity. AI is just augmenting what we can do every single day…

…I think many of our customers and ultimately, all of them believe they can grow their businesses by becoming more connected to their customers than ever before through AI and at the same time, reduce cost, increase productivity, drive efficiency and exceed customer expectations through AI. 

All management teams in Salesforce are using Einstein AI to improve their decision-making

Every single management team that we have here at Salesforce, every week, we’re using our Einstein AI to do exactly the same thing. We go back and try to augment ourselves using Einstein. So what we’ll say is (and we’ve been doing this for a while now; it’s super impressive), we’ll say, okay, Brian, what do you think our number is, and we’ll say, okay, that’s very nice, Brian. But Einstein, what do you really think the number is? And then Einstein will say, I think Brian is sandbagging, and then the meeting continues.

Salesforce’s management thinks that every company will undergo an AI transformation with the customer at the centre, and this is why Salesforce is well positioned for the future

The reality is every company will undergo an AI transformation with the customer at the center, because every AI transformation begins and ends with the customer, and that’s why Salesforce is really well positioned for the future.

Salesforce has been investing a lot in Einstein AI, and Einstein is democratising generative AI for users of Salesforce’s products; Salesforce’s management thinks that the real value Salesforce brings to the world is the ability to help users utilise AI in a low code or no code way 

And with this incredible technology, Einstein, that we’ve invested so much in and grown and integrated into our core technology base, we’re democratizing generative AI, making it very easy for our customers to implement in every job, every business, in every industry. And I will just say that in the last few months, we’ve injected a new layer of generative AI assistance across all of the Customer 360. And you can see it with our salespeople who are now using our Sales Cloud GPT, which has been incredible, what we’ve released this quarter to all of our customers and here inside Salesforce. And when we see that, they all say to themselves, you know what, in this new world, everyone can now be an Einstein.

But democratizing generative AI at scale for the biggest brands in the world requires more than just large language models and deep learning algorithms, and we all know that because a lot of our customers think they can go and pull something off a Hugging Face (it is an amazing company; we just invested in their new round) and grab a model and put some data in it, and nothing happens. And then they don’t understand, and they call us and say, “Hey, what’s happening here? I thought that this AI was so amazing.” And it’s like, well, it takes a lot to actually get this intelligence to occur. And that’s the value that Salesforce is bringing: we’re really able to help our customers achieve this kind of technological superiority right out of the box just using our products in a low code, no code way. It’s really just the democratization of generative AI at scale. And that is really what we’re trying to achieve: at the heart of every one of these AI transformations becomes our intelligent, integrated and incredible Salesforce platform, and we’re going to show all of that at Dreamforce…

Salesforce is seeing strong customer momentum on Einstein generative AI (a customer – PenFed – used Einstein-powered chatbots to significantly improve their customer service)

We’re also seeing strong customer momentum on Einstein generative AI. PenFed is a great example of how AI plus data plus CRM plus trust is driving growth for our customers. PenFed is one of the largest credit unions in the U.S., growing at a rate of the next 9 credit unions combined. They’re already using Financial Services Cloud, Experience Cloud and MuleSoft, and our Einstein-powered chatbots are handling 40,000 customer service sessions per month. In fact, today, PenFed resolves 20% of their cases on first contact with Einstein-powered chatbots, resulting in a 223% increase in chatbot activity in the past year with incredible ROI. In Q2, PenFed expanded with Data Cloud to unify all the customer data from its nearly 3 million members and increased their use of Einstein to roll out a generative AI assistant for every single one of their service agents.

Salesforce’s management thinks that customers who want to achieve success with AI need to have their data in order

But what you can see with Data Cloud is that customers must get their data together if they want to achieve success with AI. This is the critical first step for every single customer. And we’re going to see that this AI revolution is really a data revolution. 

Salesforce takes the issue of trust very seriously in its AI work; Salesforce has built a unique trust layer within Einstein that allows customers to maintain data privacy, security, and more

Everything Einstein does is also delivered with trust and especially ethics at the center, and I especially want to call out the incredible work of our Office of Ethical and Humane Use, pioneering the use of ethics in technology. If you haven’t read their incredible article in HBR this quarter, you should; it was awesome. They are doing incredible work, really saying that it’s not just about AI, it’s not just about data, but it’s also about trust and ethics. And that’s why we developed this Einstein trust layer. This is completely unique in the industry. It enables our customers to maintain their data privacy, security, residency and compliance goals.

Salesforce has seen customers from diverse industries (such as Heathrow Airport and Schneider Electric) find success using Salesforce’s AI tools

Heathrow is a great example of the transformative power of AI, data, CRM and trust, and the power of a single source of truth. They have 70 million passengers who pass through their terminals annually; I’m sure many of you have been one of those passengers, as have I. Heathrow is operating at tremendous scale, managing the entire airport experience with the Service Cloud, Marketing Cloud and Commerce Cloud. But now Heathrow has added Data Cloud as well, giving them a single source of truth for every customer interaction and setting them up to pioneer the AI revolution. And with Einstein, Heathrow’s service agents now have AI-assisted generative replies to service inquiries, case deflection and case summaries, with all the relevant data and business context coming from Data Cloud…

…Schneider Electric has been using Customer 360 for over a decade, enhancing customer engagement, service and efficiency. With Einstein, Schneider has refined demand generation, reduced close times by 30%. And through Salesforce Flow, they’ve automated order fulfillment. And with Service Cloud, they’re handling over 8 million support interactions annually, much of it done on our self-service offering. In Q2, Schneider selected Marketing Cloud to further personalize the customer experience.

Salesforce’s management thinks the company is only near the beginning of the AI evolution and there are four major steps on how the evolution will happen

And let me just say, we’re at the beginning of quite a ballgame here and we’re really looking at the evolution of artificial intelligence in a broad way, and you’re really going to see it take place over 4 major zones.

And the first major zone is what’s played out in the last decade, which has been predictive. That’s been amazing. That’s why Salesforce will deliver about [ 1 trillion ] transactions on Einstein this week. It’s incredible. 

These are mostly predictive transactions, but we’re moving rapidly into the second zone that we all know is generative AI and these GPT products, which we’ve now released to our customers. We’re very excited about the speed of our engineering organization and technology organization, our product organization and their ability to deliver customer value with generative AI. We have tremendous AI expertise led by an incredible AI research team. And this idea that we’re kind of now in a generative zone means that’s zone #2.

But as you’re going to see at Dreamforce, zone #3 is opening up with autonomous and with agent-based systems as well. This will be another level of growth and another level of innovation that we haven’t really seen unfold yet from a lot of companies, and that’s an area that we are excited to do a lot of innovation and growth and to help our customers in all those areas.

And then we’re eventually going to move into [ AGI ] and that will be the fourth area. And I think as we move through these 4 zones, CRM will become more important to our customers than ever before. Because you’re going to be able to get more automation, more intelligence, more productivity, more capabilities, more augmentation of your employees, as I mentioned.

Salesforce can use AI to help its customers in areas such as call summaries, account overviews, responding to its customers’ customers, and more

And you’re right, we’re going to see a wide variety of capabilities, exactly like you said, whether it’s the call summaries and account overviews and deal insights and inside summaries and in-product assistance or mobile work briefings. I mean, when I look at things like service, we see the amount of case deflection we can do and productivity enhancements with our service teams, not just in replies and answers, but also in summaries and summarization. We’ve seen how that works with generative and how important that is in knowledge generation and auto-responding conversations, and then we’re going to have the ability for our customers to — with our product.

Salesforce has its own AI models, but Salesforce has an open system – it’s allowing customers to choose any models they wish

We have an open system. We’re not dictating that they have to use any one of these AI systems. We have an ecosystem. Of course, we have our own models and our own technology that we have given to our customers, but we’re also investing in all of these companies, and we plan to be able to offer them as opportunities for those customers as well, and they’ll be able to deliver all kinds of things. And you’ll see that, whether it’s going to end up being contract digitization and cost generation or survey generators or all kinds of campaign assistance.

Slack is going to be an important component of Salesforce’s AI-related work; management sees Slack as an easy-to-use interface for Salesforce’s AI systems

Slack has become incredible for these AI companies; every AI company that we’ve met with is a Slack company. All of them make their agents available for Slack first. We saw that, for example, with Anthropic, where Claude really appeared first, and [ Claude 2 ] first, in Slack.

And Anthropic, as a company, uses Slack internally, and they take their technology and develop news digests every day and newsletters; they do incredible things with Slack. Slack is just a treasure trove of information for artificial intelligence, and you’ll see us deliver all kinds of new capabilities in Slack along these lines.

And we’re working, as I’ve mentioned, to get Slack to wake up and become more aware, and also for Slack to be able to do all of the things that I just mentioned. One of the most exciting things I think you’re going to see at Dreamforce is Slack very much as a vision for the front end of all of our core products. We’re going to show you an incredible new capability that we call Slack Sales Elevate, which brings our core Sales Cloud system right inside Slack.

That’s going to be amazing, and we’re also going to see how we’re going to release and deliver all of our core services in Salesforce through Slack. This is very important for our company: to deliver Slack very much as a tremendously easy-to-use interface on the core Salesforce, but also all these AI systems. So all of that is that next generation of artificial intelligence capability, and I’m really excited to show all of that to you at Dreamforce, as well as Data Cloud.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, DocuSign, MongoDB, Okta, and Salesforce. Holdings are subject to change at any time.