More Thoughts From American Technology Companies On AI

A vast collection of notable quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies.

Nearly a month ago, I published What American Technology Companies Are Thinking About AI. In it, I shared commentary, made in earnings conference calls, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industries and the business world writ large.

A few more technology companies I’m watching hosted earnings conference calls after the article was published. The leaders of these companies also had insights on AI that I think would be useful to share. Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management will be building foundational models as well as using generative AI as co-pilots for users

Our generative AI strategy focuses on data, models and interfaces. Our rich datasets across creativity, documents and customer experiences enable us to train models on the highest quality assets. We will build foundation models in the categories where we have deep domain expertise, including imaging, vector, video, documents and marketing. We are bringing generative AI to life as a co-pilot across our incredible array of interfaces to deliver magic and productivity gains for a broader set of customers. 

Adobe’s management thinks its generative AI feature, Firefly, has multiple monetisation opportunities, but will only introduce specific pricing later this year; the focus right now is on driving adoption

Our generative AI offerings represent additional customer value as well as multiple new monetization opportunities. First, Firefly will be available both as a stand-alone freemium offering for consumers as well as an enterprise offering announced last week. Second, copilot generative AI functionality within our flagship applications will drive higher ARPUs and retention. Third, subscription credit packs will be made available for customers who need to generate greater amounts of content. Fourth, we will offer developer communities access to Firefly APIs and allow enterprises the ability to create exclusive custom models with their proprietary content. And finally, the industry partnerships as well as Firefly represent exciting new top-of-funnel acquisition opportunities for Express, Creative Cloud and Document Cloud. Our priority for now is to get Firefly broadly adopted, and we will introduce specific pricing later this year.

Adobe is seeing outstanding customer demand for generative AI features

We’re really excited, if you can’t tell on the call, about Firefly and what this represents. The early customer and community response has been absolutely exhilarating for all of us. You heard us talk about over 0.5 billion assets that have already been generated. Generations from Photoshop were 80x higher than we had originally projected going into the beta, and obviously, we feel really good about both the quality of the content being created and also the ability to scale the product to support that.

Adobe has built Firefly to be both commercially as well as socially safe for use

Third is that, and perhaps most importantly, we’ve also been able to — because of the way we share and are transparent about where we get our content, we can tell customers that their content generated with Firefly is commercially safe for use. Copyrights are not being violated. Diversity and inclusion is front and center. Harmful imagery is not being generated.

Adobe’s management believes that (1) marketing will become increasingly personalised, (2) the personalisation has to be done at scale, and (3) Adobe can help customers achieve the personalisation with the data that it has

I think if you look at Express and Firefly and also the Sensei GenAI services that we announced for Digital Experience, it comes at a time when marketing is going through a big shift from sort of mass marketing to personalized marketing at scale. And for the personalization at scale, everything has to be personalized, whether it’s content or audiences or customer journeys. And that’s the unique advantage we have. We have the data within the audience — the Adobe Experience Platform with the real-time customer profiles. We then have the models that we’re working with like Firefly. And then we have the interfaces through the apps like Adobe Campaign, Adobe Experience Manager and so on. So we can put all of that together in a manner that’s really consistent with the data governance that customers expect, so that their data is used only in their context, and use that to do personalized marketing at scale. So it really fits very well together.

Adobe’s management believes that content production will increase significantly in the next few years because of AI, and this will lead to higher demand for more software seats

And we’re sitting at a moment where companies are telling us that there’s a 5x increase in content production coming out in the next few — next couple of years. And you see a host of new media types coming out. And we see the opportunity here for both seat expansion as a result of this and also because of the value we’re adding into our products themselves, increase in ARPU as well.

DocuSign (NASDAQ: DOCU)

DocuSign’s management believes that generative AI can transform all aspects of the agreement workflow

In brief, we believe AI unlocks the true potential of the intelligent agreement category. We already have a strong track record leveraging sophisticated AI models, having built and shipped solutions based on earlier generations of AI. Generative AI can transform all aspects of agreement workflow, and we are uniquely positioned to capitalize on this opportunity. As an early example, we recently introduced a new limited-availability feature, agreement summarization. This new feature, which is enabled by our integration with Microsoft’s Azure OpenAI Service and tuned with our own proprietary agreement model, uses AI to summarize a document’s critical components, giving signers a clear grasp of the most relevant information within their agreement, while respecting data security and privacy.
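
DocuSign has not published how the summarization feature works beyond naming the Azure OpenAI integration, but the general shape of such a pipeline can be sketched: pull out the clauses a signer most needs to see, then hand them to a model for summarization. Everything below is illustrative; the clause patterns are hypothetical and the model call is stubbed out rather than hitting a real service.

```python
import re

# Stand-in for the tuned model call; DocuSign's actual feature runs on
# Azure OpenAI and is not reproduced here.
def summarise_with_model(clauses):
    return " ".join(f"- {c}" for c in clauses)

def extract_critical_components(agreement):
    """Pull the clauses a signer most needs to see (illustrative heuristics)."""
    patterns = [r"termination[^.]*\.", r"payment[^.]*\.", r"liability[^.]*\."]
    found = []
    for p in patterns:
        m = re.search(p, agreement, flags=re.IGNORECASE)
        if m:
            found.append(m.group(0).strip())
    return found

agreement = (
    "Payment is due within 30 days of invoice. "
    "Either party may pursue termination with 60 days notice. "
    "Liability is capped at fees paid."
)
clauses = extract_critical_components(agreement)
summary = summarise_with_model(clauses)
print(len(clauses))  # 3
```

In a production system the regex heuristics would be replaced by the tuned model itself; the point is only that extraction and summarization are separable steps.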

Some possible future launches of generative AI features by DocuSign include search capabilities across agreement libraries and edits of documents based on industry best practices

Future launches will include search across customer agreement libraries, extractions from agreements and proposed language and edits based on customer, industry and universal best practices.

DocuSign has been working with AI for several years, but management sees the introduction of generative AI as a great opportunity to drive significant improvements to the company’s software products

I’d add to that, that I think the biggest change in our road map beyond that clear focus and articulation on agreement workflow is really the advent of generative AI. We’ve been working on AI for several years. As you know, we have products like Insights that leverage earlier generations of AI models. But given the enormous change there, that’s a fantastic opportunity to really unlock the category. And so, we’re investing very heavily there. We released some new products, and we’ll release more next week at Momentum, but I’m sure we’ll talk more about AI during the call. 

DocuSign’s management sees AI technology as the biggest long-term driver of the company’s growth

So, I think we — overall, I would say, product innovation is going to be the biggest driver and unlocker of our medium- to long-term growth. We do believe that we have very credible low-hanging fruit from better execution on our self-serve and product-backed growth motion. And so, that’s a top priority to drive greater efficiency in the near to medium term. I think the AI impact is perhaps the biggest in the long term. And we are starting to ship products, as I alluded to, and we’ll announce more next week. But in terms of its overall impact on the business, I think it’s still behind the other two in the — in the near to medium term. But in terms of the long-term potential of our category of agreement workflow, I think it’s a massive unlock and a fantastic opportunity for DocuSign.

DocuSign’s management is currently monetising AI by bundling AI features with existing products in some cases, and charging for AI features as add-ons in others; management needs to learn more about how customers use the AI features before finalising pricing

In terms of monetization, I expect AI features to be both bundled as part of our baseline products, strengthening their functionality and value, as I suggested earlier, and in some cases packaged as a separately charged add-on. We do both today. If you take our Insights product, which is really our AI-driven analytics product for CLM, we have both a stand-alone SKU that’s sold separately as well as a premium bundle. I think we’re going to need to learn a little bit more about how customers want to use this and what the key value drivers are before we finalize how we price the different features, but we are certainly mindful of wanting to deliver the most value and capture the most value for DocuSign as we price it.

MongoDB (NASDAQ: MDB)

MongoDB’s management believes that AI will increase software development velocity and will enable more companies to launch more apps, leading to the speed of software development being even more important for companies

We believe AI will be the next frontier of development productivity — developer productivity and will likely lead to a step function increase in software development velocity. We know that most organizations have a huge backlog of projects they would like to take on but they just don’t have the development capacity to pursue. As developer productivity meaningfully improves, companies can dramatically increase their software ambitions and rapidly launch many more applications to transform their business. Consequently, the importance of development velocity to remain competitive will be even more pronounced. Said another way, if you are slow, then you are obsolete.

Companies are increasingly choosing MongoDB’s Atlas database service as the platform to build and run new AI apps

We are observing an emerging trend where customers are increasingly choosing Atlas as the platform to build and run new AI applications. For example, in Q1, more than 200 of the new Atlas customers were AI or ML companies. Well-financed start-ups like Hugging Face, [ Tekion ], One AI and [ Neura ] are examples of companies using MongoDB to help deliver the next wave of AI-powered applications to their customers.

MongoDB’s management believes that apps on legacy platforms will be replatformed to be AI-enabled, and those apps will need to migrate to MongoDB

We also believe that many existing applications will be replatformed to be AI enabled. This will be a compelling reason for customers to migrate from legacy technologies to MongoDB.

MongoDB’s management believes that in an increasingly AI-driven world, (1) AI will lead to more apps and more data storage demand for MongoDB; (2) developers will want to use modern databases like MongoDB to build; and (3) MongoDB can support wide use-cases, so it’s attractive to use MongoDB

First, we expect MongoDB to be a net beneficiary of AI, the reason being is that, as developer productivity increases, the volume of new applications will increase, which by definition will create new apps, which means more data stores, so driving more demand for MongoDB. Second, developers will be attracted to modern platforms like MongoDB because that’s the place where they can build these modern next-generation applications. And third, because of the breadth of our platform and the wide variety of use cases we support, that becomes even more of an impetus to use MongoDB. 

MongoDB’s management knows that AI requires vector databases, but thinks that AI still needs an operational datastore, which is where MongoDB excels in

The results that come from training an LLM against content are known as vector embeddings. And so content is assigned vectors and the vectors are stored in a database. These databases then facilitate searches when users query a large language model with the appropriate vector embeddings, and it’s essentially how a user search is matched to content from an LLM. The key point, though, is that you still need an operational data store to store the actual data. And there are some adjunct solutions out there that have come out that are bespoke solutions but are not tied to actually where the data resides, so it’s not the best developer experience. And I believe that, over time, people will gravitate to a more seamless and integrated platform that offers a compelling user experience…

…Again, for generating content that’s accurate in a performant way, you do need to use vector embeddings which are stored in a database. But you also need to store the data, and you want to be able to offer a very compelling and seamless developer experience and be able to offer that as part of a broader platform. I think what you’ve seen, Brent, is that there’s been other trends, things like graph and time series, where a lot of people are very excited about these kind of bespoke single-function technologies, but over time, they got subsumed into a broader platform because it didn’t make sense for customers to have all these bespoke solutions which added so much complexity to their data architecture.
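
To make the pattern management describes concrete, here is a minimal, self-contained sketch: each record in the operational store keeps its vector alongside the source document, so one platform serves both retrieval and the data itself. The bag-of-words “embedding” is a toy stand-in for a real embedding model, and the brute-force search is only illustrative, not how MongoDB Atlas actually implements vector search.

```python
import math
from collections import Counter

# Toy embedding: a bag-of-words vector. A real system would use a
# learned embedding model; this only illustrates the flow.
def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# The "operational store": the document and its vector live together,
# so there is no separate bespoke vector database to keep in sync.
store = []

def insert(doc_id, text):
    store.append({"_id": doc_id, "text": text, "vector": embed(text)})

def vector_search(query, k=1):
    qv = embed(query)
    ranked = sorted(store, key=lambda d: cosine(qv, d["vector"]), reverse=True)
    return ranked[:k]

insert("a", "invoice for cloud database services")
insert("b", "recipe for tomato soup")
hits = vector_search("database invoice")
print(hits[0]["_id"])  # the invoice document ranks first
```

The design point in the quote is exactly this co-location: because the vector sits next to the record, retrieval results come back with the actual data attached, with no second system to query.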

Okta (NASDAQ: OKTA)

Okta has been working with AI for a number of years and some of its products contain AI features

So when we look at our own business, we have AI in our products, and we have for a few years — whether it’s ThreatInsight on the workforce side or Security Center on the customer identity side — which look at our billions of authentications and use AI to make sure we defend customers from the types of threats that have been prosecuted against other customers on the platform.

Okta’s management thinks AI could be really useful for helping users to auto-configure the set of Okta

One of the ideas that we’re working on that might be a typical use case of how someone like us could use AI is configuring Okta, setting the policy up for Okta across hundreds of applications on the workforce side or 10 or 20 applications on the customer identity side with various access policies and rules about who can access them and how they access them. It’d be pretty complicated to set up, but we’ve actually been prototyping using AI to auto-generate that configuration.
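
Okta has not described its prototype’s internals, but any “AI writes the config” feature would presumably validate the generated policy before ever applying it. Below is a minimal sketch of that validate-before-apply step; the policy format and field names are hypothetical, invented for illustration only.

```python
import json

# Hypothetical policy schema for illustration; Okta has not published
# the format its prototype generates.
REQUIRED_FIELDS = {"app", "groups", "mfa_required"}

def validate_policy(raw):
    """Check an AI-generated access policy before it is ever applied."""
    policy = json.loads(raw)
    missing = REQUIRED_FIELDS - policy.keys()
    if missing:
        raise ValueError(f"generated policy missing fields: {missing}")
    if not isinstance(policy["mfa_required"], bool):
        raise ValueError("mfa_required must be a boolean")
    return policy

# Imagine this string came back from the model:
generated = '{"app": "salesforce", "groups": ["sales"], "mfa_required": true}'
policy = validate_policy(generated)
print(policy["app"])  # salesforce
```

For security configuration especially, this gate matters: a model can draft policies across hundreds of applications far faster than an administrator, but nothing the model emits should take effect until it passes deterministic checks.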

Okta’s management believes that AI will lead to higher demand for identity-use cases for the company

And then the other one we’re excited about is if you zoom out and you think this is a huge platform shift, it’s the next generation of technology. So that means that there’s going to be tons of new applications built with AI. It means that there’s going to be tons of new industries created and industries changed. And there’s going to be a login for all these things. You’re going to need to log on to these experiences. Sometimes it’s going to be machines. Sometimes it’s going to be users. That’s an identity problem, and we can help with that. So in a sense, we’re really going to be selling picks and shovels to the gold miners. 

Salesforce (NYSE: CRM)

Salesforce recently launched EinsteinGPT, a form of generative AI for customer relationship management

Last quarter, I told you of how our AI team is getting ready to launch EinsteinGPT, the world’s first generative AI for CRM. At Trailhead DX in March in front of thousands of trailblazers here in San Francisco, that’s exactly what we did. 

Salesforce announced SlackGPT, an AI assistant for users of the communication software Slack; management also believes that unleashing large language models within Slack can make the software incredibly valuable for users

We saw more of the incredible work of our AI team at our New York City World Tour this month when we demonstrated Slack GPT. Slack is a secure treasure trove of company data that generative AI can use to give every company and every employee their own powerful AI assistant, helping every employee be more productive and transforming the future of work. SlackGPT can leverage the power of generative AI to deliver instant conversation summaries, research tools and writing assistance directly in Slack. And you may never need to leave Slack to get a question answered. Slack is the perfect conversational interface for working with LLMs, which is why so many AI companies are Slack first and why OpenAI’s ChatGPT and Anthropic’s Claude can now use Slack as a native interface…

…I think folks know, my neighbor Sam Altman is the CEO of OpenAI, and I went over to his house for dinner, and it was a great conversation as it always is with him. And he said, “Oh, just hold on one second, Marc, I want to get my laptop.” And he brought his laptop out and gave me some demonstrations of advanced technologies that are not appropriate for the call. But I did notice that there was only one application that he was using on his laptop and that was Slack. And the powerful part about that was I realized that everything from day 1 at OpenAI has been in Slack. And as we kind of brainstormed and talked — of course, he was paying a Slack user fee and on and on, and he’s a great Slack customer; we’ve done a video about them, it’s on YouTube — I realized that taking an LLM and embedding it inside Slack, well, maybe Slack will wake up. I mean there is so much data in Slack, I wonder if it could tell him what are the opportunities in OpenAI. What are the conflicts, what are the conversations, what should be his prioritization. What is the big product that got repressed that he never knew about.

And I realized in my own version of Slack at Salesforce, I have over 95 million Slack messages, and these are all open messages. I’m not talking about closed messaging or direct messaging or secure messaging between employees. I’m talking about the open framework that’s going on inside Salesforce and with so many of our customers. And then I realized, wow, I think Slack could wake up, and it could become a tremendous asset with an LLM consuming all that data and driving it. And then, of course, the idea is that is a new version of Slack. Not only do you have the free version of Slack, not only do you have the per user version of Slack, but then you have the additional LLM version of Slack. 

Salesforce is working with luxury brand Gucci to augment its client advisers by building AI chat technology

A great example already deploying this technology is Gucci. We’re working with them to augment their client advisers by building AI chat technology that creates a Gucci-fied tone of service with an incredible new voice, amplifying brand storytelling and driving incremental sales as well. It’s an incredibly exciting vision for generative AI to transform what was customer service into customer service, marketing and sales, all through augmenting Gucci employee capabilities using this amazing generative AI.

Salesforce’s management believes that Salesforce’s AI features can (1) help financial services companies improve the capabilities of their employees and (2) provide data-security for highly regulated companies when their data is used in AI models

But yesterday, there were many questions from my friend — I’m not going to give you his name because he’s the CEO of one of the largest and most important banks in the world. And I’ll just say that, of course, his primary focus is on productivity. He knows that he wants to make his bankers a lot more successful. He wants every banker to be able to rewrite a mortgage, but not every banker can, because writing the mortgage takes a lot of technical expertise. But as we showed him in the meeting through a combination of Tableau, which we demonstrated, and Slack, which we demonstrated, and Salesforce’s Financial Services Cloud, which he has tens of thousands of users on, that banker understood that this would be incredible. But I also emphasized to him that LLMs, or large language models, have a voracious appetite for data. They want every piece of data that they can consume. But through his regulatory standards, he cannot deliver all that data into the LLM because it becomes amalgamated. Today, he runs on Salesforce, and his data is secured down to the row and cell level.

Salesforce’s management believes that the technology sector experienced a “COVID super cycle” in 2020 and 2021 that made 2022 difficult for companies in the sector, but that tech could see an acceleration in growth in the future from an “AI super cycle”

I just really think you have to look at 2020, 2021 was just this massive super cycle called the pandemic. I don’t know if you remember, but we had a pandemic a couple of years ago. And during that, we saw tech buying like we never saw. It was incredible and everybody surged on tech buying. So you’re really looking at comparisons against that huge mega cycle… 

…That’s also what gives me tremendous confidence going forward and that what we’re really seeing is that customers are absorbing the huge amounts of technology that they bought. And that is about to come, I believe, to a close. I can’t give you the exact date, and it’s going to be accelerated by this AI super cycle.

Salesforce is doing a lot of work on data security when it comes to developing its AI features

For example, we are doing a lot of things at the basic security level, like tenant-level isolation coupled with a zero-retention architecture at the LLM level, so the LLM doesn’t remember any of the data. Along with that, for customers to use these use cases, they have a lot of compliance requirements like GDPR, ISO, SOC, Quadrant, and they want to ensure that those compliances are still valid, and we’re going to solve for that. In addition, the big worry everybody has is what people have heard about — hallucinations, toxicity, bias; this is what we call model trust. We have a lot of innovation around how to ground the data on 360 data, which is a huge advantage we have, and we are able to do a lot of things at that level. And then the thing, which I think Marc hinted at, is that LLMs are not like a database. There is intra-enterprise trust: even once you have an LLM, you can’t open the data to everybody in the company. So you need the ability to control who can access this data, both before the query and after the query; we have to build that.
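
Salesforce has not published the internals of this trust layer, but the “control access before the query” idea can be sketched: only rows the requesting user is permitted to see are used to ground the prompt, so restricted data never reaches the model at all. The record layout and ownership rule below are hypothetical, purely for illustration.

```python
# Hypothetical row-level permissions; this only illustrates filtering
# data BEFORE it is handed to an LLM, not Salesforce's actual design.
RECORDS = [
    {"id": 1, "owner": "alice", "text": "Q1 pipeline notes"},
    {"id": 2, "owner": "bob",   "text": "Exec comp discussion"},
]

def visible_records(user):
    """Apply row-level security before any AI step runs."""
    return [r for r in RECORDS if r["owner"] == user]

def build_prompt(user, question):
    # Ground the model only on rows this user may see; with a
    # zero-retention setup, the model also keeps none of it afterwards.
    context = "\n".join(r["text"] for r in visible_records(user))
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("alice", "Summarise my notes")
print("Exec comp" in prompt)  # False: bob's row was filtered out
```

The same filter would run again on the model’s output (the “after the query” check in the quote), since grounding alone does not guarantee the response stays within a user’s entitlements.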

Salesforce is importing more than 7 trillion records a month into its Data Cloud to build AI features, and management believes this is a valuable trove of data

And by the way, the Data Cloud, just in a month, we are importing more than 7 trillion records into the data layer, which is a very powerful asset we have. So coupled with all of this, customers are looking for guidance on how we think we can deliver significant value to them.

Salesforce’s management sees generative AI as a useful tool to help non-technical users write software

But you can also imagine, for example, even with Salesforce, the ability, as we’re going to see in June, that many of our trailblazers are amazing low-code, no-code trailblazers, but soon they’ll have the ability to tap into our LLMs like ProGen and Cogen that have the ability to code for them automatically. They aren’t coders. They didn’t graduate with computer science degrees.

The arc of progress that Salesforce’s management sees with AI: Predictive, then generative, then autonomous

So I think the way I see it is this: AI technologies are a continuum that is first predictive, then generative, and the real long-term goal is autonomous. The initial version of the generative AI will be more in terms of assistance…

… And then I think the fully autonomous cases, for example, in our own internal use cases with our models, we are able to detect 60% of incidents and auto-remediate. That requires a little bit more fine-tuning, and we’ll have to work with specific customers to get to that level of model performance. So I see this as just the start. The assistant model is the initial thing to build trust, with a human in the loop to validate it. And then as the models get better and better, we’ll keep taking use cases where we can fully automate it.

AI has already improved the productivity of Salesforce’s software developers by at least 20% and management thinks the same productivity-boost can happen for Salesforce’s customers

But the other use cases, which we are going to see, and in fact, I have rolled out our own code elements in our engineering org and we are already seeing minimum 20% productivity…

…In some cases, up to 30%. Now a lot of our customers are asking for the same. We are going to roll out Einstein GPT for our developers in the ecosystem, which will help not only the local developers bridge the gap where there’s a talent gap, but also reduce the cost of implementations for a lot of people. So there’s a lot of value.

Veeva Systems (NYSE: VEEV)

Veeva Systems recently announced an AI chatbot for field sales reps and management is not thinking about the chatbot’s monetisation at the moment

CRM Bot is an AI application for Vault CRM. You can think of it as ChatGPT for field teams…

…Yes, it’s really early right now. We’re focused on ensuring that we have the right product working with our customers. So that’s our focus right now. Let’s get the product right, and then we’ll get into more of the details on kind of the sizing and the opportunity there. But we’re excited overall about the opportunity we have in front of us… CRM Bot will — that’s not an included product, so that will have a license that will most likely be licensed by the user. So that will be net new. But as Brent mentioned, we’re focused on getting the product right and we don’t have pricing for that or sizing for that yet.

Veeva Systems’ management thinks that AI will not disrupt the company and will instead be a positive

Given the breadth and the nature of our industry cloud software, data, and services, AI will not be a major disruptor for our markets, rather it will be complementary and somewhat positive for Veeva in a few ways. We will develop focused AI applications where the technology is a good fit, such as CRM Bot for Vault CRM. The broadening use of AI will make our proprietary data assets, such as Link and Compass, more valuable over time because the data can be used in new ways. We will also make it easy for customers to connect their own AI applications with their Veeva applications, creating even more value from the Veeva ecosystem…

Veeva Systems’ management thinks that core systems of record will be needed even in the world of AI

I like our position as it relates to AI because we’re a core system of record. So that’s something you’re always going to need. I think that’s one thing that people should always understand: core systems of record will be needed even in the world of AI. If I ask Brent, hey, Brent, do you think 10 years from now, you’ll need a financial system to manage your financials? He’s going to tell me, yes, I really need one, you can’t take it away. ChatGPT won’t do it for me, right? I’m making a joke there, but our customers have the same critical operational systems around drug safety, around clinical trials, around regulatory, around their quality processes. So those are always going to be needed.

Veeva Systems is focused on leveraging its proprietary data assets with AI to make them more valuable 

Now we are also building our data assets, and these are proprietary data assets — Link, Compass — and we’re building more data assets. Those will also not be affected by AI; rather, AI will be able to leverage those assets and make them more valuable. So I think we’ll do basically three things. We’ll develop more applications over time, CRM Bot being the first; we’ve got to get that right. Also, our proprietary data will get more valuable.

Veeva Systems’ management wants to make it easy for customers to connect their own AI applications with the company’s software products

And the third thing we’ll do is make our applications fit very well when customers have their own proprietary AI applications. So especially with the Vault platform, we’ll do a lot of work to make it fit really well with the other AI applications they have from other vendors or that they develop themselves, because it’s an open ecosystem, and that’s part of being Veeva.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, DocuSign, MongoDB, Okta, Salesforce, and Veeva Systems. Holdings are subject to change at any time.