Microsoft Corporation (NASDAQ:MSFT) Morgan Stanley Technology, Media & Telecom Conference March 7, 2023 5:05 PM ET
Company Participants
Scott Guthrie - Executive Vice President, Cloud + AI
Conference Call Participants
Keith Weiss - Morgan Stanley
Keith Weiss
Excellent. Thank you everyone for joining us. My name is Keith Weiss. I run the US Software Research effort here at Morgan Stanley. And I'm really pleased to have with us from Microsoft Scott Guthrie, Executive Vice President of Cloud + AI. I think there's a couple of other things behind that. You have a wide scope of responsibility, but really a super interesting conversation ahead given everything that's going on with Microsoft.
Before we get into that, something uninteresting: our disclosure statement. For important disclosures, please see the Morgan Stanley research disclosure website at www.morganstanley.com/researchdisclosures. If you have any questions, please reach out to your Morgan Stanley sales representative.
Question-and-Answer Session
Q - Keith Weiss
Excellent. So, thank you so much for joining us. I think it's a fascinating time to be talking to you given sort of everything that's going on within sort of the Microsoft ecosystem. And I think we should just start with what everyone is kind of super excited about.
From an investor standpoint, and especially for people who have been paying close attention, the pace of innovation and what we're seeing in terms of AI and generative AI capabilities coming into the solution portfolio has really impressed people, and impressed them to the upside of expectations.
Can you talk to us about how this came together? Because this is something that didn't sort of come into the forefront over the last month or two, this is something you guys have been putting together for years. Can you talk to us about sort of how you guys have built out the AI capability within Microsoft and now how we're seeing it expressed across the product divisions?
Scott Guthrie
Yes, I mean, I think it's an exciting time. We sort of call it the age of AI that we're entering, and it's probably going to be the most significant technology transformation in any of our lifetimes. We've all experienced lots of big ones. And it's going to play out over the next couple of years, so that's not a statement about the next quarter or two. But I do think this is very, very profound and really going to change how business works and how society works going forward.
And it's been kind of amazing on the technology side. This has been a bet that we've made going back many years now, in deep partnership with the OpenAI team -- and I know Sam is going to be here at the event later this week. It's been a great partnership where we made some mutual bets on building what we call the AI supercomputer, which is a service inside Azure that is really optimized around these very, very large language model trainings.
And we kind of did jointly a whole bunch of architecture work kind of designing how they were going to build the models, how we were going to build the infrastructure, and really built something pretty special that allows these large language models to be trained very fast and iteratively.
And then kudos to the OpenAI team. They really pioneered a tremendous amount of kind of new ways of thinking about building these models. And the combination I think has really been magic the last six months. And I think the road ahead is going to be pretty exciting.
As we start to move from training these models, providing these models, to really embedding this now into every single app and experience. And even at Microsoft you've seen even this week yesterday we announced our Dynamics 365 Copilot and our Power Platform Copilot. We shipped our GitHub Copilot last year. And you're going to see us kind of infuse this AI deeply throughout all of our applications. And it's I think going to be great for customers and really the next foundation of computing.
Keith Weiss
Great. So, if we think about it kind of structurally within Microsoft it's not just the OpenAI partnership. You guys have a lot of your own kind of AI research that you do in-house. You acquire some interesting technology with Nuance and sort of their DAX platform.
From what I understand there's a centralized kind of AI kind of core functionality. And then it's up to the product teams to figure out sort of how to expose that through their own solutions. Is that the correct way to think about it?
Scott Guthrie
A little bit yes. I mean there's a core kind of we call AI platform that we're building. And it's the same platform we offer to our external customers and partners. And so the nice thing is what Office 365 or Nuance or Dynamics or GitHub are using is the same platform infrastructure and the same capabilities that any external partner or customer can leverage as well. And we kind of believe that first-party and third-party symmetry is important. And so there's a lot that we share.
And part of the opportunity with these large language models is the ability to kind of have them know a lot of stuff about a lot of things and being able to be used in lots of different domains.
And then what we've built with our Azure OpenAI Service, as an example, gives organizations or our internal teams the ability to provide fine-tunings on the model, specific to a use case, to make it even better. And that promise of being able to leverage the large language models, which are trained on the public web, and then the ability for, say, Morgan Stanley or another customer to take proprietary data and tune it even further -- and know that that model is only going to be used by you, not by us, it's not going to benefit anyone else, and you're going to control access and the business model around it -- is what I think enterprise customers in particular are looking for, and what all the SaaS ISVs and start-ups that are going to serve those customers are going to need. I don't want to claim it's perfect, but I think we've got something special there. And the fact that we're able to use it now for our first-party products and we're able to offer it to all of our customers and partners, I think, speaks to the opportunity in the years ahead.
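The isolation model Guthrie describes -- a shared base model plus per-customer fine-tunes that never feed back into the foundation -- can be sketched in a toy form. This is an illustrative sketch only, not the Azure OpenAI API; all class and field names here are invented for illustration.

```python
class BaseModel:
    """Shared foundation model: identical for every tenant."""
    def __init__(self):
        self.weights = {"general": 1.0}   # stands in for pretrained weights

    def predict(self, prompt, overlay=None):
        params = dict(self.weights)
        if overlay:                        # tenant-specific deltas applied at inference time
            params.update(overlay)
        return f"answer({prompt}, params={sorted(params)})"


class TenantFineTune:
    """Per-customer fine-tune: owns its own deltas, reads the base, never writes it."""
    def __init__(self, base):
        self.base = base
        self.delta = {}

    def tune(self, proprietary_data):
        # Learning lands only in this tenant's delta, not in the shared base.
        for record in proprietary_data:
            self.delta[f"domain:{record}"] = 1.0

    def predict(self, prompt):
        return self.base.predict(prompt, overlay=self.delta)


base = BaseModel()
tenant_a = TenantFineTune(base)
tenant_a.tune(["wealth-mgmt"])
tenant_b = TenantFineTune(base)

# The tuned tenant sees its domain knowledge; the base and other tenants do not.
assert "domain:wealth-mgmt" in tenant_a.predict("q")
assert "domain:wealth-mgmt" not in tenant_b.predict("q")
assert base.weights == {"general": 1.0}
```

The design choice being illustrated is the "your data is your data" promise: tuning mutates only the tenant's overlay, so nothing proprietary ever reaches the shared weights.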
Keith Weiss
Right. And that's important from both sides of the equation. If you're talking about large enterprises, particularly in regulated industries, you bring up both the security and data residency side of the equation, but also the AI, if you will, gets smarter about your particular business process. It's meant to solve -- or answer the questions specific to your business.
Scott Guthrie
Yes. I mean, AI learns with data. And so one of the things you need to think about with AI is, as you provide data to that model and it learns, who owns that model, who owns that data? And that's why trust is so important, I think, in this AI age. And again, our promise is: your data is your data. It's not our data. You get to monetize it. You get to control it. And the foundation models won't learn from your data -- your instance learns, but not the models that are shared by others. I think that promise, and frankly the trust that we've earned over the years at Microsoft -- and you earn trust in drips and you can lose it in buckets, so it's really important to focus on that trust -- puts us in a good position, where I think people look at us versus maybe some of the other big tech companies and say, yes, I feel like I can trust Microsoft. And that hopefully makes us very good stewards in the years ahead, where people look at us as the trusted partner that's going to help them fully leverage this AI to its maximum potential.
Keith Weiss
Right. Got it. One of the reasons I was so excited to talk to you at this point in time, I think there's some really kind of foundational questions that investors are asking about the underlying technology. And one of the biggest ones is kind of the -- how to think about competition, how to think about what's going to make one model better than the other. And so, let me ask you the question like, how should we think about judging whether a GPT model from OpenAI is better than what Google's bringing to the equation or what Amazon is bringing to the equation? What are going to be the parameters of that competition?
Scott Guthrie
I think there are going to be two aspects. One is going to be the raw technical capability of the model. And so, obviously, we're going to be very focused on making sure that the base model from the get-go is super competitive. And the whole world probably didn't know -- or only a small part of the world knew -- what large language models were a year ago. And I think with ChatGPT and some of the things we've done with Bing and GitHub Copilot, suddenly the world's woken up to, wow, this is pretty amazing stuff. And so we're going to continue to see large language model innovations in the years ahead.
But I think the other differentiator, to your point, is also going to be the signal of use cases of people actually using it. If you look at GitHub Copilot, which was the first really widely used large language model service in the world, and you look at the accuracy of the code that was generated last July versus what it is today, it's dramatically better today. And that's because, as people are using it, the model is getting better, the accuracy is getting better. And so sometimes being first to market and that signal improvement can really start to differentiate these models beyond even the base capabilities that are in them. And I think that's partly why you're seeing us move as quickly as we are, whether it's GitHub, whether it's Microsoft 365, whether it's Dynamics, whether it's Power Platform, Nuance, et cetera. There's literally nothing in our portfolio that we're not very aggressively looking to leverage AI in, partly because we also want to get that signal going. And when you have hundreds of millions of commercial customers using your products, that's a lot of good signal that's going to improve them. And that, I think, is going to further differentiate our models versus others in the market, hopefully.
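The "signal" loop Guthrie describes -- usage telemetry feeding back into model quality -- can be sketched as an acceptance-rate tracker that decides which model variant is performing better. The class, variant names, and rates below are all hypothetical, chosen purely to illustrate the feedback mechanism.

```python
from collections import defaultdict

class SignalTracker:
    """Aggregate whether users accept a model's suggestions, per variant."""
    def __init__(self):
        self.stats = defaultdict(lambda: {"shown": 0, "accepted": 0})

    def record(self, variant, accepted):
        s = self.stats[variant]
        s["shown"] += 1
        s["accepted"] += int(accepted)

    def acceptance_rate(self, variant):
        s = self.stats[variant]
        return s["accepted"] / s["shown"] if s["shown"] else 0.0

    def best_variant(self):
        # Serve the variant with the highest observed acceptance rate.
        return max(self.stats, key=self.acceptance_rate)


tracker = SignalTracker()
for accepted in [True, True, False, True]:      # newer model: 3 of 4 suggestions accepted
    tracker.record("model-v2", accepted)
for accepted in [True, False, False, False]:    # older model: 1 of 4 accepted
    tracker.record("model-v1", accepted)

assert tracker.acceptance_rate("model-v2") == 0.75
assert tracker.best_variant() == "model-v2"
```

At scale, this kind of aggregate signal -- rather than the raw content of any user's data -- is what lets an early, widely used service pull ahead of a technically similar late entrant.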
Keith Weiss
Got it. That's a good segue to the conversation on Bing and the new Edge browser. On one side of the equation, I think the Bing announcement and the capabilities impressed a lot of people, and really it was a great marketing event for Microsoft, showing the entire world, if you will, that these capabilities exist and they exist within Microsoft. There was really quickly some blowback about, well, some of the answers weren't right, right? Sydney emerged over the weekend and it freaked some people out. But it sounds like that's part of the learning process -- part of making these models better as they get that usage up.
Scott Guthrie
Yes. I think, the day we announced the new Bing one of the things that we were very clear on was, this is going to be an evolution and we're going to learn and evolve. And we won't get everything right and we're going to keep improving and we're going to do it really fast. And that's been the model that the -- or the approach the team has taken on the Bing side. And I think they've done a good job of reacting fast.
And in some cases people are doing 200 prompts to try to cause the model to say something strange. But credit to the team, they're reacting fast. And we take it very seriously in terms of making sure AI is responsible, it's safe, that's kind of core to our DNA. And that's partly why it's not available to everyone yet. It's -- we start with a cohort of users. We learn. We improve. And we're going to make sure that we deliver this technology in a really safe responsible way.
Keith Weiss
Got it. In terms of the monetization avenues on a go-forward basis: I think one of the competitive advantages Microsoft seems to have is all these avenues you can productize this through -- Bing being one of them, but, like you're saying, GitHub and the developer platform being another, Teams Premium another. The one we haven't heard from yet is Office and the overall productivity suite, but I guess that's to come at the March 16 event.
But is it fair to say it's a similar perspective? Because one of the interesting things about all of these innovations you've been putting out is that they all have a price point behind them. This isn't just innovation folded into the overall software moving forward -- Teams Premium is a SKU. Should the expectation be that that's going to be the route forward across the entire portfolio?
Scott Guthrie
I think you're going to see -- I mean, I guess the way I look at it would be: what is the productivity win you're giving to the business, whether it's around making an employee more productive or making a specific business process more effective?
Take the example of, say, GitHub Copilot, since that's a product that's GA today. We're now seeing that developers using GitHub Copilot are 55% more productive on tasks with it, and that's measured by independent studies. And 40% of the code they're checking in is now AI-generated and unmodified. And so if I talk to a CIO or a CTO and say, if I can give you 55% more developers overnight, would you take that?
They're all looking for talent. They're all constrained in terms of the number of engineers they can hire. They're gladly going to pay for that. And we have a good price point for that I think, that is sort of a no-brainer for them to pay. And it's a great service and a great business.
And I think, there's going to be lots of opportunities here. When you look at AI it's going to be additive. We're sort of adding new scenarios and taking cost out, enabling organizations to move faster, enabling them to be more productive. And so, I think, from a margin perspective, or from a revenue perspective to your price point, yes, I think, these things are going to be additive to our overall business.
And in a lot of cases, some of these use cases cost a lot of money for an organization. Think of Nuance with health care physicians and physician visits. Physicians only see so many patients a day. You can help them see significantly more patients and have a better experience. That is worth a lot.
Similarly, say call centers and customer support. If you can deflect a case without even having to have a human answer it, you have happier customers and, again, you've taken a lot of cost out of the system. And so, I think, for each of these things there will be different ways we monetize.
But in a world where we're delivering massive productivity wins, the good news is customers want to pay for it, because it ultimately takes their overall cost down. And I think there will be different opportunities for us to add additional value going forward.
Keith Weiss
Got it. Outstanding. So today we're seeing a lot of this functionality be exposed through kind of the existing applications. On a go-forward basis, do you think this changes the paradigm of how people build applications? And could it potentially shift that pendulum now that we see between sort of buy-versus-build applications more towards the build side of the equation?
Scott Guthrie
Yes. I think there's going to be -- I think in the short run, I think the fastest way to get some of this AI value is going to be through finished apps, again, like the Copilot experiences that we're launching, because people are already trained on the apps and it just augments the apps integrates with it, lets them move faster. And so, I think there's a huge opportunity there.
And then, I think, we're also going to see then the next generation of apps that are going to be built on the raw APIs and the services around it that are going to re-envision pretty much every experience that we see.
I've told the story a few times about e-commerce. We all kind of take it for granted: you go in a web browser to an e-commerce site, there are categories on the left, you click, there are products, you click, you read the reviews, the price point, is it in stock, you add it to your cart, you check out. That was codified 30 years ago and it hasn't changed dramatically.
I think we're going to be in a place probably two holiday seasons from now where instead of browsing I'm probably going to have a text box and I'm going to say, I want to buy a gift for a family member. Here's the price point. I want it delivered by December 19. This is what they like. And it's going to find the products for me. And that's going to be a very different user experience. It's going to be a very different question-and-answer experience. I'm going to be able to ask you questions about the product versus scroll through hundreds of comments and reviews.
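The "text box instead of browsing" flow Guthrie sketches amounts to turning a natural-language gift request into structured constraints over a catalog. In the toy below, the request arrives already structured; in practice a large language model would do that parsing step. The catalog, field names, and prices are all invented for illustration.

```python
from datetime import date

catalog = [
    {"name": "chess set", "price": 45,  "tags": {"games"},   "ships_by": date(2023, 12, 15)},
    {"name": "drone",     "price": 220, "tags": {"gadgets"}, "ships_by": date(2023, 12, 22)},
    {"name": "puzzle",    "price": 25,  "tags": {"games"},   "ships_by": date(2023, 12, 18)},
]

def find_gifts(catalog, max_price, deliver_by, interests):
    """Filter the catalog down to items matching price, delivery date, and interests."""
    return [p["name"] for p in catalog
            if p["price"] <= max_price
            and p["ships_by"] <= deliver_by
            and p["tags"] & interests]

# "I want a gift under $50, delivered by December 19, for someone who likes games."
picks = find_gifts(catalog, max_price=50, deliver_by=date(2023, 12, 19), interests={"games"})
assert picks == ["chess set", "puzzle"]
```

The categories-and-clicks UI collapses into one query; the follow-up question-and-answer about a product would run the same way, with the model re-querying the catalog instead of the shopper scrolling reviews.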
And I think every organization needs to start thinking about: okay, how do I reinvent how I do retail, wealth management, manufacturing, routing, customer support? Some of that is going to be built in-house -- I think big brands are going to need to have more control -- and a lot of it is going to be buying components and composing them together.
And part of what we're trying to do with the Microsoft Cloud is both. Being able to point to the fact that, in how we built GitHub Copilot or how we're building Teams Premium or Dynamics, you can use the exact same APIs that we are, gives us an opportunity to talk credibly to other software vendors and other enterprises about how they can do the same thing we are. And I think there's a big opportunity.
Keith Weiss
Okay. One of the presumptions I made in an earlier question was that ChatGPT, or the Bing announcement more specifically, was a great marketing event for Microsoft. Is that correct? Has that spurred more customer conversations for you guys? And maybe more broadly, where are we in terms of the customer conversation around these generative AI models, or AI more broadly? How far into this opportunity do you think we are?
Scott Guthrie
I think we're still in early innings. I mean, the thing that's been great about ChatGPT, and then also Bing, is the fact that end users can use them. The number of people I've talked to who maybe haven't used all of the new products from all the tech companies, but seem to have tried those two and said, hey, we're using it now -- my children are using it for homework, which you're not supposed to do -- or we're using it in a variety of use cases; I keep hearing more and more interesting ones. I think it has actually made real what has been a very technical concept: large foundational AI models, transformer-based learning.
You know, most people in the world didn't know what that even meant 12 months ago, and yet hundreds of millions of people have heard of ChatGPT and Bing now and tried them. And so I think that is making it much more real, which gives us an opportunity to then say, hey, let's show you how you can use this in customer support. Let's show you how you can use it for developer productivity. Let's show you how you can use it for sales productivity. It's a good conversation starter. And again, I think people are looking for solutions that integrate with the workflows they already have and help them accelerate even more.
Keith Weiss
Got it. One of the things that investors are struggling with a little bit here is it seems like just a massive opportunity ahead of Microsoft. It's something that Satya has talked about is I mean this is what's expanding sort of IT as a percentage of GDP from 5% to 10% over the next 10 years in kind of his way that he lays out that market opportunity.
But in the near term, we're talking about cloud optimization. In the near term, we're seeing Azure growth decelerate. Can you give us some perspective on what those cloud optimizations mean? What are customers doing? Are they changing their views on how they want to use cloud fundamentally, or is this more of a short-term tactical impact that's just about getting in line with budgets?
Scott Guthrie
Well I think cloud optimization has been sort of a core part of the cloud journey for 10-plus years now.
Keith Weiss
Okay.
Scott Guthrie
So I don't think it's necessarily new per se. I mean in general the typical pattern we've seen going back many years is you either migrate a workload to the cloud or you build a new workload in the cloud. You then optimize it and then you reinvest the savings and you rinse and repeat with the next workload or the next use case.
And we like that optimization process. We have dedicated teams at Microsoft that help our customers with it, because we know it earns loyalty and it earns confidence to move more. And at the end of the day, if we can help our clients and customers and partners get more out of their investments, we know they'll spend more with us and they'll invest more in digital technology, which increases the overall size of the pie.
So in general we like that. And the types of optimizations people do: sometimes when they're first moving they have a large test environment, and once they're doing more in production they go, okay, I can shrink that test environment a little bit. Or they take advantage of things like reserved instances or the new cost savings plans that we have in Azure -- commit to a longer term and reduce per-unit cost a little bit on an ongoing basis.
And then there are times where they right-size VMs or right-size databases. And so that's all natural. There's only so much optimization you can do until you're done. And so while people are optimizing, it's not like they're going to optimize it down to zero. At some point you get done, and then you go on to the next workload.
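The optimization levers mentioned here compose into a finite, one-time step down in spend, which is the point Guthrie is making. The discount rates below are illustrative placeholders, not Azure's actual reserved-instance or savings-plan pricing.

```python
def optimized_monthly_cost(on_demand_cost, reserved_discount=0.30, rightsizing_cut=0.15):
    """Apply a commitment discount, then shrink over-provisioned capacity.

    Both rates are hypothetical; the structure (multiplicative, bounded)
    is what matters: the result can never reach zero.
    """
    return on_demand_cost * (1 - reserved_discount) * (1 - rightsizing_cut)

cost = optimized_monthly_cost(100_000)   # $100K/month on-demand baseline
assert cost == 59_500.0                  # a finite step down, not a path to zero
```

Once a workload has taken both cuts, there is nothing left to optimize on it -- the spend floor holds until the next workload migrates and the cycle repeats.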
What's happened in the last six months or nine months has been as the macro situation has been uncertain, people have been I'd say optimizing even more and they are sometimes holding on to those savings a little bit longer before they reinvest it. But I haven't heard really from any customer a long-term change in terms of cloud. And there's always more workloads. There's always more use cases. And as we've been talking about with AI, if you're not constantly reinventing yourself with digital technology, you're going to be under severe competitive pressure. And so I don't so much worry about the long-term. But it does lead obviously to shorter term questions in terms of that optimization journey and what exactly the impact is. But again longer term I'm not hearing any changes.
Keith Weiss
Right. And that optimization can only go on for so long. Is that what Satya is talking about when he says he thinks this optimization activity lasts for a year but is unlikely to go significantly beyond that?
Scott Guthrie
Well, I think we're trying to be a little careful in terms of giving guidance on a specific quarter or annualized basis, and I'm not trying to make that call, because I think the reality is we'll see. But I think at some point, when you optimize a specific workload, you can't optimize it anymore. And so there is a finite amount on a workload basis. And that's why, again, we ultimately feel very good about the long term for cloud and don't see any strategic shifts that our clients are making. It's more a case of it's going to have, at times, a dampening effect in the shorter term. But again, the more they optimize, the more value they get, and the more they generally, in the long term, want to invest with us. And I look at AI and other new use cases that are coming out and hear a lot of excitement, people saying, yeah, I've got to be doing more of this in the mix. And so that reinvestment of savings, for me, is not in doubt. Obviously, we're all wondering exactly the when on a quarter-by-quarter basis. But again, I have confidence in the long term.
Keith Weiss
Got it, got it. I want to ask a question about Azure gross margins, but it's a roundabout question. When you guys did the Bing announcement, Satya pretty directly went after the gross margins of the key competitor there, Google, saying that this is going to be a lower gross margin business and we're willing to spend on it. And it didn't take long for investors to say, well, if it's lower gross margins for Search, is this going to be lower gross margins for Azure and the other cloud businesses as they roll more of this AI functionality beneath those platforms? So is that the case? Because this is more computationally intensive, because you have to bring in GPUs to do the training, is this going to be a compressive impact on overall cloud gross margins for Microsoft?
Scott Guthrie
I think, overall, the thing I would point to is the fact that these are new workloads, and the fact that it really opens up more top-line revenue. Again, for a lot of these use cases -- take a developer: if you can make that developer 55% more productive, I've got to believe there's a lot of gross margin in there, because that ultimately translates into real opportunity to transform how an organization gets productivity out of its employees. And so I do think we're going to see, depending on the use case for AI, different ways that we'll monetize and different margin structures.
But ultimately I look at it as: if we can keep growing productivity dramatically for businesses, that is definitely a long-term opportunity from a TAM and revenue perspective, and I think it's going to allow us to maintain some good margins as we do it. And obviously that will continue to evolve in the quarters and years ahead as people take maximum advantage of it.
I think the other thing on AI that we're thinking about is that AI has gravity, meaning you can't go faster than the speed of light. And so if you've got an application or a database and you've got an AI model, the further apart they are, the slower the network path and the calls between them. And for a lot of these use cases -- take something like GitHub Copilot -- it's not like we're making one AI inference; we're doing it on every keystroke. And the further apart that is, the slower the experience.
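The "AI has gravity" point can be put in rough numbers: light in fiber covers about 200 km per millisecond, so distance adds round-trip latency on every call, and per-keystroke inference multiplies it. The distances and keystroke count below are illustrative assumptions, not measurements of any real deployment.

```python
FIBER_KM_PER_MS = 200   # light in optical fiber travels roughly 200 km per millisecond

def round_trip_ms(distance_km):
    """Minimum network round-trip time imposed by distance alone."""
    return 2 * distance_km / FIBER_KM_PER_MS

def added_latency_per_line_ms(distance_km, keystrokes=40):
    """Cumulative network time for one inference per keystroke (copilot-style)."""
    return round_trip_ms(distance_km) * keystrokes

same_region = added_latency_per_line_ms(50)       # app in the same region as the model
cross_ocean = added_latency_per_line_ms(8_000)    # app an ocean away from the model

assert same_region == 20.0      # 20 ms of pure network time per line typed
assert cross_ocean == 3200.0    # 3.2 s per line -- unusable for per-keystroke AI
```

This physics floor, before any server-side inference time, is why co-locating apps and databases next to the models pulls the rest of the workload toward the same cloud.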
And I do think we're also going to see as we look at AI opportunities with Azure specifically is there's both the direct Azure AI model opportunity. But there's also the fact that people are going to increasingly want to move their apps and their databases into Azure to be close to those large models. And that's also going to be an opportunity for us to not only sell more AI, but also to sell more VMs, more storage, more databases, more everything. And I think we also see that as a real opportunity both with customers that we have today, but also there's a lot of customers that we don't have today.
This particular ZIP code, for example, has not been our strongest, because it's very much open source-based developers, which has not been Azure and Microsoft's historical strength. But this is creating a conversation where people are like, look, we want to take advantage of these large language models, we want to talk about how we could use them. And we get to first show our value with the models, and then show the capabilities of Azure.
But I do think this is going to be a great door opener for a lot of customers that haven't really given us a hard look yet historically, because they don't -- they already had a cloud. And the fact that the OpenAI models run exclusively on Azure, I think ultimately is going to be a big differentiator for us.
Keith Weiss
Got it. Got it. So both through sort of pricing and volume you can make up for any kind of higher compute intensity needed in this type of process.
Scott Guthrie
And we're continuing to just sort of really optimize these models. I think a lot of people were surprised last week when OpenAI lowered their prices by a factor of 10. That's partly because they found a way to lower the cost of inferencing correspondingly.
And they said, hey, we can get a lot more revenue and open up a lot more opportunities when it's more cost-effective. And that cost-optimized model didn't exist 30 days ago. So we're still in early innings, there's still a lot of optimization and learning that we're doing, and it's going to be fairly dynamic. But I think it's going to be exciting, because, just looking at some of the ChatGPT use cases that people have posted about, or the Bing use cases people have posted about, they're already using it in fascinating ways that none of us, I think, would have thought of a year ago. And yeah, I think we're going to see far more use cases over the next year.
Keith Weiss
Right. That optimization activity you were talking about, that's bringing down inference costs, I assume is really just getting started. But it's a motion, a muscle memory, that Microsoft has. I mean, you guys have been driving up cloud gross margins and Azure gross margins pretty materially over the past couple of years.
One of the questions I get from investors is: does the fact that you guys don't have an AI accelerator design of your own inhibit how far you can go on that optimization curve? Google has their own TPUs, and Amazon has their own accelerator designs. But Microsoft doesn't. Is that any significant inhibitor in terms of how efficient you can get?
Scott Guthrie
Well, in general, we're looking at how we optimize everything -- whether it's the silicon and GPUs, whether it's the network interconnects, whether it's data center designs, whether it's server hardware, whether it's fiber. Some of those things we're doing organically, and some through the acquisitions we've done in the last six months: hollow core fiber through the company called Lumenisity, and Fungible, which does storage and I/O optimization with DPUs.
And so these are very specific scenarios that maybe five years ago would not have made any sense for us to be investing in. Now at the scale that we're operating on it makes a lot of sense to invest in. And you're going to continue to see us innovate both organically and inorganically to kind of optimize every layer of the stack.
And part of the reason why not just OpenAI, but if you look at a lot of the other large language model start-ups out there are using Azure is because, we have some pretty differentiated hardware with our AI supercomputer. And that again includes silicon hardware network, power data center and a whole bunch of other design elements. And you're going to continue to see us innovate in that.
So we're going to be looking for opportunities every layer of the stack to do optimizations. And I think the partnership we have with OpenAI and the fact that we're building all of our own apps, deeply taking advantage of these models is also giving us real signal on what optimization really is going to matter. And how do we continue to be on the bleeding edge of optimizations.
And I do think that signal for us has helped a lot in terms of what we've built. We wouldn't have built it the way we built it, without the partnership we had and without some of these early applications. And I think that ultimately hopefully is going to help again both with the platform layer and at the app layer give us some good differentiation in the years ahead.
Keith Weiss
Outstanding. Unfortunately, that takes us to the end of our allotted time. I could have this conversation all day long, but thank you so much, Scott, for joining us. This was a very [indiscernible] conversation.
Scott Guthrie
Thanks for having me Keith. Thanks everyone.