Snowflake Inc. (NYSE:SNOW) Morgan Stanley Technology, Media & Telecom Conference March 7, 2023 3:15 PM ET
Company Participants
Mike Scarpelli - Chief Financial Officer
Christian Kleinerman - Senior Vice President, Product
Conference Call Participants
Keith Weiss - Morgan Stanley
Keith Weiss
All right. Thank you everyone for joining us. My name is Keith Weiss. I run the US software research group here at Morgan Stanley. And very pleased to have with us Mike Scarpelli, CFO; and Christian Kleinerman, SVP of Product from Snowflake. So thank you gentlemen for joining us.
Christian Kleinerman
Thanks for having us.
Mike Scarpelli
Thank you.
Keith Weiss
Before we get started, a brief disclosure. For important disclosures, please see the Morgan Stanley research disclosure website at www.morganstanley.com/researchdisclosures. If you have any questions, please reach out to your Morgan Stanley sales representative. Excellent.
Question-and-Answer Session
Q - Keith Weiss
So, with that out of the way, I actually want to start, Christian, by talking about the market opportunity ahead of Snowflake. I think one of the most impressive parts of the story is how that opportunity has evolved over the past couple of years. I remember at the IPO we were talking about roughly an $80 billion, $81 billion market opportunity. But you guys have expanded into the adjacencies around your core business, and now we're talking about $248 billion in market opportunity. Can you walk us through the steps of how we got there and how we expanded out that opportunity?
Christian Kleinerman
Yeah. So the early days of Snowflake were all about helping organizations break down silos and consolidate their data. If you look at pretty much every large organization, they have a little bit of Vertica and Netezza, all these different database technologies, and it's hard for them to think across them.
So our thesis was, let's help organizations combine the data and be able to think across the business. And then what we saw is, even when customers have been able to consolidate data, they keep finding reasons to start to copy bits and pieces of their own data into different systems. I have an application that does some AI, so I copy the data. I have an application that does some graph processing, I copy the data. And our whole thesis is, instead of re-siloing the data, how do we help customers bring those applications, that business logic, into Snowflake? That's when you hear us talking about Snowflake as an application platform, which dramatically changes the scope of what we do. And alongside bringing this type of business logic onto Snowflake, we're also very focused on helping organizations collaborate with data. That's where our data sharing technology fits in, that's where our data clean room technology fits in. And the intersection of all of this is why the opportunity for us keeps getting larger and larger.
Keith Weiss
Got it, got it. The data sharing element is probably the one part of the Snowflake story I think people still underappreciate. The way I think about it, in the old data warehousing technologies, pricing was based on capacity: how big is your data warehouse? In the Snowflake model, roughly 90% is compute; it's how many questions you are asking of the data. And in every company that I talk to, one of the primary reasons for moving to a cloud-based data warehouse is to enable more sharing and enable more people to ask questions of that data. So I think there's an inherent expansion of the market opportunity that comes just from moving to the cloud and getting that data sharing. And you guys track that via edges. Can you explain to us what edges are, and how you see that developing within the base?
Christian Kleinerman
Yeah. So we think of enabling sharing of data both within an organization and across organizations. And both are really important and meaningful opportunities for us. The way we think of sharing relationships is what we call an edge, which is the connection of two organizations, or two parts of an organization, where there is activity, one querying data from the other. And that's what we call an active edge.
So if I share data with you, Keith, we have an edge. But the number that you hear, or the metric that you hear us talk about, is what we call stable edges, which are edges that have a minimum threshold of activity over a minimum amount of time, which tends to suggest that this is not a one-off exchange but a persistent, ongoing relationship.
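(For illustration only: a minimal sketch of how a stable-edge style metric could be derived from sharing activity. The event data, account names and thresholds below are hypothetical; Snowflake's actual definition is simply the minimum-activity-over-minimum-time rule Christian describes, not these specific numbers.)

```python
from collections import defaultdict
from datetime import date

# Hypothetical sharing-activity events: (consumer_account, provider_account, query_date)
events = [
    ("acme_corp", "weather_data_inc", date(2023, 1, 3)),
    ("acme_corp", "weather_data_inc", date(2023, 1, 20)),
    ("acme_corp", "weather_data_inc", date(2023, 2, 14)),
    ("acme_corp", "ticker_feed_llc", date(2023, 2, 1)),
]

# Illustrative thresholds: call an edge "stable" if the consumer queried the
# provider's shared data on at least MIN_DAYS distinct days spanning MIN_SPAN_DAYS.
MIN_DAYS = 3
MIN_SPAN_DAYS = 21

days_by_edge = defaultdict(set)
for consumer, provider, day in events:
    days_by_edge[(consumer, provider)].add(day)

stable_edges = [
    edge
    for edge, days in days_by_edge.items()
    if len(days) >= MIN_DAYS and (max(days) - min(days)).days >= MIN_SPAN_DAYS
]

print(stable_edges)  # [('acme_corp', 'weather_data_inc')]
```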
And Mike if you want to add anything?
Mike Scarpelli
Yeah. That data sharing really creates stickiness. We're actually seeing RFPs out there where some of our customers are asking their vendors, are you a Snowflake customer? Because they want to do data sharing. So beyond just stickiness, it's driving new customer adoption of Snowflake, because people are insisting on doing data sharing through Snowflake. And you really see that happening in the financial services industry, which, by the way, shouldn't surprise you, because the financial services industry has been sharing data for years and years and years.
Unfortunately, the data they've been sharing has gone through FTP downloads, which is such an old technology, or PDFs, and we can avoid all of that. The whole concept of a bank statement going to a company becomes irrelevant. With data sharing, you don't have to actually transfer any data, and you can just run your reconciliations directly against it in Snowflake. Again, we're working on things like that for our own use cases internally, and it's a much, much more efficient way of doing things. And more importantly, because the data isn't getting transferred, it's secure and governed; you know exactly who's accessing it.
Keith Weiss
Right. And just to continue down this thread a little bit. You talked about it in terms of creating stickiness. From an investor perspective, I think one of the, sort of, holy grails we're always looking for in investments is where there are really defensible moats around companies, because technology and software evolve so quickly that it's hard to get a technology moat that remains durable. But the ecosystems that people create around certain technologies, data sharing being one of them, could potentially be that defensible moat.
You talked about financial services. Can we segue a little bit into, sort of, the industry focus? Because I'm sure one of the kernels of why you have this industry focus is to try to create these ecosystems, financial services being one. Can you walk us through some of the other verticals where you think you could develop these types of ecosystems?
Mike Scarpelli
Well, it's happening in the media streaming area, with advertisers and media companies. And data clean rooms are another form of data sharing; especially with all the privacy concerns today, that's definitely a key one.
In healthcare, there are all kinds of opportunities, both on the payer side as well as in pharma with the development of new drugs. There's a lot of data sharing that happens between companies there. Many times the pharma companies use third parties to do part of the work, and that's an important piece as well.
You can pretty much apply data sharing to any industry. And it's funny when you talk to people. I was actually talking to someone the other day who's the CIO of a bank, and I was talking about data sharing. And he's like, well, we don't really do any data sharing. And I'm like, okay. That's what most people say. And then when you dig into it, oh, yes, we send these reports to Fidelity, we get these things. So you are doing data sharing. You're just doing it in an inefficient way.
Christian Kleinerman
Yes. On the industries we've even heard state governments…
Mike Scarpelli
Yes.
Christian Kleinerman
Interested in this, yes. You can imagine how many agencies there are, and they all would like to, or would benefit from, collaborating. So I think it permeates every industry.
Keith Weiss
Right. And it all comes back to asking more questions of the data and utilizing the data more fully. If we go one step further and talk about the concept of the Data Cloud that you guys talk about, and it now becoming a platform for application development, that's probably an even bigger expansion of the market opportunity in terms of app dev. Why is Snowflake the platform for doing this application development? And can you talk to us about some of the capabilities you've brought on board, like the native application framework and the Streamlit acquisition, that enable that application side of the Data Cloud to really come to fruition?
Christian Kleinerman
Yes. So the core thesis for us in this topic of collaboration is that an organization that leverages second-party data, third-party data, and second-party and third-party services will do better. At this point there are many studies showing that you will outperform your peers if you figure out how to not only leverage your own data, but enrich it and put it in context. That's the concept of the Data Cloud for us, and that is what is unique about Snowflake.
Technology, yes, we can deliver technology, and we're very proud of the technology we have. But when a customer buys into Snowflake, they buy into this Data Cloud. And the Data Cloud is where we have this whole ecosystem of players; data providers are one form of partnership. But more interestingly, there's a lot of interesting IP, interesting business logic, that organizations are creating.
And what we're doing with this concept of an application platform and native apps in Snowflake is, can you package that logic? Can you make it available to other customers? So now, when a customer buys into Snowflake, they're buying into this ecosystem. And we've seen customers that had passed on Snowflake.
Like, I'm good, I'm not interested. Then they see some of the applications that are coming onto Snowflake, that they can do this data sharing, that they can repurpose a team of 30 people who were doing pipelines and ingestion and encryption and decryption, because all of that goes away. That is the appeal, and that's how we think of the Data Cloud as unique to Snowflake.
Mike Scarpelli
No, I agree.
Keith Weiss
And from a monetization standpoint, it all comes back to more questions being asked of the data. That's one of the really interesting things about Snowflake: it's such a straightforward pricing model, such a straightforward monetization model.
Mike Scarpelli
It's actually a beautiful model in that we really have one product, with three different flavors depending on which edition you want. With every new feature, our salespeople don't need to go in and get a PO out of a customer. They just need to go in and educate the customer so the customer can consume more, and then the follow-on capacity purchase orders follow. It's a very simple model and I love it. And for the customer, because of the way that we price, we sell credits. A credit is a unit of measurement for the amount of compute you use, and we charge you by the terabyte of storage you have. And the beautiful thing is that every software improvement, every hardware improvement that improves price performance means you can do more with that same credit every year.
And so we become cheaper for our customers every year. That's good, because the better the price performance, the more workloads they move to us. The better the performance, the speed at which we run, the more workloads can come onto us that we otherwise weren't fast enough for. So our whole product road map is focused on more features that are going to drive consumption, but also on improving that price performance, the speed at which we operate.
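(To make the credit arithmetic concrete, a toy sketch of the consumption model Mike describes: the bill is compute credits consumed plus storage, so a price/performance gain that lets the same workload finish with fewer credits lowers the customer's cost. All prices and figures below are invented for illustration.)

```python
# Toy illustration of a consumption bill (all rates and figures are hypothetical).
CREDIT_PRICE = 3.00            # dollars per compute credit
STORAGE_PRICE_PER_TB = 23.00   # dollars per terabyte per month

def monthly_bill(credits_consumed: float, storage_tb: float) -> float:
    return credits_consumed * CREDIT_PRICE + storage_tb * STORAGE_PRICE_PER_TB

# The same workload before and after a 20% price/performance improvement:
# identical queries now finish using 20% fewer credits.
credits_before = 10_000
credits_after = credits_before * 0.80

print(monthly_bill(credits_before, storage_tb=50))  # 31150.0
print(monthly_bill(credits_after, storage_tb=50))   # 25150.0
```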
Christian Kleinerman
And we have the data. One thing is to say it; the other is that we can show it: the amount of compute credits consumed per query, per question asked, keeps going steadily down. We've publicly shared that over the last three years the economics of Snowflake as a platform have improved roughly 20%. And our customers see that; it's not only faster answers, but better economics.
Mike Scarpelli
And you can see that too. I think we're right at about three billion queries a day running through Snowflake?
Christian Kleinerman
We crossed it.
Mike Scarpelli
We crossed 3 billion. I know as of last week we were 8 million queries short of 3 billion a day. But you can see how the number of queries has grown in Snowflake while the revenue doesn't grow as much. Why? Because of the price/performance improvements to customers.
Keith Weiss
Right. That improving price/performance, we talked about the $248 billion TAM, but there's the addressability of that TAM, and you need to have the right price/performance to address it. As you improve that, more and more of that potential market opportunity becomes serviceable over time.
Mike Scarpelli
Yes.
Keith Weiss
Can we talk about the ML and AI opportunity within Snowflake? I think it's been somewhat of an investor debate whether a data warehouse, whether the Snowflake architecture, is the right one for building ML and AI models and workloads on top of. You guys announced Snowpark for Python, which I think makes it more applicable. But can you explain to us why the data warehouse, and why Snowflake, is the right platform for building out these applications?
Christian Kleinerman
Yeah. Core to what we want to do and enable for our customers is deliver programmability of data. So, how do I get value, how do we extract value out of my data, without trading off governance and security? And that's what's different from what you will hear from everyone else. Everyone has Python. We get asked a lot, why did it take you two, three years to incorporate Python into Snowflake? Because incorporating Python in an unsecured way is easy; we could do it in a couple of weekends. But then you can ask CIOs, how do you know that your data science team did not download some library from the Internet that may have had a vulnerability and potentially exfiltrated data? And that's where the answers get a little bit less clear. Oh, yeah, the networking team was in charge, or someone was in charge. What we offer, and I'll get to the AI part of your question, is a secure way to program data. And when we say program data, it can be just transforming data, or it could be doing AI and ML.
So for us, AI/ML is one additional workload that we want to support running close to the data in a secure fashion. And then you can say, you want to do training: we have customers coming onto Snowflake to do training. You want to do machine learning scoring: we have customers coming onto the platform to do scoring. And at our user conference, we introduced a low-latency storage mode we call Unistore, with very fast reads and writes. That's very common for online feature stores and online recommendations, applications of ML.
Then you come and say, well, there's a new thing called language models. Language models are nothing but another form of pre-trained machine learning. I want to be able to score that based on data I have in Snowflake. I may want to be able to fine-tune that based on data I have in Snowflake. And for us, it's a continuum. I'm not trying to dismiss the importance of AI, but what is really important is: do everything you want to do, program data, do AI, do proprietary computations, but do so without trading off security, governance, policies, privacy. That's the value prop of Snowflake, and it resonates to no end with customers.
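(As a concrete illustration of "programming data where it lives," here is a minimal Snowpark for Python sketch. The connection parameters and table and column names are placeholders; the point is that the DataFrame pipeline is compiled to SQL and runs inside Snowflake rather than pulling raw rows out.)

```python
# Minimal Snowpark for Python sketch: push the computation to where the data lives.
# Connection parameters and table/column names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# The DataFrame pipeline below is lazily evaluated: it is compiled to SQL and
# executed inside Snowflake, so raw rows never leave the platform's governance boundary.
orders = session.table("ORDERS")
summary = (
    orders.filter(col("ORDER_STATUS") == "SHIPPED")
    .group_by("REGION")
    .agg(avg(col("ORDER_AMOUNT")).alias("AVG_ORDER_AMOUNT"))
)

summary.show()   # triggers execution in Snowflake and prints a small result set
session.close()
```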
Keith Weiss
Yeah. And something I hear a lot when I'm talking to CIOs, particularly in regulated industries, when they're thinking about these large language models and things like ChatGPT, is that the security implications haven't really been explored. There is a real concern about data leakage on a go-forward basis. But if I'm hearing you correctly, you today have companies that are utilizing Snowflake for training these models.
Christian Kleinerman
Yes. For sure, we have customers leveraging Snowflake for machine learning, including training. Part of the Snowpark for Python integration enabled that. We introduced a type of cluster that has more resources; we call them Snowpark warehouses.
The one piece you can say, well, you don't have is GPU support, which is for the deep learning use cases. Stay tuned; we'll be sharing more about this at our user conference in June. But fundamentally, the broad vision and broad goal for Snowflake is to bring computation, whatever its nature may be, to run closer to the data, and AI and ML is just one such example.
Keith Weiss
Got it. Perfect. I want to dig a little bit into Unistore. That's something that you guys talked to us a lot about at the last Analyst Day, and it enables Snowflake to now address more transactional workloads, right? For the broader audience, there are analytical workloads and transactional workloads, and historically never the twain shall meet. Today, given the computational resources that you have at hand, which aren't constrained, it's more amenable; you can bring those two together.
So one, can you talk to us about the underlying technology that enables you to bring those two together? And two, what market opportunity opens up when you can look at the data from both perspectives, both in terms of using it for transactions and for deep analytics?
Christian Kleinerman
Yes. So I'll rewind a little bit on database history. In the very early days, a database was a database was a database, and it did both transactions and analytics.
Keith Weiss
That was before my time.
Christian Kleinerman
Way before my time; I got the tail end of that. And then specialization happened. For example, Teradata, credit where credit is due, said, we're going to build a database focused on analytics, and many other models followed, the Verticas, the Netezzas, et cetera. And Oracle and Db2 and others went down the transactional side. And for 20, 30 years, they progressed on separate tracks. And as you said, the two would never combine, because the specialization was for each type of use case.
What's changed and what's different, which is the question we get asked a lot, okay, if this has never happened, why do you think you have a shot at succeeding here, is that the cloud helps us present a unified product, a unified experience for our customers, even if behind the scenes there are different ways to store the data. And that's what Unistore does. The implementation of Unistore we call hybrid tables. Hybrid, because they have a storage system optimized for analytics and a storage system optimized for fast reads and writes.
But behind the scenes, in the cloud, we can hide all the details of how this data is replicated and moved back and forth. And what this does for us is that we now enable customers to store data in Snowflake and build applications, application stacks or machine learning pipelines or machine learning inference, at low latency with very fast reads and very fast writes, with that data also seamlessly available for analytics. So it's the technology in the cloud, the fact that we're delivering a hosted service, that enables us to do this. And I believe it's a big part of our application stack. Mike, if you want to say something about the market opportunity.
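(An illustrative sketch of the hybrid-table pattern Christian describes, issued through a Snowpark session. The table, columns and connection details are placeholders, and the CREATE HYBRID TABLE DDL follows syntax Snowflake has since documented for Unistore; at the time of this conversation the feature was still in preview, so treat this as a sketch rather than the exact product surface.)

```python
# Illustrative Unistore-style workload issued through a Snowpark session.
# Connection details, table name and columns are placeholders.
from snowflake.snowpark import Session

connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>",
                         "warehouse": "<warehouse>", "database": "<db>", "schema": "<schema>"}
session = Session.builder.configs(connection_parameters).create()

# A hybrid table backs one logical table with a row store (fast point reads and
# writes, enforced primary key) and an analytical store behind the scenes.
session.sql("""
    CREATE HYBRID TABLE ORDER_STATE (
        ORDER_ID    INTEGER PRIMARY KEY,
        CUSTOMER_ID INTEGER,
        STATUS      VARCHAR,
        AMOUNT      NUMBER(12, 2)
    )
""").collect()

# Transactional-style point write and point read...
session.sql("INSERT INTO ORDER_STATE VALUES (1, 42, 'SHIPPED', 99.95)").collect()
session.sql("SELECT STATUS FROM ORDER_STATE WHERE ORDER_ID = 1").collect()

# ...while the same table is immediately available for analytics.
session.sql("""
    SELECT CUSTOMER_ID, SUM(AMOUNT) AS TOTAL_AMOUNT
    FROM ORDER_STATE
    GROUP BY CUSTOMER_ID
""").collect()

session.close()
```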
Mike Scarpelli
Yes. Well, we don't know how big the market is. It's really a good opportunity, but I think it's important to note, first of all, that the product is still in private preview today; it's not in public preview. I think it will be in public preview at the end of this year. We've learned a lot from customers, and we're revising some of the stuff on the engineering side. But it will have an impact on margins, because there is duplication of data. Definitely, it will drive revenue. It's not going to cause our margins to decrease, but it puts a gate on how big the margins, the product margins in the company, can get. But it definitely opens up a massive market opportunity for us as well. To be determined how big that is.
Christian Kleinerman
Yes. And I would add that the bigger goal for us is to enable full applications to run inside Snowflake. If you look at the elements of an application, there's the core storage; so we have Snowflake analytics as well as Unistore. You want a middle tier to be able to do processing; that's where Snowpark fits in. And you want a presentation tier, which is where Streamlit fits in. The combination of all of those changes the art of what's possible and how we think modern applications will be built and deployed in a secure way.
Keith Weiss
Got it. That's a good summary. I want to shift gears a little bit and talk about the business model and get into near-term results, and maybe stick with the theme of history lessons. I think one of the really interesting things about Snowflake is how pure a consumption model it is.
And if we think about it holistically, from where we came from with perpetual license models, all the risk was put onto the end customer: you have to figure out how to set it up, you have to figure out how to get productivity out of it, but upfront you give us a couple of million dollars. With Snowflake, nobody is paying you until they start to run queries against the data.
Mike Scarpelli
I'll correct that. Many times they're paying us upfront, but they're not incurring the expense until actual use.
Keith Weiss
There's a commitment, but you're taking on a lot more of the risk.
Mike Scarpelli
Yes. We take on the risk, and that's why it's super important that we are there for our customers' success, and why we spend a lot of time on it and insist our salespeople stay engaged with customers. I know this about the seat model from when I was the CFO of ServiceNow, and because we buy a lot of licenses from other people ourselves.
It's painful when you buy a license and start incurring the expense even though you may not start using it for six months. So yes, we do bear that risk at Snowflake, but the benefit is that just as you can see a slowdown when people are tightening their belts, you can see an acceleration in our business as well. And people have more visibility into their business.
Keith Weiss
Right. So people have taken advantage of that flexibility during a tightening spending environment. And just to bring it back to the current results and what we've seen throughout 2022: obviously customers are taking advantage of that, and we've seen optimization in all sorts of cloud models, including Snowflake. How do you assess where we are in that cycle?
Mike Scarpelli
Yes. So I think optimization is an overused term by many companies today. We were talking about optimizations as far back as two years ago, and at our Financial Analyst Day we talked about how this is nothing unique and will be ongoing with any customer. But there are no big optimizations out there.
And optimizations, just to be clear about what they are: we find instances where people have not written the most efficient queries and are taking up too much compute. We spend time on the professional services side to help them rewrite and reengineer those queries so they use less compute. But one of the biggest pieces of low-hanging fruit on optimization is that we've seen customers store data that they never access.
And why are you doing that? We've seen customers store the data twice in Snowflake when they don't need to. We've seen customers choose bigger warehouses than they really need, or disable the auto-suspend function. Well, today we help you pick the right-size warehouse that you need and take away a lot of that. Customers can still choose, but we help you size the warehouse correctly.
And when you disable the auto-suspend function, there's a lot more alerting that happens on that. We're constantly monitoring to make sure that warehouses aren't left running. That's what optimizations are for us.
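(A sketch of the kind of knobs those optimizations turn, expressed as SQL issued through a Snowpark session. The warehouse name, size and auto-suspend value are illustrative.)

```python
# The kind of knobs an optimization exercise turns, expressed as SQL through a
# Snowpark session. The warehouse name and settings are illustrative.
from snowflake.snowpark import Session

connection_parameters = {"account": "<account>", "user": "<user>", "password": "<password>"}
session = Session.builder.configs(connection_parameters).create()

# Right-size an oversized warehouse and re-enable auto-suspend so it shuts down
# after 60 seconds of inactivity instead of burning credits while idle.
session.sql("""
    ALTER WAREHOUSE REPORTING_WH SET
        WAREHOUSE_SIZE = 'SMALL'
        AUTO_SUSPEND = 60
        AUTO_RESUME = TRUE
""").collect()

session.close()
```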
Christian Kleinerman
And we will continue adding product capabilities to do all of this proactively or automatically for customers. Nobody wants the cycle of I grew, I optimized; I grew, I optimized. We believe in always being optimized, and that's good for everyone.
Mike Scarpelli
And I have a team of people who literally look at spikes in revenue on a daily basis. When they see something, we reach out to the customer, driven by finance together with the rep, to understand what's going on. You may ask why I'm doing that. I know that if we have an unhappy customer who left a warehouse running, or who is using Snowflake inefficiently, they're going to ask for a credit back. So I'd rather get in front of that.
But more importantly, we reforecast our revenue on a daily basis based upon the prior day's consumption. And if I'm incorrectly reforecasting on spikes that aren't real, ongoing consumption, then I have a problem. So we do that. I don't know a single vendor that's ever reached out to tell me when I'm consuming too much of something.
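(A toy sketch of a daily reforecast that avoids projecting a one-day spike forward as if it were ongoing consumption. The smoothing rule, window and figures are invented for illustration and are not Snowflake's actual methodology.)

```python
# Toy daily revenue reforecast: extrapolate from recent consumption, but damp
# one-day spikes so they aren't projected forward as if they were ongoing.
daily_consumption = [100, 102, 101, 180, 104]  # credits/day; day 4 is a spike

def reforecast(history, days_ahead=30, window=7):
    recent = sorted(history[-window:])
    # Use the median of the recent window rather than the last observation,
    # so a single spike doesn't dominate the projected run rate.
    run_rate = recent[len(recent) // 2]
    return run_rate * days_ahead

print(reforecast(daily_consumption))  # 3060: a 102/day run rate, not 180/day
```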
Keith Weiss
Right. So you feel comfortable with that. You've planned out that curve, if you will; you've taken out any excess.
Mike Scarpelli
There are no big optimizations that I'm aware of. That's not to say there aren't some small ones, but there are no big ones out there when I look at the top customers.
Keith Weiss
Got it. When we think about the adjustment to the forward-year revenue guidance that you made on the last conference call, that was less about optimizations, or really not about optimizations. It's about consumption patterns?
Mike Scarpelli
It's about customers ramping more slowly. What I would say is, early adopters by their nature tend to move faster. A lot of our early customers were, let's get up and running on Snowflake, who cares about cost, we'll worry about optimizing later. The, I don't want to say laggards, but the later adopters tend to be more methodical. They tend to be more cost-conscious.
A lot of our early customers were the digitally native companies that were very fast moving and growing so quickly that they didn't really care about cost. But when you're dealing with well-established Global 2000 companies, these guys have always cared about cost, and they just move at their own pace.
And what we're seeing is, we've landed so many of these large customers over the last three, four years. They just grew slower than these digitally native early adopters.
Keith Weiss
Got it.
Mike Scarpelli
They're still going to get to the same end state.
Keith Weiss
Okay. So the destination remains the same; it just takes longer to get there. And maybe one of the dynamics that has been really interesting in the model throughout the year is that large customer growth has still been very robust.
I think the last three quarters have each been the largest for net new additions to the $1 million-plus spenders. Can you dig in with us a little bit on the life cycle of a $1 million-plus customer? How long does it take them to get there? I think the average million-plus customer now is at like $3.7 million, $3.8 million?
Mike Scarpelli
$3.7 million.
Keith Weiss
$3.7 million. How long does it take them from when they cross $1 million to get to, sort of like that average?
Mike Scarpelli
So when we sign up a new customer, a Global 2000 starts at a little over $100,000, and a non-Global 2000 starts in the $50,000 to $60,000 a year range, and they quickly grow. It usually takes a Global 2000 two to three years to get to that $1 million. Why? Some get there much faster, but these companies generally move pretty slowly.
Keith Weiss
Got it. Got it. I want to shift gears to the margin side of the equation, because that was the other really pretty spectacular part of the equation if you look back at calendar 2022.
Consumption models aren't supposed to see expanding margins as growth slows down; it should be harder for you guys to grow margins. Subscription models are mechanically geared so that when growth slows down, you just see more margin. But you guys saw a really robust expansion in your overall free cash flow margins during 2022.
How are you able to do that? And if the demand environment gets better and consumption picks up, shouldn't that be incrementally even more positive for margins on a go-forward basis?
Mike Scarpelli
I've been with the company now for a little over 3.5 years. I remember that first year when I joined, we were expected to burn $220 million, and we quickly turned that around. Since day one we've been focused on free cash flow. And I would say it's revenue growth, product margins and free cash flow.
And it's pretty simple math as to how that works. We continue to show product margin improvements. We continue to show operating margin improvements. I will say, what's kind of surprised me is that I was expecting early on more of a shift in payment terms with our customers. Most of our customers still sign a three-year contract or a one-year contract and pay us annually in advance; 80%-plus of our customers still do that.
I had expected customers to want to move to quarterly or monthly payment plans. Why? Because the cloud vendors give everyone monthly payment terms, and that is an option for our customers. We give them that option, but it's all about discounting, and they would rather get a higher discount and pay upfront. I do think, with people now earning real returns on overnight cash balances, that there will be a shift in that. That's one of the reasons why we kept our free cash flow margin flat at 25% for next year. That's one piece. But we also got some surprise early payments in January that I wasn't expecting, which made free cash flow higher in Q4 than it otherwise would have been.
Keith Weiss
Got it. Got it. One other thing I want to make sure we touch on, with about a minute left, is the expansion of the AWS relationship. AWS is obviously a major infrastructure partner for you guys, but you've expanded that relationship well beyond that. It's not just being in the marketplace; there are go-to-market commitments being made on both sides of the equation. Can you dig in a little bit on what you and Amazon are now doing together on go-to-market?
Mike Scarpelli
Sure. We have committed headcount out of AWS aligned to our verticals, globally. We're committing headcount as well; we had that headcount anyway, but we're matching theirs. On the alliances side, there are more dollars committed as migration funds to Snowflake on AWS. And there are a lot more free credits and a lot of POCs that we run. Those POCs could be for a new customer, but also, when we're looking at doing Snowpark, we offer free credits to customers to do evals. That stuff is funded by the cloud guys.
Keith Weiss
Got it.
Mike Scarpelli
We fund some of it, and they're willing to throw money in. So we're making a big financial commitment to them, and they're making a big financial commitment to us as well. And it also promotes the relationship. Everyone thinks we compete with AWS Redshift, and they talk about all these product improvements. The reality is that in large accounts, AWS partners with us out of the gate, because they want to see those customers land on AWS. History has shown that Snowflake helps those customers land on AWS, and that's good for AWS because they can sell a lot of other software and services around Snowflake.
Keith Weiss
Outstanding. Unfortunately, that takes us to the end of our live time slot. But Mike, Christian, thank you so much for joining us today.
Mike Scarpelli
Thanks for having us.
Christian Kleinerman
Thanks for having us. Bye.