MLOps Weekly Podcast
Episode 6
A VC Breakdown of MLOps with James Alcorn
Principal, Zetta Ventures


July 19, 2022

Listen on Spotify

Transcript:

Simba Khadder - 00:00:06

Hi, I'm Simba Khadder, and you're listening to the MLOps Weekly Podcast. Today I am chatting with none other than James Alcorn. James is an investor at Zetta Ventures. His investments include amazing MLOps companies like Semi Technologies, the makers of Weaviate, and ourselves here at Featureform. He also has investments in many applied ML companies, including Galley and Enzo Data. James observes and sits on many of these companies' boards. And we're super stoked here at Featureform to be able to call him a board member. James, it's awesome to have you on the show today.


James Alcorn - 00:00:36

Yeah, thanks so much for having me. This has been a long time coming.


Simba Khadder - 00:00:39

Why don't we start by sharing how you got into VC?


James Alcorn - 00:00:42

Well, the important background is that I came to the US from Australia, where I grew up, when I was 18 to go to college here in the Bay Area. As soon as you drop into San Francisco, you're completely inundated by the technology ecosystem and community. My interest in the investing side of that grew through my time at Berkeley. After college I went out and worked in private equity for a year or two, then moved into the venture side of things, and I really haven't looked back since. It was a quick and organic move into investing after just being surrounded by the technology ecosystem in the Bay.


Simba Khadder - 00:01:17

What made you focus on MLOps?


James Alcorn - 00:01:18

The focus on MLOps really came from our focus as a firm on investing in machine learning applications; that's been the historic focus of Zetta Venture Partners. By virtue of investing in machine learning application companies, you really get a sense for what's going on under the hood in terms of what MLOps technologies they're using. It was really by working with those application companies that we got super excited by the infrastructure and tooling people are using to get models into production, to monitor them, to retrain them, to make sure they're working the way you expect. So it was an organic thing.


Simba Khadder - 00:01:53

Yeah, it makes sense. That's the [inaudible 00:01:55] applied ML companies and you got to see these problems.


James Alcorn - 00:01:58

Yeah, that's exactly right. And I think what's been interesting is, over the past two years especially, as more and more companies are willing to use third-party solutions or actually buy software from MLOps vendors as opposed to hand rolling their own stuff, the trend that was maybe emerging one or two years ago is now becoming very obvious, and that's quite exciting from an investment standpoint.


Simba Khadder - 00:02:19

What changed? Why weren't people buying MLOps four years ago? Or did the companies not exist? What's happening that has caused this kind of stark shift in the last few years?


James Alcorn - 00:02:29

It's a good question, and this is not a very novel answer, but I really do think it comes down to the kind of maturity cycle of machine learning and AI. People are putting models into deployment. And when you're putting things into production, you start dealing with production issues. So I think it really has to do with just the maturity cycle of machine learning in general across the enterprise.


Simba Khadder - 00:02:50

You mentioned that a lot of your learnings about MLOps come from talking to the companies doing ML themselves, and seeing what they're facing, things that they've built, probably in-house, things that they're buying. What else do you do? How else do you learn about MLOps?


James Alcorn - 00:03:02

No, I'm not a practitioner; I never have been. I studied economics in undergrad, so I'm not technical, and I do a lot of reading. I spend a lot of time reading academic papers. I also spend a lot of time talking to practitioners and enterprise buyers to get a good sense of the shape of demand for these MLOps companies within each of the specific categories along the MLOps value chain. And then, honestly, in terms of investing in MLOps companies, there's all the quantitative work you need to do, and the talking to customers and businesses. But a good amount of it is also intuition: trying to estimate and predict where the market will move in two to three years.


Simba Khadder - 00:03:43

The things you're mentioning are probably not things practitioners think a lot about, like, for example, thinking about where the market is moving. They're more just trying to solve the problems at hand. First, could you make clear how your thinking differs from a practitioner's?


James Alcorn - 00:03:57

That's a good question. I'd qualify this by saying that, as someone who has never been a practitioner, I don't want to put words in people's mouths, but here's how I'd hypothesize an investor might think about it differently from a practitioner. A practitioner is concerned about the developer ergonomics of a specific tool, scalability, consistency, whether it will fit with their existing stack. The investor is definitely concerned with all those elements too, but is also looking beyond them to things like the ultimate market size for the product and the quality of the founding team. All these kinds of different [inaudible 00:04:32] of attributes around the company that maybe wouldn't be top of mind for a practitioner.


Simba Khadder - 00:04:38

So what do you think a practitioner could learn from how you think about things? [inaudible 00:04:44] that probably doesn't matter to practitioners, but I'm sure there are a lot of things they could pick up. Obviously you're not a practitioner, but if you had to say: what are some things you do when you look at companies that you think a practitioner or a buyer could also be doing?


James Alcorn - 00:04:57

Well, I think for a buyer there's probably more to learn than for a practitioner. An investor is looking at things beyond just those specific elements around a product and the technical features I spoke about, and I think some of that holds value for buyers and practitioners. The quality of the founding team of an early-stage start-up is often very correlated with the quality of the product and the quality of the experience the buyer is going to get. So incorporating a view on those elements could have real value for a buyer or practitioner as well.


Simba Khadder - 00:05:30

Yeah, it's super interesting. When we buy things from earlier-stage companies, I like to talk to the founder, just to understand how they think and where they're going. Like you said, when you're buying an early-stage product, you're investing in it in a way, and the investment is partly in the team. MLOps is still so young. I have this conversation with people sometimes where I say, "There is no MLOps company yet where you can just deploy enterprise-wide tomorrow with no hiccups because they've done it a thousand times. There's no company that big or that mature yet." So at this point, there are still problems to be solved. Obviously you're looking at a product, but you're also investing in the team. You're making a technical investment in this thing.


James Alcorn - 00:06:17

I agree. I think you're leaning into the point, which is that really high-quality teams build really high-quality products in large, high-quality markets; these things are often very tightly correlated. So I think you actually can glean a lot of insight into the nature and character of a company's product by going out and assessing its market, assessing its positioning, and learning about its founders. All those things are very informative.


Simba Khadder - 00:06:43

Well, you do this for a living. Part of your job is to be able to suss out the quality of a founding team. Someone who's a buyer [inaudible 00:06:53] as much experience doing it as you will. What do you look for? What are the main things you look at, whether as a buyer or, in your case, an investor? What are some basic things you can do to make sure you're doing this diligence as you're buying?


James Alcorn - 00:07:06

That's a good question. Something I often like to do with founders in initial conversations is to not speak specifically about their business or product, but to have a conversation with a larger context that covers a broader surface area. A lot of these venture-backed founders are pitching for a living, so they can get very good at telling a particular story or narrative. It's often super insightful to break out of that story and get people speaking about things they haven't necessarily rehearsed. That's one easy technique to really understand what makes someone tick, and what makes someone tick also gives you insight into who they are, what their character is, and how determined they are.


Simba Khadder - 00:07:49

I've even had, in some of my early calls, especially with C-level people at bigger companies, a lot of times they'll ask me questions not about feature stores but about MLOps in general. And I think it's a little bit of them trying to, one, learn, but also trying to gauge what Featureform's view, my own view, is on MLOps more broadly, outside of feature stores. Most MLOps companies are specifically focused on one point of the stack. There aren't really MLOps platforms that live in a bubble; everything is associated with other parts of the ecosystem. Asking questions that way can help you understand whether someone has just built a point solution and that's all they think about, versus, "Hey, we're really thinking about the problem to be solved," which is usually much broader.


James Alcorn - 00:08:42

And you're actually very good at that; I've been on the receiving end of it myself. You're very good at asking those open-ended questions about the state and the future of MLOps. And there's a good reason for that: the jury is out on what MLOps is going to be. It really is. On the platform versus standalone solution debate. On the cloud vendor debate. The jury truly is out. So finding people who have a strong opinion and are convicted about a certain version of the future, that's appealing to an investor, and I think it's appealing to a buyer as well.


Simba Khadder - 00:09:14

It's interesting, because it's the overstated analogy of "skate to where the puck is going to be, not where it is," but you're almost doing both. You have to build for the state of the world today, but also be aware of what the state of the world is going to be in a few years, because you don't want to squeeze yourself out. But you also need to be valuable today.


James Alcorn - 00:09:33

Well, you've got to stay alive, you've got to be a going concern, and then you also have to design and deliver your version of the future as well. So yeah, I think it is a combination of the two.


Simba Khadder - 00:09:41

Yeah, a lot of what we do is also education, because there is so much noise in the space. We talked a bit about point solutions and platforms. Do you think that in the future it will be mostly platforms, mostly point solutions, or a mix? How do you think about that?


James Alcorn - 00:09:56

Yeah, well, I'd actually start by making the distinction between cloud vendors that offer an MLOps platform and pure-play MLOps platforms: companies that make the majority of their revenue by selling an application or infrastructure as opposed to compute. I think the cloud vendors will always offer end-to-end MLOps platforms because it drives demand for their core services. Whether pure-play MLOps platforms will exist and come to dominate the market, I think that's an open question.


00:10:27 

And I think it's an open question because we're in the early innings of this current generation of MLOps businesses that are going out, finding a specific problem, and building a really custom, tailored, high-quality solution to that one problem at one point along the MLOps value chain. We're seeing, I think, the first signs of businesses that entered at one point trying to vertically integrate along that value chain. But whether they'll be able to do so sustainably, and whether they'll be able to compete against companies that are going after one specific point, is still open.


00:11:03 

I think it's an open question. It really is. So time will tell. But the early evidence suggests that people are happy to pull together a number of different MLOps solutions if each is best in class at the specific problem it's solving.


Simba Khadder - 00:11:17

It's interesting, because it feels like it used to be all platforms, and it's moved to being all point solutions. It'll be interesting to see whether the point solutions that succeed end up building all the other points and vertically integrating, or whether they maintain their own points.


James Alcorn - 00:11:34

I think on that point, it's an attractive and almost consensus view to say, "Okay, MLOps right now is in a toolchain sprawl. There is a sprawl of MLOps companies, and that leads to consolidation." That's a line of argument you'll hear a lot throughout the investor community, and maybe throughout the founder community as well. But I don't necessarily know if that's true. I look to DevOps as a segment that is instructive for MLOps, and in DevOps there are very large businesses built around one specific part of the DevOps toolchain. So I don't necessarily think it's a foregone conclusion that, just because a sprawl exists and there are many companies going after specific parts of the MLOps toolchain,


00:12:20 

that will naturally lead to a consolidation, with platform businesses naturally emerging as the leaders there. So I would be wary of making too strong a statement around either version of the future.


Simba Khadder - 00:12:34

Yeah, it's a really good point. Take GitLab and HashiCorp as two examples: they're both DevOps companies, they've both gone public, they've both grown like crazy. There's a little overlap between everyone, but they are two very different companies. DevOps is one of those things where there is no perfect answer for everyone, and I think MLOps is very similar in that way. The size of the company, the size of the data, what you're doing: computer vision is going to look very different from NLP, which is going to look different from tabular data.


00:13:08 

If you're a bank, it just looks very different from if you're a startup. Trying to build a platform that solves it for everyone, or solves it well enough for everyone that we'd all want to use it, is just not really possible. I don't think it's possible unless you build something that is almost entirely configurable.


James Alcorn - 00:13:24

Right, but even if you do build something that is highly configurable, there are going to be portions of demand, portions of customers, who don't want something highly configurable. But no, I agree.


Simba Khadder - 00:13:34

Yeah. And that's also my argument about platforms versus point solutions. Will there just be a computer vision platform that is really good for startups of a certain size or in a specific space? It will be interesting to see how it all plays out.


James Alcorn - 00:13:48

Yeah. And you know what I think is interesting, if you want to continue the DevOps versus MLOps analogizing: what is going to be the ultimate size of the MLOps market relative to that of DevOps? I think there are arguments both ways. The argument against MLOps being larger is that DevOps is a much broader, more horizontal tool space, with more companies and users who actually have a need for those toolsets. One argument for MLOps being larger, by contrast, is that the analytics done on top of the core software engineering work are of higher value, and therefore the tools that support those high-value analytic workloads would theoretically be worth more.


00:14:35 

That's a really important question that I don't think gets asked a lot. But whatever the ultimate answer is, the relative size of the MLOps market to the DevOps market will have a very large impact on how we look back on this age of machine learning and AI, the rise of this software category, and where it sits in the history of technology.


Simba Khadder - 00:14:55

How do you think about the different spaces interacting? There are parts of MLOps which are really DevOps problems. It's very specific. [inaudible 00:15:08]. For example, take monitoring. We've had infrastructure monitoring forever, from Prometheus to big companies like Splunk that fall in that space, and obviously there are ML-specific ones, because machine learning has specific problems that only exist for machine learning, and the generic tools don't really solve them. The same applies for DataOps, like data cataloging.


00:15:32 

A lot of what we build, even as a feature store, is a much more specific type of cataloging, focused on a pain point or a use case, among other things. When you think about the interplay between DataOps, MLOps, DevOps, and even data infrastructure like Snowflake and Databricks coming into the space, how do you break that down? Obviously no one knows what it will look like in the future, but if you had to say: how is it all going to play out, or how is it going to be cut up?


James Alcorn - 00:16:07

Well, I have an answer that probably isn't going to be very useful. I actually think a lot of this is just semantics: different nomenclature for different things. Fundamentally, I don't really care if a company is more on the data infrastructure side than pure-play MLOps, or closer to DevOps than data infra. What I care about is finding businesses that can generate a lot of revenue very quickly and eventually produce cash flow. That's it. It's a fun and almost academic exercise to categorize businesses into their different segments and make these massive market maps, but I don't know if there's a whole lot of utility to it outside of the exercise itself. I know that's a very dissatisfying answer, but that's genuinely how I approach that problem.


Simba Khadder - 00:16:56

It's almost a question that doesn't matter, because who knows what it's going to look like in the future. The point is, they're all going to lean into where they're most valuable, where there's the most room for growth. That's just going to happen. And it's associated with where the problems are going to lie in the next few years. If you knew where the problems were going to lie, you could probably draw a better market map, but I don't think anyone really knows. There's probably a little bit of Conway's law here. I think this is true of DevOps: a lot of the way DevOps is done, like the fact that we do CI/CD and always consider it to be one thing, comes more from the fact that lots of the tools that did CI also did CD.


00:17:33 

And so people have always just considered them one component, even though there are obviously CD-only tools out there, like Spinnaker. There's a funny interplay in the market: what the market wants is largely what will come to exist, but how the tooling starts out and how vendors configure themselves in the market can also create some weird awkwardness in how things get cut up. And that happens, I think, in every market.


James Alcorn - 00:18:00

Yeah, well, I'll tell you what I think would be a real tragedy: if a company that considers itself an MLOps company made product, distribution, or strategic choices that were against its own interests in order to stay categorized as an MLOps company. That would be the definition of a poor business strategy, making decisions simply on the basis of remaining in one software category. So I think that's a potential danger of glorifying those market-distinction exercises: people taking it too far in terms of what they do with their business.


Simba Khadder - 00:18:35

Yeah, that's a good point. It's about being focused on the problem, not on the categorization. Like, "That's not MLOps, so we don't do that," or, "That is MLOps, so we must do it as well." I've seen a lot of people build things just because everyone else has that type of thing, so they figure they should have it too. It's almost like copying each other, but then it can become a bit of the blind leading the blind, because, like you said, no one really knows what it's going to look like in a few years. Even as a practitioner, you only know the problems to be solved right now.


00:19:06 

But there isn't really a good example of what a perfect MLOps workflow looks like. We've talked to so many companies, and I haven't talked to one where I thought, "Yeah, everyone should do it this way." Everyone has a lot of problems they're still working on solving, and a lot of them are very basic: lots and lots of spreadsheets being shared around, lots of Google Docs being shared around. We're still at that level, I think, in most big companies.


James Alcorn - 00:19:28

Yeah, I actually think investors are somewhat at fault here, and I'll say I'm definitely guilty of it as well. We'll go out and put out to the world that, "Hey, we're looking for MLOps businesses," and then you're ingraining that core MLOps philosophy as the hallowed ground: the place to be, the companies to build. And if that leads businesses, as I said before, to make decisions that are against their own long-term interest because they aspired to be an MLOps company, that's a total tragedy. So I do think it's really a function of all the different actors in this ecosystem, which includes companies, investors, and buyers.


Simba Khadder - 00:20:06

Yeah, that's a really interesting point. As someone who's in the ecosystem, I've seen a lot of companies pivot in. But what I also see, which is always interesting, is that every so often there are these windows. I remember distinctly when it felt like there were a hundred new MLOps companies I'd never heard of, and it felt like everyone had just left their job at a FAANG company and started an MLOps company without really any sort of differentiation. Just, "We also have a buzzy name, we're also a feature store," or whatever. That's fine; we need people in it.


00:20:44 

That's how the space is going to play out. But I think it was interesting that a lot of people came in without any differentiation, as yet another MLOps company, and a lot of those companies went away almost as quietly as they came. I remember times when we would go into a customer meeting, and beyond the competitors we're well aware of and run into all the time, every so often we'd hear about four or five companies we'd never heard of, sometimes one or two people who hadn't even left their jobs yet.


00:21:14 

There were a few windows where that happened, and they all just disappeared. So I do think that's a bit of people looking at what investors are doing, seeing the market is hot, and thinking, "Oh, I can do that. I built X before. I'll just go and start a company."


James Alcorn - 00:21:28

And investors should not be leading those conversations and ideas about what the future is going to look like. We need to take our signal from founders. You have to actively think about that and actively ensure you're preserving that relationship and balance, because it can get out of whack. That's where you can fall into real trouble as an investor: if you think you know a market better than a founder, and therefore this idea won't work for reasons X, Y, and Z. I'd say nine times out of ten you're going to be wrong about that.


Simba Khadder - 00:22:01

I have seen investors who obviously have, and should have, strong views of what the future is going to look like. But I think there's a difference between understanding the problem space and where things are going, especially if you were in it, versus trying to will something into existence. Almost being a founder without being the founder: putting money into anything that looks that way just because you think that company needs to exist. It sounds like that can be attractive, based on what you're saying.


James Alcorn - 00:22:28

I think that's right. Yeah, that's a good way of framing it. And look, you can look to some of the most successful enterprise software investors in the world, and I won't name any names, but folks have publicly spoken about moments in time where they held such strong conviction around a specific solution to a specific problem that, when they came across an interesting company solving it a different way, they would pass on investing simply because it didn't fit their vision of the world. And with the benefit of hindsight, that's been the wrong decision. It's not going to be true all the time, but I do think it's a good lesson in reminding ourselves of the first principles of good venture investing: taking signal from founders.


Simba Khadder - 00:23:09

That's a super interesting story. You mentioned applied ML companies, and you're seeing pretty much every MLOps company that's creating impact. Where do you see the most opportunity? Where do you see people having the most problems?


James Alcorn - 00:23:22

Yeah, I'd answer that question by saying that a higher-level secular trend in MLOps that I think is real, and that I'm very excited about, is the shift from model-centric to data-centric. That's something that has been publicized for a few years now, but the hype around it is starting to match actual enterprise buying behavior. And more than buying behavior, it's almost a cultural shift in organizations that are actually putting ML into production: prioritizing a data-centric approach over a model-centric approach.


00:23:55 

So if that's the high-level trend, the next question might be, "Okay, what are the implications for MLOps?" I think they're pretty large, because I think there's a category of new MLOps businesses to be built that enable companies to take that data-centric approach. A lot of the super successful MLOps companies that have come to market over the past five years are very centered around the model-centric approach to building and deploying ML. So that's a trend in MLOps that, A, is real, B, has major implications, and C, I'm quite excited about.


Simba Khadder - 00:24:31

Yeah, everyone talks about it differently, but if you were to define them, what is model-centric machine learning versus data-centric? What's the difference?


James Alcorn - 00:24:38

Well, at its core, I think it's the idea of acknowledging that we can get superior model performance by paying more attention to what's going on in our data than by tweaking model architecture. That's the guiding principle. And maybe to speak practically about the tool changes that might occur: I think dataset curation is a pretty underappreciated and certainly under-resourced category within MLOps. Over the past six to twelve months, I've seen a lot of businesses coming to market with MLOps tools specifically designed for dataset curation, in very novel and interesting ways that are tied back to model performance.


00:25:24 

I don't think we've seen businesses like that before. A lot of these companies are in the unstructured data domain, using an embedding-led approach: vectorize the dataset, understand where the clusters in that latent vector space are, and then make decisions about how to curate your dataset based on what's going on in that vector space. I think that's a good example of an emerging MLOps category that was made real and possible by the shift towards data-centric AI.
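The embedding-led loop James describes can be sketched in a few lines. This is a toy illustration, not any vendor's actual tool: the "embeddings" are synthetic 2-D points, the clustering is a hand-rolled k-means, and the half-of-average threshold for flagging sparse clusters is an arbitrary choice.

```python
import numpy as np

def maximin_init(X, k):
    # Pick k well-spread initial centers: start from point 0, then
    # repeatedly take the point farthest from all chosen centers.
    centers = [X[0]]
    for _ in range(k - 1):
        dists = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(dists)])
    return np.array(centers)

def kmeans(X, k, iters=20):
    # Minimal k-means over the embedding space.
    centers = maximin_init(X, k)
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Toy 2-D "embeddings": two well-covered regions and one sparse one.
rng = np.random.default_rng(0)
embeddings = np.vstack([
    rng.normal(0, 0.1, size=(100, 2)),   # well-covered region
    rng.normal(5, 0.1, size=(100, 2)),   # well-covered region
    rng.normal(10, 0.1, size=(5, 2)),    # underrepresented region
])

labels = kmeans(embeddings, k=3)
counts = np.bincount(labels, minlength=3)
# Clusters far smaller than average mark sparse regions of the latent
# space: candidates for targeted data collection or labeling.
sparse = np.where(counts < 0.5 * counts.mean())[0]
print(sorted(counts.tolist()))  # [5, 100, 100]
print(len(sparse))              # 1
```

The flagged cluster would then be reviewed, relabeled, or targeted for additional data collection, which is the curation decision being tied back to model performance.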


Simba Khadder - 00:25:51

That's super interesting. I love that definition because it's very simple; there's no buzzword in it. I saw it myself when we were doing recommender systems. I remember sometimes looking at a paper and implementing its architecture, and oftentimes we wouldn't get superior results. Even when we did, we sometimes wouldn't deploy because we were too afraid. We were like, "I guess it works better, but is it really going to work better?" Especially with a recommender system, where it's not obvious: maybe it will do better most of the time, but there's going to be 10% of users for whom it's just awful, and that can be worse over a longer period of time.


00:26:26 

So we built all kinds of cool stuff around that, like A/B testing different models in prod. But one thing I talk about with companies that are getting into ML, starting to do more ML, is that a lot of people want to build complex features and complex models right from the beginning. I've talked to big companies that have put all this time and effort into doing so, and then: how does that compare to the baseline?
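On the A/B testing of models in prod mentioned here, the core mechanic is usually deterministic bucketing. This is a minimal sketch, with the salt and variant names purely hypothetical:

```python
import hashlib

def model_bucket(user_id: str, variants: list[str], salt: str = "rec-exp-1") -> str:
    # Hash the salted user id so assignment is deterministic: the same
    # user always sees the same model variant, keeping metrics comparable.
    h = int(hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest(), 16)
    return variants[h % len(variants)]

variants = ["baseline_v1", "candidate_v2"]
assigned = model_bucket("user-123", variants)
print(assigned in variants)                            # True
print(assigned == model_bucket("user-123", variants))  # True: stable assignment
```

Changing the salt starts a fresh experiment with a fresh, independent split of users.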


00:26:53 

Well, there's no baseline. I tell them: just put up a linear regression and see how it does. Do something really simple, and then you can at least justify focusing on the model. A lot of the time there are obvious and easy ways to get signal. We have a story where we just added the user agent, what kind of phone someone was on, and it gave us so much insight into whether a user would subscribe or not, especially if you joined it and crossed it [inaudible 00:27:17] over features.


00:27:19 

So yeah, I always say: start with simple features and a simple model. That's your easiest baseline, and it should be very quick. Then you can build complex features to get more signal out while keeping a simple model, and eventually the final stage is complex features, complex models. Most companies don't get there, and don't need to get there. I know Facebook and some of these really big companies are truly there: they actually have pulled most of the signal out of their data in some use cases, and the only thing left for them to do is focus on the model.


00:27:48 

But I would say there's probably ten companies that can honestly say that they have models at that level.


James Alcorn - 00:27:54

No, I think that's right, and I think it really does beg the question, why wasn't data-centric AI a kind of phenomenon or philosophy earlier than now? Like it's taken nearly ten years since [inaudible 00:28:06] for us to kind of come around to this idea that data is the key aspect when thinking about machine learning. And I think it has some amount to do with the academic community, wherein the incentives around academic papers and publishing, and citations and things like that really rewarded researchers who were focused on making incremental improvements to very, very large models.


00:28:28 

I think that has been the kind of culture in and around academia for some time. That is definitely changing now. But I think that certainly plays a large part in the reason why we were kind of living in this model-centric world for so long.


Simba Khadder - 00:28:40

I think a similar way that I've seen to frame that is that no one gets their paper published for having an interesting feature engineering approach. It just doesn't happen. I'm sure it can, like [inaudible 00:28:51] and stuff, but for a very basic thing? A blog post, sure, but can you get a published paper? Probably not. And obviously, because of that focus, we have things like DALL·E and all these other awesome models. But for a lot of use cases, especially tabular data, fraud detection, things like that, data-centric approaches to models, I guess, work better in practice.


James Alcorn - 00:29:19

And I think that's a really important distinction. What is interesting to the ML public writ large is, "How big is your model? What are its capabilities, et cetera." And you see this with all the large language models right now that are being very widely publicized. It's almost like an arms race over who has the most parameters. That's literally one of the core attributes that leads the headlines for these models. But in practice, when you're putting ML into production to solve a business problem in an enterprise, that is going to be the last thing on your list when you care about getting good performance and a good result. So I think there is a bit of a dichotomy between what's fun and exciting and sexy and what provides real business value.


Simba Khadder - 00:30:02

What's the Tweet-length message people should take away from this podcast?


James Alcorn - 00:30:06

I think a lot of people who are new to machine learning and MLOps in general often feel afraid to dive in head first or get their feet wet, simply because the space and the category can seem so challenging from a technical perspective. But as someone who has entered it and really fully embraced it, who doesn't have a deep technical background, I guess my Tweet-length takeaway would just be: if you're interested in it, just go and get involved. Go get talking to people, go get reading papers, just go and start immersing yourself in the community and the category. Because it's a super interesting and compelling and exciting part of the technology ecosystem to be involved with.


Simba Khadder - 00:30:44

That's awesome. James, thanks so much for hopping on, and taking the time to chat with me. Always a pleasure having these conversations with you.


James Alcorn - 00:30:50

I really appreciate it. Thank you.
