Joshua Burkhow: 00:00 Welcome to Alter Everything, a podcast about AI, analytics, and the future of work. I'm your host, Joshua Burkhow. Today, I'm joined by Ari Kaplan, the head of evangelism at Databricks. Now, Ari's one of these people whose reputation far precedes him. I was really impressed by what I had heard and what I had read about him and all of his accomplishments to date. Then I met him. And I also realized that he's as generous, caring, and as good a human as he is unbelievably talented and intelligent. I've since been completely mesmerized. To say Ari is legendary would be a real understatement. Ari is often called the real Moneyball guy, as he's part of the inspiration for the book and the major motion picture starring Brad Pitt, Moneyball, from his work building analytics departments for multiple major league baseball teams, including the Chicago Cubs. He's also been influential in shaping how modern sports thinks about and uses data and analytics. Now, if all of that wasn't enough, here's where it gets even more interesting, because Ari's work goes far beyond baseball. From starting off as a young kid building operating systems at age 10, to getting into the prestigious Caltech, to influencing even Formula One racing, all the way to becoming a world-renowned AI expert and fellow evangelist. And after all that, having spent decades on humanitarian investigations into missing influential people who are responsible for saving thousands, even tens of thousands, of people's lives. Now, if you want an inside look at someone who brings humanity and passion together in their love of interrogating data, their love of finding patterns, this episode is for you. Let's jump in.
Joshua Burkhow: 02:07 Ari, welcome to the show. As you know, I find your career really fascinating, and frankly, there are so many different topics I'd love to discuss with you today. How about we do the easy thing and just start at the beginning? What was young Ari like? Like, were you always fascinated by numbers and baseball and the theme of finding patterns? Was that something you were always into?
Ari Kaplan: 02:28 Yeah. And hey, Joshua, so good to be doing this. Yeah, young Ari, I grew up in New Jersey, started a newspaper route at 12, was kind of geeky but cool, I guess, from a third-party perspective. I loved Mork and Mindy, the Robin Williams show, science fiction, was tons into reading, Ray Bradbury and other greats. So I was kind of into this fantasy world, but a lot of it was in my head. Young me sold my first computer game when I was only six years old, a text adventure game. There were, like, magazines where you could sell your stuff. And that's one thing I love about technology. It's a great equalizer. On the internet, you don't know who you're typing to, what their background is. So hey, if you could write cool games and you're six years old, why not? People wouldn't know any different. And I ended up writing what's called a bulletin board system, a BBS, which is really the precursor to the internet. That was back when I was about 10 or 11 years old. I wrote my own operating system. I wrote my own programming language, compression algorithms. And then Educational Testing Service, which does the SAT and AP tests, heard about my compression algorithm and hired me when I was 13 years old to help do database management for ETS, which was kind of cool. So that's kind of like a summary of a young me, loved trying to create things that...
Joshua Burkhow: 03:56 That's amazing.
Ari Kaplan: 04:03 ...happened before. Things were simple and you're trying to create things that had never been done before and it was just magical.
Joshua Burkhow: 04:11 That's amazing. That's amazing. So yeah, definitely came on early for you. What was the lead up to getting into Caltech? Caltech had to be a pretty transformational thing for you.
Ari Kaplan: 04:23 Yeah, Caltech, for those who don't know, the California Institute of Technology, it's one of the top, sometimes ranked the top, for science and technology. The Big Bang Theory television show really popularized it. A super small school, about 250 students or so per class. So very, like, difficult to get into, you know, rigorous subject matter. But if that's your thing, if you love every science, entrepreneurship, innovation, there couldn't be a better place. You learn not just book smarts, but street smarts: how to pick a lock, how to crack a safe, how to make dynamite. And then my brother...
Joshua Burkhow: 05:06 That's awesome. Yeah, yeah.
Ari Kaplan: 05:09 ...went there before. But the ability to do groundbreaking research as an undergrad, I thought, was the best in the world. So I'm like, I've got to take this opportunity where I can do really cool research.
Joshua Burkhow: 05:23 That's amazing. Were you, what were you most obsessed with? Were you sort of in exploratory mode, trying a bunch of different things? Or, I don't think baseball had really crept in yet, had it? Your love for baseball?
Ari Kaplan: 05:36 Yeah. Yeah, I was a huge baseball fan growing up as a Mets fan, but you know, I was just a fan. I didn't know what to do with it. I liked a lot of things. I just knew I loved creativity, and I still do to this day, you know, many decades later. I like sitting in the world where it's innovative, you're at the leading edge, but what you're working with, dealing with, you can actually go and do most of it in the real world. Like quantum computing...
Joshua Burkhow: 06:05 Right.
Ari Kaplan: 06:10 ...is going to be huge. But at this point, if you have an idea, to make it into a lab, you know, that'll take years, if not longer. And then there are some aspects of AI that are, like, pure research driven. And I know people love that. But I like it where I get people excited, and I get myself excited, and then I can actually do something. And that is where, you know, scientific curiosity and discovery is...
Joshua Burkhow: 06:35 Yeah. Yeah. I think that's probably something you and I share. I have the same thing where I like reading research papers, I like nerding out, but I just need it to be practical. Even with all the AI stuff, I'm much more aligned towards what I can actually do with it versus, you know, the hype and all that stuff, which we'll probably...
Ari Kaplan: 06:35 And that's what Caltech is.
Joshua Burkhow: 06:58 ...get into for sure. I'm curious, okay, put yourself back in the day. Caltech, I think you graduated what, '92, '91? What was your next path after that?
Ari Kaplan: 07:12 You know, I had a lot of opportunities. I got job offers from JPL and Major League Baseball. And it really came down to working full time in baseball or working in the private sector. And I ended up doing both. So I worked full time at Oracle. They called Oracle back then the Caltech grad school, since aside from grad school, it was like the number one employer. So this is back in '92, you know, the relational database. Believe it or not, companies would say, I don't think we have a need for a relational database. Whereas now that would be outlandish, but that's the...
Joshua Burkhow: 07:31 Hmm. Haha. Right. Yeah.
Ari Kaplan: 07:52 ...world back then. And we believed, I believed...
Joshua Burkhow: 07:54 Blast for me.
Ari Kaplan: 07:55 ...it. I'm like, this is going to be so huge. I can't turn this opportunity down. But I love baseball. So I ended up working with the Montreal Expos as a consultant. So I kind of chose the best of both worlds. Part of me though, also loved theoretical physics.
Joshua Burkhow: 08:00 Yeah. Hmm.
Ari Kaplan: 08:11 Looking at things super close. So that was a huge interest too. Being a sports fan and seeing the practicality of databases, those were my two big driving forces. I couldn't turn them down.
Joshua Burkhow: 08:24 Did you sort of see the world of analytics? I mean, this probably wasn't called data science yet, that came after. But did you see those worlds the same, physics and mathematics and science easily applied to baseball? It seemed like you were one of the pioneers of this idea. So did you think about it back then the way you think about it now?
Ari Kaplan: 08:48 Yeah, so here's a story I haven't told in a long time. I came up with an n-dimensional hypercube to explain the multi-state of baseball. That's a fun phrase to say. Figured out how to predict how many runs are going to score using the...
Joshua Burkhow: 09:00 Right.
Ari Kaplan: 09:04 ...approaches I used in physics. So for example, four dimensions: whether there's a runner on first base, second base, third base, and the number of outs, zero, one, or two. Like bases loaded and nobody out, 2.254 runs are expected to score. Bases loaded, one out, 1.87 runs are expected to score. And back then that was kind of a novel approach. And that's still the foundation, even now in 2026, for seeing how a play increased or decreased the chances of scoring. So even in football, if you're big into the playoffs, you could watch television and it'll say win probability, you know, the Bears going up and down, up and down. They had a 3% chance of winning. Now they have a 90% chance of winning. Oh, they just lost. That's all based on this win-probability approach from physics. So yeah, in some ways there's overlap. But in some ways it was very different back then.
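For readers who want the base-out state idea in concrete form, here is a minimal sketch of a run-expectancy lookup. Only the two values Ari quotes (2.254 and 1.87) come from the conversation; a full table covers all 24 base-out states (8 base-occupancy combinations times 3 out counts) and is estimated from historical play-by-play data. The names and structure are illustrative, not from any particular library.

```python
# Run-expectancy table keyed by ((runner on 1st, 2nd, 3rd), outs).
# Values are expected runs scored from that state to the end of the inning.
# Only the two entries below are quoted in the episode; a real table
# holds all 24 base-out states, fit from historical play-by-play data.
RUN_EXPECTANCY = {
    ((True, True, True), 0): 2.254,  # bases loaded, nobody out
    ((True, True, True), 1): 1.87,   # bases loaded, one out
}

def run_value(before, after, runs_scored):
    """Run value of a single play: the change in expected runs,
    RE(after) - RE(before), plus any runs that scored on the play."""
    return RUN_EXPECTANCY[after] - RUN_EXPECTANCY[before] + runs_scored

# A strikeout with the bases loaded and nobody out: no runs score,
# and the inning's expected runs drop from 2.254 to 1.87.
delta = run_value(((True, True, True), 0), ((True, True, True), 1), 0)
```

Win-probability graphics like the football example Ari describes work the same way, just over a richer state space that includes score, time remaining, and field position.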
Joshua Burkhow: 09:43 I love the phrase multi-state. So there's this sort of, you've got this sort of Schrödinger's cat thing, right? Where you're sort of, you're in one of many different states that could then happen next and you've got probabilities on all the outcomes versus like more of a linear path. Is that what you mean?
Ari Kaplan: 10:01 Yeah. So one of the things that really resonated for me goes back to my physics background. In quantum mechanics, you know, it's kind of like the uncertainty principle, where just by measuring something, you're going to affect the outcome, and you literally cannot know beyond a certain probability. And I'm like, you know, in baseball, there's the state of the game. And if you try to predict it, and I love predicting things, it should match reality. And I'm like, okay, if you can measure those states, you can know your expected likelihood of things. And at the end of the year or the end of a career, it should be very accurate, right? But until then, like the Heisenberg uncertainty principle, you can't know beyond a certain probability. You're trying to do your best. And again, the Big Bang Theory, have you seen the show, Joshua?
Joshua Burkhow: 10:44 I did watch it, yeah.
Ari Kaplan: 10:48 There's an episode where Sheldon, the main protagonist, obsessive, has the same seat at the same table in the cafeteria every single day. He gets upset because someone's sitting in his seat. Well, at Caltech, there was a cafeteria, and I sat at the same seat, same table, for lunch and dinner for 14 years. And there were a couple of reasons. One is you can hold a conversation across people's heads because of the acoustics. And number two was I knew everybody who sat there. We'd been friends for 10-plus years, and we'd walk through every scientific paper we'd just read. So these weren't just people I knew. These were, like, my best friends I could say anything to. And sitting at this cafeteria, quantum mechanics would come up, with its multi-states and Schrödinger's cat. And I loved having these dinner conversations and lunch conversations for hours, trying to think through these problems in a fun and humorous way, and having it really resonate. And it started to affect, you know, how you viewed things. So that's it. It's almost like that multi-state of Schrödinger's cat: you know what the state is, but can the state change just by observing it?
Joshua Burkhow: 11:53 Yeah. I'm just going to pretend I've been to Caltech and I sat at that seat too, because that sounds...
Ari Kaplan: 11:57 It's called Dabney, it's Dabney, and yeah, it's pronounced the way they used to call it, Dab-nay.
Joshua Burkhow: 12:04 All right, Dab-nay. I'll have to come see it. I'll have to come see it next time I'm in LA. I mean, now that you know what you know about, you know, these quantum type sort of properties versus other forms of more traditional physics, has baseball become less or more fun for you?
Ari Kaplan: 12:25 You know, it's funny you ask that. I haven't answered that question in a long time. I'm gonna be 65 this year. And I remember when I was under 40, I just loved it. It was everything. I'd be obsessed with and compelled by certain aspects. And now I'm much more mature, and I still love it. It's just different. It's not obsessive anymore. It's kind of like you reach that pinnacle and then I'm like, I just reached it and it's not quite it, you know. And so I love it. But to me, I've spent a couple of decades there. I've done my thing. I've definitely moved on. And it was as much that I got older, more mature, but also I kind of exhausted my excitement there. It's like many of us: you reach some pinnacle and you're kind of searching for what's next. So yeah, I guess I view it differently.
Joshua Burkhow: 13:11 It's funny. So I think through your career, I mean, I got to hear a talk you gave at the Data and AI Summit on storytelling. And I felt like there was this common thread where you're always telling this story of how you should be a few years ahead of the curve, and when everybody else gets there, that's when you should move on to the next level, the next thing that you're fascinated about. But how do you find those inflection points? Like, how do you know where to put your energy next?
Ari Kaplan: 13:44 So I've lived this, you know, AI. And some people have called me the godfather of AI in sports, you know, and I'm known because I was one of those early people doing it, like, decades ago, right? So I did that. And then what was kind of cool is then helping the Cubs win the World Series, you get that pinnacle moment, right? But what I found is I moved on to Databricks because I was obsessed. Like, okay, I believe in AI. It's going to change everything. I want to be part of that. In sports, I could help, but sports is kind of reactive. I'd be like, I have an idea, I want to work backwards from, you know, the future. I have an idea, I want to test it out. But in baseball, I have to convince the owners and the general manager and get everybody on board. It's like the world's slowest-moving organization in many ways. So I went to Databricks, one of the leaders in AI, to help. I'm like, I could help any industry. I could go help, say, a healthcare company, and within weeks they can get started and make real-world tangible changes. That was exciting. So I literally traded the prestige, got paid much less, made a bigger sacrifice, but I'm like...
Joshua Burkhow: 14:18 Right.
Ari Kaplan: 14:55 That's worth it. I'm much more excited. But it took basically giving up that pinnacle of my career for something I would love more. And yeah, so it was just that moment of choosing. I literally could have stayed in baseball, you know, for the rest of my career. I had a very nice cushy job. But I wanted to seek out, and I felt this gut thing: I'm not happy. And it took me a while to realize what would make me happy and to do that. And it was being able to help almost anybody.
Joshua Burkhow: 15:26 Yeah. Yeah. And I feel like there's probably a parallel story here. I mean, we both love the Moneyball book. I did find it fascinating too. And I'm curious how much you found yourself in the book that was written. Because my fascination with that was, they're really picking up all of these ideas that were happening for 10 years prior to them, and they made it look like this is new and novel, but they took the best of everything.
Ari Kaplan: 15:54 Yeah.
Joshua Burkhow: 15:56 Is that kind of the same path for you and Databricks? Like, you're like, yeah, a lot of people have been playing here for a long time, but now I can grab the best ideas and apply them and bring that together to make the biggest impact. Was that it?
Ari Kaplan: 16:07 That's a great question. And actually, so it wasn't the way you described it. It's my ideas that I brought to the table. So with the A's, it was Billy Beane; I worked with Billy Beane, the general manager. They would take pieces, but then you literally have an organization that has its own structure, its own way of doing things. And they would use whatever I gave them and fold it into their own organization. With Databricks, it's been a lot of my ideas of how to communicate and how to help...
Joshua Burkhow: 16:14 Yeah.
Ari Kaplan: 16:35 customers, and how to educate them. Taking literally that next piece, doing something that hasn't been done before, creating my own content, inspiring others. And that was pure joy, because, again, Databricks, it's like a blank slate. It's a strong company. They know what they're doing, but they give me my own blank slate to go draw on. I can create what I want. I can influence the whole organization in a different way. And that's one of the strengths of Databricks, and I love that. And I have no regrets.
Joshua Burkhow: 17:12 Yeah, the evangelism side of things is fascinating. I mean, I started evangelizing the use case for analytics a lot before I started doing that for AI. I'm curious, from your standpoint, what does evangelism mean to you? And how do you think about it? I mean, you know, it's much more scalable than, hey, let's get everyone in a classroom and teach them about a thing. What's your philosophy on all that?
Ari Kaplan: 17:41 Yeah, I call my style, I'm an evangelist, but I'm a relatable evangelist. So I try to find out about the other person: what makes them happy, what problems they have, what should they care about. And then I want to go and plant seeds. I have this thing where I just say, I want to sow seeds with everyone. You can't just teach people and convince them. You have to sow seeds and let them grow on their own. From that, they have to make their own journey. And I plant enough seeds that they go on their own journey, and then they have that self-discovery, and then they want to learn more, and they think it's their idea. So I'm not trying to convince someone or lecture someone. I'm trying to, you know, get them excited, and they don't even know it, because it's their journey, just connecting the dots. And so, you know, it's maybe a lesson from quantum mechanics, about observing things changing the state. Can you get someone to do something without directly observing them and pushing them that way? I believe in that. So that's one of my philosophies. I mean, there are many others. What I also love is I have so many passions. I have racing, AI, baseball, even the humanitarian investigation work I do. I can usually use one of those things to connect with the person I talk to. And by connecting with them, I build that rapport, and then, you know, it's kind of natural.
Joshua Burkhow: 18:34 Right.
Ari Kaplan: 18:57 You know, it's like human to human and they like me, they like my style, they think I know what I'm doing and then they want to learn more. Like they're seeking it out and it's just more natural. So that's some of my style, but you know, I've gone to great lengths to try to create content, create storytelling and humor and, you know, yeah.
Joshua Burkhow: 19:14 Mmm. Yeah, no, I love that. I mean, just the idea of planting the seed, letting them go on their own journey. I'm a big fan of the Feynman technique of teaching and learning, and there's a little bit of that in here too. I think in order to teach something, you have to go find all of your intuitions so that you can explain a thing in a very, very understandable way. And then people can further explain that. So they kind of go on a journey, and there's so much beauty in that, because it's non-manipulative, it's...
Ari Kaplan: 19:42 Yeah.
Joshua Burkhow: 19:47 it's an open road, which I agree with. I mean, it's the antithesis of being told what to think or what to do. I absolutely love it. And that approach of planting seeds also gives you, I imagine, a lot more freedom to do the opposite, where you learn. The more you plant seeds and the more you have relationships and connections, the more you become this hub of everyone bringing their wisdom back to you, right? Which is like a compounding investment effect.
Ari Kaplan: 20:17 That is perfectly true. That's great insight, Joshua. Yeah, so when I try to teach people and help them and tell stories, I get tons of feedback, and I learn from their questions, from what was confusing. Very early in my career, I had my PhD and was very confident, academic, and someone asked me a basic question about metrics. And I totally misinterpreted their question. I was too academic, and I didn't realize that they were a practitioner who needed very specific answers; they didn't need to know the formula I could spend 20 minutes explaining. But from that, I learned. I'm like, oh, that's how I should communicate things. So now I start with the simplest thing, and then I can go deeper and deeper. And so, yeah, I learn, and it's more fun, because I'm not just talking to someone and teaching them. They're teaching me things. They're asking great questions. You get some cross-pollination going back and forth, because you both get stronger, you both get smarter. And it creates that energy, you know, when you see someone and you look forward to seeing them again because you're syncing, you just have the same energy you didn't even know you had. And that's kind of like what I love.
Joshua Burkhow: 21:31 Yeah. God, yeah, I mean, that's kind of what we're both doing here. We're pretty like-minded, and it's just kind of cool to riff on those things. I appreciate that feedback for me too, because I would say I still fall into the trap of wanting to go too deep too fast, especially when I talk about stuff that I'm interested in, and also I assume
Ari Kaplan: 21:42 Yeah.
Joshua Burkhow: 21:54 someone else knows. When they ask me a question, I kind of interpret it like they're a little bit further along than maybe they are, and I'll go deep a little bit too fast. So that's great. I love that. Okay, so you've worked in a lot of different things and lots of different industries: baseball, Formula One, AI in healthcare, as you mentioned. I mean, I'm curious, I mean, you had mentioned it earlier.
Joshua Burkhow: 22:22 You've mentioned this a lot, which is, you said you'd be happy to be able to help anybody, you know, anywhere. Did your life's experiences bring you to that conclusion? Or were you always a helper and a giver, trying to sort of make a difference in everything that you did?
Ari Kaplan: 22:41 No, so I was selfish as a younger person. I needed to kind of fulfill my own ambitions. And in sports, you know, the holy grail is winning a World Series. So to satisfy that and move on, great. And the other thing, and I won't spend too much time on this, you know, I was involved in the Raoul Wallenberg investigation when I was in my late 30s. And, you know, it's tough seeing just how bad things can be, and that you have an opportunity to do something about it. And I'm like, I can spend my life where...
Joshua Burkhow: 22:50 Yeah.
Ari Kaplan: 23:17 I have my life's mission be something noble, making a difference, versus just kind of trying to optimize things. Because I knew that I did want to help people. That's the Caltech mindset: you use science and technology, you improve people's lives. And I had to shift it from kind of selfish, like me, to more of other people. And that came more when I became a parent, but also when I reached what I thought was my pinnacle and I'm like,
Joshua Burkhow: 23:27 Right.
Ari Kaplan: 23:44 what's going to be that life mission. And I realized I needed to broaden it beyond just baseball or sports or healthcare, and allow myself to impact more and more people, but at a grassroots level. So you know, I'm never gonna be the CEO of a company. I could be on a board, I've been offered things. I don't want to be that. It's not the best use of me. I love, literally, hey, this insurance company has a problem, let me go help them. Or let me write some content that can inspire people.
Joshua Burkhow: 24:13 Yeah, well, you're doing it. I can tell you that for sure. And yeah, I mean, certainly for the rest of the episode here, now that we've laid the foundation of all the other stuff, I'll spend a little bit more time picking your brain on AI and machine learning, a lot of the stuff I'd love to talk to you about. But I'm curious for you, what's
Ari Kaplan: 24:15 because like I just, I just love that.
Joshua Burkhow: 24:36 Like when you were in sports, when you had your pinnacle moment, I'm gonna guess that machine learning wasn't at the state it is today, and, you know, data science teams were probably a little bit different. I mean, how has it changed since then? And I'm curious also if you're seeing or noticing patterns of how a data organization should be set up to succeed and use these tools well.
Ari Kaplan: 25:01 Yeah.
So one cool thing is that I was always at the forefront of things, whether it was the relational database, whether it was analytics back when it first started, whether it was cloud computing in 2000, whether it was big data and machine learning. With all the breakthroughs that have happened over the past decades, when they first came out, I was learning them and utilizing them. So what's been a great gift is that with every new thing that came out, I could say,
Joshua Burkhow: 25:11 Right.
Ari Kaplan: 25:32 I've used this. And so I can say, here's where you do this, here's where you do that. That's super helpful versus a textbook answer. But even in sports, we had AI. I used AI for draft picks with the Cubs. I would use AI for tracking players, whether they're going to succeed in the minor leagues, how they're developing. I used it for finding injured players, understanding, you know, what the injury rates are, what could reduce them. So I've used it,
Joshua Burkhow: 25:37 Right.
Ari Kaplan: 26:01 and now with generative AI, it's not just that people have been using large language models, since BERT and transformers before. What's special about generative AI is that anybody can use it. Like, my grandfather or my child or my next-door neighbor can start working with it with a very simple prompt and get powerful results. And that really democratizes it. So everyone should be involved. And yeah, that's kind of my take.
Joshua Burkhow: 26:29 I agree with you wholeheartedly on that. I think it's the same thing with the no-code, low-code stuff, which is not my favorite of all trends, but it really was this idea of trying to make things much more accessible. And I like that democratization is the theme of it. And I definitely feel like gen AI is doing that. What are some of the more common mistakes you see, especially in the rush of trying to get into gen AI? Are there mistakes that you're seeing organizations make, and how do you think they can avoid those?
Ari Kaplan: 26:59 Yeah.
The thing is, people try to force-fit generative AI, when you should be using gen AI for what it's good at. It's not good at everything. So if you wanted to do prediction for sports and just threw a bunch of prompts together and had an LLM do sports prediction, you're probably better off doing a regression or doing machine learning. That's not what gen AI is for. So knowing...
Joshua Burkhow: 27:12 Mm-hmm.
Ari Kaplan: 27:26 what problems Gen AI is good at. I think that's the first mistake, you know, using Gen AI, applying it to all these problems; you're using a hammer for a screw. The second thing is I think people focus on, what are the metrics, what are the evaluations, this is what I should optimize, and they don't take a step back and realize the big picture. I've seen companies come up with, we have to optimize for low latency, we need super fast response times for our chatbot.
Joshua Burkhow: 27:37 Yeah, yeah.
Ari Kaplan: 27:55 I'm like, that's not the thing to optimize. Why not optimize for the right answer? I'd rather have it take an extra second or two and get the right answer than get a wrong answer very quickly. And there are corporate cultures where they don't realize they're trying to solve the wrong thing, that they should be doing something very different. They may not have all the data they need. And just realizing where you are, it's not just the answer, it's part of the journey. They need to take a step back
Joshua Burkhow: 28:13 Yeah.
Right.
Ari Kaplan: 28:23 and get that broader context.
Joshua Burkhow: 28:26 That's, I mean, definitely sage advice, and I agree with you. It's picking the right problem. And I think we're all so keen to go solve problems with new technologies that we pick problems that aren't even necessarily the right priority of problem to solve. Especially when there are probably a lot of other options, like, we could solve this in a quarter of the time with a better person onboarded or a process updated. Like, why are we...
Ari Kaplan: 28:48 Yeah.
Joshua Burkhow: 28:51 jamming AI into this versus just making it simpler? There's always going to be that tension, and I think that's always going to be present for probably the next, I don't know, 10 years or so at least. So one thing I'm really curious about is, okay, so you made the transition from all these years in baseball, you ended up at Databricks, and now you're in there sort of helping people.
Ari Kaplan: 28:55 Yeah.
Yeah.
Joshua Burkhow: 29:17 Do you see, because part of what I wanted to do when we jumped on was to talk a little bit about how you approach sharing context with an LLM and things like that. But I'm wondering, and this is getting into the weeds a little bit, you talked early on about looking at player stats. I imagine you cared about the very specific context of the most granular data to see patterns. Are you taking that into
Ari Kaplan: 29:38 Yeah.
Joshua Burkhow: 29:45 account when you work with clients and you think about working with LLMs? Like, you're trying to give the LLM a lot of really, really good data so it can find patterns, so it can do its job? Or, like, what's your approach there?
Ari Kaplan: 29:58 Yeah, so it's actually a critical approach that's, I think, underappreciated. So I speak with a lot of people I'm working with, and they want to know, you know, how to do prompt engineering. I'm like, actually, the critical piece is you need data engineering. You need to understand your data. So before you do the prompt engineering, before you get all the evals, you need to have quality data, you know, but
Joshua Burkhow: 30:07 Mmm.
Ari Kaplan: 30:26 Not to generalize for all problems, there's certain problems where I do want quality data. There's other problems where, like with sports analytics or databases, I wanted to have complete data. But sometimes there's no need for that. And you could have some of the unstructured data. You could have noisy data, stuff that's incomplete. And that's okay. It just depends on what you're trying to do. But if you're doing customer service and you want to...
Joshua Burkhow: 30:47 Yeah.
Ari Kaplan: 30:53 have an LLM go through it, you want accurate, complete data. You want to know that that person is asking a question. Here's the full purchase history. Here's the full contact information. And so you just have to use the right data. And so, yeah, I love bringing up that data engineering is critical. And then, yeah, the more specific, the more granular the data can be is very important. And with Databricks, you can scale. You don't have to worry about data sizes anymore. You can just go really granular, like tracking
Joshua Burkhow: 31:02 Right.
Ari Kaplan: 31:22 baseballs' location and movement, spin rate, or player movements and arm angles. That just wasn't possible because maybe 10, 15 years ago, we didn't have the technology to measure it and have it affordable and accurate enough. But as you mentioned, once you have that granular data, the techniques I was using for many decades, you can leverage all of that. But you also then
Joshua Burkhow: 31:25 Yeah, yeah, yeah, yeah.
Ari Kaplan: 31:46 need more compute power. You need larger storage. And the whole cloud computing really brought that together. And so you kind of needed all these things to happen.
Joshua Burkhow: 31:54 Yeah, yeah, absolutely. Just to be clear, do you think that, going back to the foundation of good data management... I mean, when I joined Alteryx as their chief evangelist, one of the things that I wanted to tackle right away was educating my team, like, you guys got to have a talk track on the fact that we do self-service analytics, but analytics is only going to be as good as
Ari Kaplan: 32:16 Yeah.
Joshua Burkhow: 32:23 the quality of data you're putting into it. Even self-service analytics for the masses, it ain't going to save you if your data is shit and you don't have trust in it, and you don't even know if you have complete data, and it's not accurate in any form. I was a big champion for that for a while. But I'm curious, because there's a lot of bad data management out there, do we have to go fix all that to unleash Gen AI, or what do you think?
Ari Kaplan: 32:49 Yeah.
It totally depends on what they're trying to do. But one thing I want to say is I coined the phrase data plus AI. I made that the whole pillar. I just loved that we're AI powered by data. I was one of the first people to say we're a data and AI company. And everything I touched is powered by data and then doing AI. So you have to have quality data. But it doesn't always mean you have to clean all the data. You know, one thing I'll say is that I'm very involved in the data
Joshua Burkhow: 33:03 Right, yeah.
Ari Kaplan: 33:19 AI governance space and, like, okay, let's clean up and govern our data and make it go through the whole process. There's two approaches. If I had to start a new company or start a new project, I would say start with data that's in good shape. But the reality is that most companies don't have that choice. Most companies, they're not a startup. They already have, you know, years and years and years of data and it's kind of a mess. And so as part of generative AI and LLMs, you can use those tools
Joshua Burkhow: 33:30 Right.
Ari Kaplan: 33:47 to help automate cleaning up that data. So use AI to transform that data and the reality is, you know, many companies are already taking that approach of like, let's leverage our unstructured data, let's leverage our knowledge, but let's make it more structured and let's bring it together and let's kind of clean it up. I'm not saying that's the perfect approach, but in many cases they can kind of go on that journey and once they do that, they can get much better results.
Joshua Burkhow: 34:16 Yeah. That makes sense. Okay. So I had this idea I wanted to ask you about, because I know we're kind of towards the end here.
Ari Kaplan: 34:22 Yeah.
Joshua Burkhow: 34:24 But I really wanted to ask you, because I've done sort of the fast five before, but with you, I want to see if we could do a rapid fire sort of lightning round thing, because I have so many things I want to ask you. You just give me the off-the-top-of-your-head answer, but there's two paths here. If they're good ones, we could punt, we could extend one into a longer conversation, or we'll just do...
Ari Kaplan: 34:39 Yeah.
Joshua Burkhow: 34:51 We could do it straight up, straight down and call it a day.
Ari Kaplan: 34:54 Let's try it. I'm nervous, but I'll just say my lawyer will come out. I'll say "it depends" at times, but let's try it.
Joshua Burkhow: 34:55 Okay.
All right. There you go. There you go. Okay. We'll give it a shot. And if we feel it's too good, we can always do a part two, a completely different format. That'd be a fun one. Okay. Let's do it.
Ari Kaplan: 35:11 Yeah, let's do it.
Joshua Burkhow: 35:12 All right. Okay. Most people don't know that I, fill in the blank.
Ari Kaplan: 35:22 Geez, my wife won't like me saying this, but I'll say it anyway. I play video games a lot.
Joshua Burkhow: 35:26 We'll come back to that. All right. Okay. So if you could rewind time and live in any era, which would you choose?
Ari Kaplan: 35:37 I love the Renaissance.
Joshua Burkhow: 35:39 Okay. Okay. One piece of advice you'd give your 25-year-old self?
Ari Kaplan: 35:46 Listen more.
Joshua Burkhow: 35:47 Okay. If you could have dinner with any three people in history?
Ari Kaplan: 35:54 Einstein, Gandhi, and I'll go with what the heck, Cleopatra, let's go.
Joshua Burkhow: 36:00 I like it, I like it. All right, what's something you're currently learning or trying to get better at?
Ari Kaplan: 36:07 How to make impactful presentations.
Joshua Burkhow: 36:11 Okay, what's a controversial opinion you hold about AI or data?
Ari Kaplan: 36:18 Most of what people talk about like the future with regards to like the existential threat or super intelligence is total hype and B.S. That's a very controversial opinion.
Joshua Burkhow: 36:27 I love it. All right. If you weren't in tech or sports, what would you be doing?
Ari Kaplan: 36:33 A teacher.
Joshua Burkhow: 36:35 Love it. Okay. All right. We did it. Well, that was fun. That was a lot of fun. All right. I mean, we could keep going. A few of those, obviously, would be really fun to explore. I've got like 10 more I wanted to ask you, so we might have to do a part two on this. But I'll save it and we'll do it later. But yeah, Ari, I mean, I really could...
Joshua Burkhow: 36:40 We'll close it out and we'll leave it for another day. You ready?
Ari Kaplan: 36:45 I'll leave my lawyer face, the "it depends," off. All right.
Joshua Burkhow: 36:48 Yeah, just turn it off. All right, I'm going to throw you a softball here and then they get harder over time. The first one is the most underrated skill in AI.
Ari Kaplan: 36:59 Yeah, I would just say natural curiosity, learning new methods. It's changing so quickly that, like, don't get set in your ways. The automation of automation gets pretty meta. So, yeah.
Joshua Burkhow: 37:05 That's a good one. Curious. Love it. What are you most proud of in your career?
Ari Kaplan: 37:20 I would say helping the Cubs win the World Series since it rewarded generations of fans with that satisfaction.
Joshua Burkhow: 37:24 That's a pretty big one. That's right. Be careful, you're going to get pummeled by a whole bunch of Cubs fans. What's a hobby that you like doing that most people don't know about?
Ari Kaplan: 37:41 I have so many, but the one I still love is virtual reality. I love playing Meta Quest, like aside from family and work, I do that almost every night, many times a week. If you haven't tried video gaming on like VR, you got to try it. That's my hobby.
Joshua Burkhow: 37:45 Okay. Love it. I might have to jump into that one. All right. If we take the Cubs World Championship out of this one, what's your next best baseball moment?
Ari Kaplan: 38:10 Wow, I'd like to give two, but I'll give my number one. Ernie Banks, Mr. Cub, who coined the phrases "the friendly confines" and "let's play two," played in, I don't like the term, but it is the term, the Negro Leagues. He played in segregated baseball and became my best friend. That's one hand. The other hand is Michael Jordan, arguably the GOAT, if not him then LeBron James, but the best player of his time in basketball. Jordan, Ernie Banks, and myself in my office at Wrigley Field, just the three of us talking about life, talking about human rights, talking about racial discrimination in America and what they each personally faced. That was an incredible highlight.
Joshua Burkhow: 38:56 Right. I can't even imagine. Did Jordan try to get you over to the White Sox to do the analytics there or no?
Ari Kaplan: 39:02 You know, yeah, they still are in deep need and have many years to go. But Jordan, when he played baseball, it was an experiment. Can you take what could be the greatest athlete in basketball and have him play baseball? The answer was no. He had basketball skills, but for baseball...
Joshua Burkhow: 39:21 Yeah. Yeah.
Ari Kaplan: 39:25 He just didn't have it. But yeah, the White Sox need serious help.
Joshua Burkhow: 39:31 I don't know if we'll publish that. I don't want them coming after me. All right. I saw this and it caught my eye that you did a little bit of playing in a band at Caltech. Best jazz great? Who do you go to for jazz?
Ari Kaplan: 39:48 Oh, since you mentioned that, I'll say Michael Peter Balzary, otherwise known as Flea from Red Hot Chili Peppers. In a way, that's kind of tongue-in-cheek since I was in the jazz band at Caltech and he would, I hear he still does once in a while, sit in on the Thursday night jazz band. He would come in and just sit in and play with no fanfare, no media. But yeah, I don't know. I'm in Chicago.
Joshua Burkhow: 39:53 All right. Yeah. Yeah. Wow. Yeah, yeah.
Ari Kaplan: 40:15 We're great for blues and jazz. So I would say Dizzy Gillespie for pure skills would be my next go-to.
Joshua Burkhow: 40:16 I mean, exactly. Yep, that's a good one. That's a good one. Yep. All right, you sort of answered this previously, but I'd love to do a retake of it. If you had a free plane ticket in your hand, would you go back to somewhere that you've already been, that you just absolutely love and admire, or would you choose a new place you've never been to?
Ari Kaplan: 40:45 It's hard. I'm still in the mindset. There's so many places I haven't been to. So I would try something new.
Joshua Burkhow: 40:50 Same. Yeah. Yeah. Even the risk that it ends up terrible is still rewarding in that regard. I'm the same.
Ari Kaplan: 40:55 Yeah. Yeah. Try a new culture, try to meet a different type of person or have an experience I haven't had before.
Joshua Burkhow: 41:02 Right. Yeah. Love it. I love it. Ari, my friend, thank you so much for your time. We'll have to sync up calendars so I get to see you somewhere on the road and all our travels. But I really appreciate you being on the podcast. Amazing conversation.
Ari Kaplan: 41:21 Yeah, you bet. No, this has been great. You're an awesome person. So honored to have been on here and great to run into you and we'll sync up and figure out meeting in person.
Joshua Burkhow: 41:29 Absolutely. Thank you again, Ari. Appreciate you.
Ari Kaplan: 41:32 Thank you, appreciate you.
Joshua Burkhow: 41:34 Well, that was a lot of fun. Here are a few key takeaways that I took from my conversation with Ari. Stay relentlessly curious, but also practical. You know, the success in Ari's career is about building what's actually usable in the real world, staying a few years ahead of the curve, then moving on once breakthroughs become commonplace. The second thing is better data changes the game. You know, sports analytics exploded as the inputs evolved: from box scores to play-by-play to pitch-by-pitch, and even sensor and camera data today. The more granular the data, the more actionable the insights became. Now, this last one is one that I really hold closely: I believe that AI is ultimately a human story. You know, the tech is powerful, but the real work is navigating how AI is going to reshape our identities, our jobs, and our decision-making, ultimately freeing all of us up from mundane tasks. The Wallenberg investigation is really a reminder that your data skills can serve something bigger than yourself: uncovering truth, bringing closure, and even driving a positive impact. Now, before you go, if you liked today's episode, can I ask you a personal favor and have you share it with your friends and colleagues? Let us know what resonated and what you want us to explore next. You can email us at podcast@alteryx.com, that's A-L-T-E-R-Y-X. You can subscribe to Alter Everything on YouTube, Spotify, Apple Podcasts, or wherever you listen. Thank you for listening. We'll see you next time.