Join us in a discussion with Richie Cotton, Senior Data Evangelist at DataCamp and host of the DataFramed podcast, for a special crossover episode exploring the hottest topics in data science, analytics, and artificial intelligence! Don’t miss the full video of this conversation on YouTube! Discover whether AI will reduce the need to learn coding, the real-world applications of AI agents, and what it truly means to be an "AI-first" company. Get expert insights, practical advice for building a career in data and AI, and learn how to stay ahead in a rapidly changing tech landscape.
Ep 192 - DataFramed Crossover
===
[00:00:00] Introduction to the Podcast
---
[00:00:00] Megan Bowers: Welcome to Alter Everything, a podcast about data science and analytics culture. I'm Megan Bowers, and today I'm talking with Richie Cotton, Senior Data Evangelist at DataCamp. In this episode, we chat about all kinds of data and AI hot topics. We'll discuss questions like, "Will AI reduce our need to learn coding?", "What are applications of AI agents today?", and "What does it mean to be an AI-first company?". Let's get started.
[00:00:32] Meet the Hosts
---
[00:00:32] Megan Bowers: Hi, everybody. Excited to be here for a new type of episode for me, at least. I'm Megan Bowers, and I am a Senior Content Manager at Alteryx, and I'm host of the Alter Everything podcast, which is a podcast about data science and analytics culture. So excited to chat about data hot topics today.
[00:00:53] Richie Cotton: Yeah. Uh, great to be chatting with you, Megan. Uh, so I'm Richie. I am a Senior Data Evangelist at DataCamp. Uh, so, uh, I'm also a podcast host. I host the DataFramed podcast about data and AI. So, uh, great to have a collaboration with you.
[00:01:09] Megan Bowers: Yeah, definitely.
[00:01:10] The Role of AI in Coding
---
[00:01:10] Richie Cotton: I wanna talk a bit about, uh, changes in skills. So, uh, at DataCamp, we do a lot of, uh, education for people wanting to learn about data skills or wanting to learn some AI skills. And one of the big questions we have is, "Well, AI can increasingly write good code, so do I still need to learn how to write code?" So, uh, since you're working for a company that makes a low-code tool, do you wanna talk about how that has changed things?
[00:01:36] Megan Bowers: It is really fun to, to work at Alteryx and talk to just the variety and like, diversity of people that use Alteryx and, and like the diversity of roles in the business. So we see a lot of business analyst-type roles jumping in with Alteryx, whether that's someone in supply chain who you know is getting drowned in the data and the spreadsheets and the supplier name challenges, or it's folks in finance who need more governability, auditability, and like, often these people aren't coming from a data background. Sometimes they've been pushed into it, sometimes they've raised their hand for something like, "Oh, sure, I can help with an ERP migration," and then all of a sudden, like they are a data person. Like, all of a sudden, that kind of like happens to them or, or they lean into it, they say, "Okay, I, I do enjoy working with data". But yeah, I mean, that's like one thing that I see all the time is that users come from more of like the business background, the domain expertise, and they are able to pick up the data analytics with the, the low-code, no-code tool that we have.
It's cool to watch, and it does kind of raise the question of, "Do analysts need these, these coding skills?" And it's hard to come up with a one-size-fits-all answer, but I do feel like it's moving away from, you know, "You have to know R, you have to know Python" to "You have to be able to pick up some of these tools," and the tools might change, you know, you might get new software, um, to learn. And so that's, like, one of the things I've kind of observed. And I think, I mean, when it comes to more of, like, the data science role, um, I have just been reading a little bit more on that, and I think there are some shifts away from being as heavy on the coding, since AI can help with the coding, moving towards, like, the advising, the monitoring, the AI kind of advisor role. But, you know, I've seen literature on this, or writing on this, and I'm like, "Ooh, I, I don't know if I've seen those job descriptions quite yet." But yeah. I'm curious what your take is.
[00:03:50] Richie Cotton: Yeah, yeah, I mean, I do think there's a big difference, uh, depending on what your job is. So like you mentioned, even if you have never wanted to be a data person, you've gone into business or you've gone into, like, uh, some other role, data's kind of invading everything, right? I see a lot of people where it's like, "I have no intention to be a data person, but now I'm somehow a data person." And there, it's like, you really do want a lot of tooling assistance, 'cause maybe you don't care about the nitty-gritty of data so much, you just want to get your job done. Whereas if you are actively a data professional, you know, you've got into this on purpose, you've been mindful, "I love working with data," then yeah, you probably want a bit more control. Uh, and so coding is still incredibly useful in some cases. And maybe it's different between data analysts and data scientists. I think for a lot of data analytics roles, particularly business analytics, Alteryx plus a BI tool, that's amazing. You don't need to write code. You will solve all your problems. Uh, but for data scientists, I, I think Jupyter notebooks are probably still the kind of standard interface. And so, yeah, uh, a bit of Python, a bit of SQL, that's gonna get you a long way.
[00:04:52] Megan Bowers: Yeah, and something I think about too, just seeing some of the code generation or, or people using Claude or, or OpenAI to help with generating code. I think about, "Well, what if it doesn't work or what if you run into problems or like, yeah, you get some errors or you have issues six months down the line?". Like you do still need to be able to troubleshoot, um, and to be able to like, diagnose, and like, yeah, I think that there's that element to it as well.
[00:05:21] Richie Cotton: I'm right there with you. And I think, uh, that's one of the things we've been thinking a lot about, um, at DataCamp, as we create courses where you learn to code. Originally, the courses were very sort of focused on, like, "Well, you need to learn the syntax." And now it's less about that. It's like, "Well, you know, the AI can help you with the syntax; you need to understand the concept of, like, what are you actually trying to do?" And you also need to be able to read the code that the AI has generated and understand it, just to make sure, "Is this actually right or not?" And be able to debug it, 'cause if something goes wrong, you need to have a clue about, "Oh, okay, what's happened here? What do I need to do to fix it?" Being able to read stuff is more important than being able to write stuff, I think, at the moment.
[00:05:59] Megan Bowers: Yeah. Yeah, that makes a lot of sense.
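Richie's point about reading AI-generated code is easy to make concrete. Below is a hypothetical sketch (not from the episode) of the kind of subtle bug an assistant can produce: code that runs without any error but quietly computes a wrong value, which only careful reading or debugging catches.

```python
# Hypothetical example: a percent-change helper an AI assistant might
# plausibly generate. It runs without raising an error, but careful
# reading reveals a subtle bug.

def pct_change_buggy(values):
    """Intended: percent change between consecutive values."""
    changes = []
    for i in range(len(values)):
        # Bug: when i == 0, values[i - 1] is values[-1], the LAST element,
        # thanks to Python's negative indexing. No crash, just a wrong number.
        changes.append((values[i] - values[i - 1]) * 100 / values[i - 1])
    return changes

def pct_change_fixed(values):
    """Correct: start at index 1 so every change has a real predecessor."""
    return [
        (values[i] - values[i - 1]) * 100 / values[i - 1]
        for i in range(1, len(values))
    ]

monthly_revenue = [100, 110, 99]
print(pct_change_buggy(monthly_revenue))  # first entry compares month 1 to month 3
print(pct_change_fixed(monthly_revenue))  # → [10.0, -10.0]
```

The buggy version would sail through a quick demo; spotting it requires exactly the read-and-verify skill discussed above.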
[00:06:01] AI Agents: Hype vs. Reality
---
[00:06:01] Richie Cotton: So, yeah, uh, beyond this, uh, I think you had a, a story about, uh, AI agents.
[00:06:06] Megan Bowers: Yeah, I can jump in. I mean, we, I know we wanted to talk about like AI agents and like the hype versus reality. "Are we there yet?" which is a hard question to answer. I appreciate like Gartner and other companies trying to survey people on this, but, but I did find some interesting, like, applications on this one Wall Street Journal article where they were talking about, "Okay, how are companies using AI agents today?". This was like, more beginning of the year, and there was an interesting one at eBay, where they're using agents to like, write code and create marketing campaigns. It was talking about how they kind of have this like agent framework where it kind of determines which AI models are used for which tasks. So to me, that felt like a use case at the next level. It's not just, "We have some AI models we're using". It's like, "We are kind of orchestrating and deploying, sending out," I don't know exactly how it works, but you know, "sending out the right agent for the right task, that kind of higher level of, of determining, you know, what is needed for which task". And I thought that that was, that was interesting. And they're, they're using it for code as well to like, yeah, generate code, improve code, things like that.
[00:07:22] Richie Cotton: Yeah, it just seemed like there were a lot of use cases, and maybe the definition of AI agent is kind of very broad, because it's become a marketing term at the moment; everyone wants to say, "Yeah, we're using AI agents." And so there's some very simple stuff where it's basically just a business process encoded in software, and, like, maybe there's an LLM in it somewhere. And then you have all these companies promising, like, AI workers. So you've got, um, this Cognition Labs has this "Devin AI software engineer." You've got, like, Julius AI's got, um, an "AI data scientist." And there's even a "universal AI employee," which, um, sounds very ambitious. I'm not quite sure these work completely yet. Like, you can't quite replace humans reliably. Uh, I don't know whether you have a sense of, like, where the cutoff point is for, "What's an agent that actually works?"
[00:08:17] Megan Bowers: It is tough. I mean, what you said about it being sometimes marketing, there's almost this "AI agent-washing," like greenwashing, where you take something that's close to an AI agent and you spin it out as, "We, we have an AI agent." So it can be kind of tough, if you don't know the details, to know, "What exactly truly is agentic AI being deployed at these companies?" Some of these are still maybe more in experimental phases. There was this, like, prediction in a report by Gartner that 40% of agentic AI projects will be canceled by 2027, and I was like, "Whoa, that's a, that's a big number," and I think it speaks to some of the early-stage-ness of it all.
[00:09:03] Richie Cotton: Although that's surprisingly low, actually. Like, only 40%? That means 60% is actually gonna work. I, I guess that's me seeing the glass half full. Wildly optimistic, maybe.
[00:09:10] Megan Bowers: Okay, okay. I don't know. I was just like, "Oh, if people are missing billions of dollars," like, 40% is a lot of dollars that are just gonna disappear because of, like, unclear business value or risk, but you know. I appreciate the glass-half-full take, Richie. Yeah, you're like, "60% of these agents working, that could translate to people losing their jobs, if the agents are, like, a digital workforce." It's true, 60% is pretty high.
[00:09:40] Richie Cotton: Definitely. I mean, I think there's, like, um, the very simple agents, uh, and maybe that's where most of the action is at the moment. So, um, regardless, like, uh, one of my colleagues in the sales team created this agent where, after every sales call, the salespeople are supposed to write about, like, the state of the deal, like what happened in the call, and use this framework called MEDDPICC, which I don't really fully understand. The sales thing just basically says, "These are the attributes, like, where we're up to with the deal." And all the account executives, the business development reps, they really hate doing this, 'cause writing up what happened in the call is tedious, and they'd rather be going after new deals. So there's an agent now. It takes the transcript, it summarizes it according to the format, it uploads the details wherever they're archived, in Salesforce somewhere. So it's doing a task that humans just really, really hate doing, and they do it badly because they hate it. Um, and so automating that kind of task is, like, fairly straightforward. It works, and it doesn't even have to be perfect. You just have to be better than a human that doesn't care. Um, so, uh, that's just a really nice example. I don't know whether you've seen similar examples of, like, simple things that you can automate.
[00:10:51] Megan Bowers: Yeah, I mean, I think it's the automation piece that kind of moves it into the agent category. Like, you're not the one copying and pasting the transcript, sending it to ChatGPT, getting it back, sending it out. If there's that automation of the task, I feel like that is the first thing that I've seen where it's just, like, the next, it is kind of the next logical step. Where, "Okay, um, you know, we've seen people in Alteryx, like, uh, pull in LLMs to enrich or to clean up their data, things like that." Um, but then once that is incorporated into, like, a full automated workflow, and you can schedule it, it runs, it does this, like, I think it gets more into that AI agent category.
And I've also seen, um, kind of the agents starting to be used in cases where, just, uh, you wanna get to answers, uh, but you're trying to pull from many data sources, not just a web scrape of the internet. You want your business sources, you want your Salesforce data in there, you want your customer data, or maybe not customer data, but, you know, all these different data sources at your company. And having an interface where you can interact with all of that and even, like, automate some reporting, things like that, I think, um, is where I've seen some companies moving.
[00:12:19] Richie Cotton: Okay, yeah. So I, I like the idea of, um, at least automated data cleaning, 'cause that's the part of analytics no one really likes. I know no one's like, "Yeah, I love cleaning data." So, yeah, uh, that's a great idea: just, uh, automating the bits of your job that you don't like. Uh, and it's good for your mental health anyway, even if you don't care about the productivity benefits.
[00:12:42] Megan Bowers: Right? Yeah. Creating lookup tables is not good for your mental health, so being able to automate the data cleaning and swapping and all of that, totally. Same with, yeah, memorizing coding syntax. Like, I don't know, people already are Googling syntax. You just can't remember it all the time. So anyway. Yeah, I like that.
[00:13:09] Richie Cotton: Absolutely.
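The call-summary agent Richie describes can be sketched in a few lines. This is a minimal illustration, not the actual DataCamp implementation: the field list below is the standard MEDDPICC acronym expansion, the LLM is stubbed out as a plain callable, and a real version would call an actual model API and the Salesforce REST API to upload the result.

```python
# Minimal sketch of a call-summary agent (illustrative only; all names and
# the prompt wording are assumptions, not the real implementation).

MEDDPICC_FIELDS = [
    "Metrics", "Economic buyer", "Decision criteria", "Decision process",
    "Paper process", "Identify pain", "Champion", "Competition",
]

def build_prompt(transcript: str) -> str:
    """Wrap a raw call transcript in a structured MEDDPICC summary prompt."""
    headings = "\n".join(f"- {field}" for field in MEDDPICC_FIELDS)
    return (
        "Summarize this sales call under the following MEDDPICC headings:\n"
        f"{headings}\n\nTranscript:\n{transcript}"
    )

def summarize_call(transcript: str, llm) -> dict:
    """Run the transcript through an LLM (passed in as a callable) and key
    the result by MEDDPICC field, ready to upload to the CRM."""
    raw = llm(build_prompt(transcript))
    # Assumption: the model answers with one line per heading, in order.
    lines = raw.strip().splitlines()
    return dict(zip(MEDDPICC_FIELDS, lines))

# Stub LLM for demonstration: returns a canned one-line answer per field.
def fake_llm(prompt: str) -> str:
    return "\n".join(f"summary of {field}" for field in MEDDPICC_FIELDS)

summary = summarize_call("Rep: Hi, thanks for joining...", fake_llm)
print(summary["Champion"])  # → summary of Champion
```

The point of the sketch is the shape of the task: tedious, formulaic, and tolerant of imperfection, which is exactly what makes it a good first agent.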
And I guess, uh, so maybe one of the bigger developments over the last few months has been that these deep research tools are getting quite popular, and they seem to be sort of getting to the point where they actually do what they promised to do, which is just think a bit more carefully. And so you can have sort of slightly more complex reasoning as well. If you build these into agents, I feel like that's the next step: having agents that can think a little and make smart decisions. I don't know, have, have you played around with the deep research tools at all? Have you, uh, seen whether they work or not?
[00:13:41] Megan Bowers: I've seen some colleagues play around with them and show some kind of things around, say they have a new prospect or a new customer, and they really wanna know, like, "I wanna help their business with our software. I wanna know all about their business." And the amount of information, a 12-page to 20-page research report on, "This is the business, these are the challenges." Like, some really interesting information, and I was kind of impressed by just the volume of it all, um, as well as the amount of time that, um, this was through ChatGPT, that ChatGPT took to think about all of this. Yeah, I, like, love using AI for research, but haven't played around as much myself with the deep research. But what about you? Have you seen some interesting results from that?
[00:14:35] Richie Cotton: Yeah, so it just seemed to be one of those things where, um, the theoretically really good use case is doing customer research or competitive research, where you're like, "I don't really want to go and read through, like, 20 blog posts from this competitor. I just wanna see what they're doing." But then, yeah, uh, the deep research tools, they go away, they spend like 10 minutes, they come back with a 20-page report, and you're like, "Well, do, do I wanna read 20 pages on this?" Then you have to use AI again to summarize what the deep research tool did.
[00:15:09] Megan Bowers: That was a little bit my reaction. It occurred to me when, when this 20-page document was shown, I was like, "Woo, gonna need an executive summary."
[00:15:16] Richie Cotton: Yeah.
So maybe like, uh, it, it's a work in progress. I feel like at some point it's probably gonna get very useful, but for now it's kind of "use with caution," and you really do have to spend quite a lot of effort, I think, just on prompt engineering to make sure you're getting exactly what you want there.
[00:15:31] Megan Bowers: Right. And something that made me think, when you said, you know, "the next step of maybe incorporating this mode into AI agents": it is more resource-intensive. And if there are issues with really defining the true business use case, the true ROI of an AI agent, and then you put in this insanely costly or, like, resource-heavy computing with the deep research mode, it's like, "Oh, yeah, definitely proceed with caution," in my mind. You definitely would wanna kind of work your way up, maybe, to incorporating a model like that, to make sure that you truly need that power and you're not just making it advanced research just to make it advanced research. That's just something that comes to mind.
[00:16:16] Richie Cotton: Absolutely. I mean, you can research, uh, things like forever, but unless you actually act on that or you've got like the right information in there, then it's just a, a, a sort of a waste of everyone's time, I suppose.
[00:16:27] AI-First Companies: What Does It Mean?
---
[00:16:27] Megan Bowers: A lot of this, we're talking about a lot of AI applications, and there's been headlines around companies saying like, "they're an AI-first company". So I'm curious like, what your take on that is, on what that really means.
[00:16:41] Richie Cotton: Oh, yeah. I mean, I have very mixed opinions on this. So I think, uh, on the positive side, it's like, "Yeah, in every business, there are gonna be lots of use cases of AI that people have not yet thought of." And it's important for CEOs to motivate people to be like, "Yes, let's go and see what we can do here." Um, the cynical side of me is like, "Well, actually, this is just, um, a polite way of saying we wanna do a hiring freeze, or we wanna make layoffs," and this is just a way to make it more publicly acceptable. So there are definitely gonna be, um, some sketchier cases where this is just a sort of PR front around, uh, layoffs or around hiring freezes. But in general, I think it is good for almost every business to start thinking, in fact, not just about AI, but in general, about, "How can you go about, uh, re-engineering your processes?" Because the tooling has just got so much better for, like, uh, automating things with software and doing new things with, with LLMs, doing new things with, um, computer vision. It just means there are a lot of use cases for making your processes better. And as you start thinking about changing your processes, you might go, "Okay, well, we don't actually even need a technological fix. It's just something simple like, 'Oh, well, this workflow is stupid, let's do things a different way,' and we're gonna see some improvements there."
[00:18:03] Megan Bowers: Yeah, yeah, definitely. And I think that, you know, I, I can also be a little on the cynical side with the top-down mandates or the "We are AI-first." And it's interesting to see, there was one example with, uh, the company Klarna, where they went AI-first for their customer service chatbots, and it went from humans to AI, and then they ended up rolling that back and starting to hire back some humans, because the customer feedback just wasn't there. They were getting, yeah, lower satisfaction. Like, I think that at the end of the day, the AI that they implemented wasn't truly what the customers wanted. And I think that's a good thing to keep in mind for all these companies when you're saying "AI-first." I think it's important to start considering, whatever business unit you're in, wherever you are, "How could I use AI to make this better?" But then, you know, if you're putting it in your products, your offerings, your services: "Okay, is this truly what the customer wants? Does this really solve their problem? Does this really improve their experience?" I think that's gonna be a more important question moving forward.
[00:19:25] Richie Cotton: Yeah, definitely. No, I do find that, uh, Klarna case very fascinating because, I mean, with everyone saying we're going AI-first now, it's like, Klarna started this process back in 2023. They're very much ahead of the curve. Um, I think they did have some successes. Like, uh, they sort of had this, um, I think GPT-powered chatbot that was hooked up to their recommendation engine, so it was giving product recommendations, and that kind of worked. The area where it didn't work so well was with trying to completely automate customer support. I think it's good to think carefully about, "Where do you want your humans in the process still?" and, "Are there any absolute limits, like, 'We will not get rid of these humans,'" as you're starting to think about, uh, incorporating AI?
[00:20:07] Megan Bowers: Yeah. Yeah, definitely. That "Where's the human in the loop? Where is the review point?" kind of question, no matter what process you're looking at improving with AI.
[00:20:17] Richie Cotton: Yeah, definitely. Well, well, actually, as I say it out loud, you're probably not gonna know that in advance, right? You probably just have to do some experiments and say, "Okay, well, maybe we take it too far and we roll it back," like Klarna. Like, you can always undo these decisions, right? You can always hire more people. Not, not the most efficient way of doing things, but.
[00:20:36] Megan Bowers: Yeah, not without some reputation damage in this case, but yeah, you're right. I mean, you have to have some sort of testing at least to, to get a sense of where you might need review points. But I, in my mind, ideally that testing is done either as a, as a beta test or internal tests or whatever, before you launch it out to all your customers.
[00:20:57] Richie Cotton: Yeah, absolutely. Uh, you're probably gonna want to like, just trial the AI-first process at least a little bit and see, "Do your customers hate it or not?" before, uh, before you roll it out to everyone.
[00:21:09] Evolving Data Roles
---
[00:21:09] Richie Cotton: Okay, uh, so, um, I guess, uh, the next thing to talk about is, um, the changes in data roles. Uh, so, um, for a long time, there was this idea that the data scientist was the "sexiest job of the 21st century." I never really liked that phrasing, actually, 'cause, you know, it's not really true. "Sexy" is not a good adjective for most data science work. Uh, yeah, uh, spending all day, uh, swearing at a Jupyter notebook is, is, uh, not really that sexy. But anyway, data engineering has kind of arguably, uh, claimed that crown. Uh, so data engineering is super hot at the moment. And then also, um, you've got analytics engineering, a sort of, uh, more recent role as well. Um, so yeah, I guess let's talk about those roles. Uh, do you wanna talk us through, uh, how you feel about those roles?
[00:22:02] Megan Bowers: I feel like the, the more of the focus on the data engineering, the analytics engineering, it's all this more focus on getting the good, clean data in pipelines so that it can be used for AI. So it makes more sense to me that as we start to automate some of the, maybe some of the modeling or AI-assisted modeling and, you know, all this stuff, it's going to be just even more important that we have the reliable data. 'Cause if you feed bad quality data through these AI systems, then you're left with something even worse. I don't know, uh, that's been like a theme, I think, on, on our podcast, just the importance of good data, of the right data for your business, of all of those aspects. And I just think that, obviously, there are data scientists who have these huge, really broad roles that kind of are already doing this, and maybe they will just shift their focus a little bit more, or we'll see job descriptions that shift more into the data preparation and data pipeline, um, element. But yeah, that's just what I've been seeing, of just the importance of, uh, having the right data set up for these AI models so they can actually get the value out of it. Um, and I think that will continue to kind of shift what is wanted in the market for, for data-type roles. What have you been seeing?
[00:23:36] Richie Cotton: Yeah, I mean, it's really interesting, um, the idea that sometimes you really want, like, high quality control in your analysis. And, um, I feel like for a lot of analytics, that's half the job, and the other half is just very much ad-hoc analytics. So, um, the stuff I do is like, "Oh, it's trying to work out, like, which podcast episode is more popular." Very little quality control needed. It's just a quick look at the data. But for, um, my colleagues in the, in the finance department, where it's like, "Okay, we need to know exactly how much, like, ARR the company has, how much cash flow we have," uh, uh, they do paranoid data science, so everything has gotta be absolutely spot on, and the quality controls are there, and you need to, uh, have those pipelines in place. And that's where, um, the data engineering, the analytics engineering, making sure every "i" is dotted and every "t" is crossed, uh, that's where it becomes incredibly important, I think.
[00:24:30] Megan Bowers: Yeah. Yeah, definitely. It's important to have that trust element and like the, the quality control, I think, that you were talking about. Definitely.
[00:24:40] Richie Cotton: Absolutely.
Um, so actually, uh, with the analytics engineer, I think it's incredibly interesting, because a lot of the job there is that, um, when you've got different, um, teams having different definitions for things, their job is to kind of synchronize that. So I guess a common thing is, like, maybe one team says, "Okay, uh, anyone who has registered on my website, they're a customer." Another team might say, "Well, okay, they actually have to buy something and pay us some money in the last financial year, and that counts as being a customer." And so different teams having different definitions of stuff causes a whole lot of confusion. And, uh, for analytics engineers, one of their jobs is to create this semantic layer of, like, what are the proper definitions of, uh, any business metrics that are used by all teams. And I love that kind of synchronization, just reducing confusion throughout your business, which is incredibly important, particularly as you scale your business and you've got more teams in different locations.
[00:25:35] Megan Bowers: Yeah. Yeah, I think that's awesome. I think that is kind of an age-old problem of, "Oh, these people in this business unit mean this by this metric, but someone else in another business unit will see that on a dashboard and freak out." And so I really love having that business layer. Like, there have been data dictionaries, you know, that layer to describe, "Okay, these are the rows and columns you're looking at," but having that for the business, I think, is really important. And I guess that does kind of transition into what I wanted to talk about next, because it gets into this, like, domain knowledge: "What happens with domain knowledge, the business knowledge, as we move in the data space into more of the AI-powered, more of the automation?"
[00:26:25] Richie Cotton: Oh, yeah, absolutely. Because, um, just thinking about it, if, um, you've got seven different definitions of what is a customer, and you're asking your AI chatbot, "How many customers do we have?", it doesn't know which one to use. You're gonna get different answers, different times. It's kind of probabilistic or it's a little bit, uh, random. So you really want that single definition. So you need that semantic layer, that single definition of any given business metric. Otherwise, the, the AI is not gonna be able to give you a good, consistent answer.
[00:26:55] Megan Bowers: Yeah, definitely.
[00:26:56] Richie Cotton: Yeah. And so, uh, you, you were talking about the importance of, um, domain knowledge. Do you have a sense of, like, what data practitioners need to know about their, their domain or their, their business?
[00:27:07] Megan Bowers: Yeah, it's a big question,
[00:27:08] Richie Cotton: Right?
[00:27:08] Megan Bowers: It's a big question. That's a good question. I'm gonna go with the classic, "It depends." I think it can really depend on whether the data practitioners are in, like, a center of excellence function, where they are, you know, the data analytics, data engineering, whatever team, servicing a bunch of different teams, versus being a finance analyst or being, you know, a marketing analyst person in the marketing department. I think definitely the threshold for that second type of role is higher. Like, you're expected to have more of the domain knowledge. Overall, though, the biggest thing for me when it comes to domain knowledge is, you need to know what your business stakeholders care about. Like, what metrics are most important to them, and the why, kind of thing. Understanding, um, and asking good questions around the data you're dealing with, like, "Oh, what does it mean when this number goes way down?" Or asking those probing questions to get at, you know, "Why do you want this analysis?" Because occasionally stakeholders will come and think that they want one thing, and as you dive deeper, you realize, "Oh, actually, I know about the data at this company; there's actually more data we could use to really get you some more insights," but they don't know, 'cause they're focused on their own roles. And so, like, asking those questions and having a feedback loop with the business side, no matter how embedded your role is, I, I think is just a crucial, a crucial skill, especially if some of the technology skill pieces might be automated.
[00:28:49] Richie Cotton: I do love the idea that, um, it's incredibly important to be able to translate between business questions and data questions, just going backwards and forwards between the two. I think that's a, a pretty timeless skill. Um, so, yeah, uh, definitely one to work on for a lot of people. It was definitely interesting, too, that you talk about the difference between being in a central data team and being, uh, embedded as a data practitioner within, like, a, a commercial team or some other team. So I think that's definitely a trend I've seen, is that, um, for, for data analysts and data scientists, they are increasingly becoming embedded. So unless you're a large company, it's sort of fairly rare to have, um, analysts in central data teams. More and more it's like, "My job title isn't 'data analyst,' it's 'sales analyst' 'cause I'm in a sales team," or "I'm a 'marketing analyst' because I'm in the marketing team," or "I'm doing data plus product and I'm in a product team." So, uh, I don't know whether that's something you've seen as well, and do you think that's changing the nature of the data analyst role?
[00:29:50] Megan Bowers: I do see that at smaller organizations. In my mind, the centralized function, where they're taking in tickets and managing things like that, can maybe be a more mature kind of data structure, but I'm not sure. But I do see a lot of embedded analysts, and even, going back to the very beginning of the episode, I see people whose teams need an analyst, and they step up and become an analyst even though that's not their background. Something interesting from a conversation a few months back on the podcast was with an author who talks about this role of "citizen data scientist". I guess it's a newer industry term that I think is really interesting: those people who are almost "50% business, 50% data scientist," and they're able to do that because of the low-code, no-code tools that have arisen. And so, yeah, also going back to what you said on, "There's data everywhere": no matter what role you're in, data's gonna find its way into your work. So there's some evolution of these roles, and I'm curious to see how that'll continue to play out. It's hard to know.
[00:31:19] Richie Cotton: Yeah, definitely. I have to say, I'm totally there with you that, even if your background isn't data, the tooling is there now, so it's very easy to do at least some analyses, even if being a data analyst isn't your whole job. Just picking up some simple Alteryx workflows, or drawing some dashboards in Power BI or Tableau or Looker or whatever the platform is, these aren't terribly difficult skills to learn. So it's something you can do in addition to having your sales, marketing, product, whatever, knowledge. So yeah, you can do it as part of your job rather than it being your full-time everything.
[00:32:04] Megan Bowers: Yeah. But then I do think it gets a little trickier as you move into the more traditionally technical roles, like data science and data engineering. "What is that? Are there citizen versions of that?". That can get a little harder to balance with the business side, probably.
[00:32:24] Richie Cotton: Yeah, definitely. I'm trying to think what's a more difficult thing. So maybe time-series forecasting: it's a very useful thing for business analytics, but it's also a pretty technical skill. You're gonna have to spend a lot more time learning that. The thing is with statistical stuff, like hypothesis testing, it's one of those things where, when people take courses on DataCamp, we have to push it further back in the curriculum, 'cause it's the first thing that's really conceptually tricky. Everyone gets the hang of drawing plots pretty quickly. Well, not everyone, but most people get the hang of drawing plots pretty quickly, and maybe working with data is fairly easy to learn for most people; you've just gotta spend the time. But with statistics, you really gotta stop and think carefully, 'cause a lot of it's counterintuitive. So, yeah, there are definitely some conceptually harder things that maybe you do have to put in a lot of time to do well, although maybe AI will make it easier for everyone.
[00:33:28] Megan Bowers: Maybe it will. Yeah. It's kind of a crazy time. Even just preparing for this episode, looking at all of the articles out there on AI, like, "AI-first this, AI-first that, these companies are doing this, this is not getting value, this is getting value," sometimes I feel like I can't keep up a little bit. It's challenging to keep up with the pace of all this sometimes.
[00:33:56] Richie Cotton: Absolutely. Yeah, I mean, I think it's you and 8 billion other people trying to figure out, "How do you keep up?". Continuing to learn stuff, especially when you've got a day job and you've got deadlines, is difficult, and part of it's just about building that habit of spending time to learn. So whether you block off half an hour on a Friday afternoon to just sit and read stuff, or whether you try to make it 10 minutes on your commute every day, building that habit of learning is incredibly important. I don't know, do you have any tips for how to build learning into your day-to-day life?
[00:34:37] Megan Bowers: I mean, I think it is important to incorporate it as you can, in small bits. And for me, it's been important, when I do have a little bit of extra time, to just take a bit of time to experiment. If I hear about a new tool, or about somebody else's use case, I do kind of the "learn by doing," at least with generative AI. For me, that's been helpful, to play around with it and see it in action. But yeah, for people who wanna learn more, the DataCamp courses, I think, could be super useful to get into that.
[00:35:19] Richie Cotton: Absolutely.
Um, and so I think one of the things I've found especially helpful is, you know, there are all these papers on arXiv, like new technology developments, and, frankly, I can't be bothered to read the whole thing. So it's like, okay, download the PDF, throw it into one of these AI chatbots: "Can you just give me the executive summary?", maybe a bit more detailed than the abstract, and just ask a few simple questions to try and understand what's going on there. I've found that incredibly helpful.
[00:35:47] Megan Bowers: Yeah, that makes me think, maybe the logical next step of this podcast is creating an AI agent that sends you automated summaries on data hot topics, if we go through the next steps of what AI agents could do. But anyway, that's more of a silly example.
[00:36:04] Richie Cotton: I mean, beyond this, I have to say, with my LinkedIn feed, I follow a lot of AI influencers, so it's very difficult to go on LinkedIn without seeing someone talking about whatever the hot new thing is. So, yeah, it's definitely fairly easy to passively consume this stuff when it's being thrust at you.
[00:36:25] Megan Bowers: Yeah. Yeah, definitely.
[00:36:26] Building a Career in Data and AI
---
[00:36:26] Richie Cotton: Do you have any like, final advice for people who want to like, build their data or AI careers?.
[00:36:33] Megan Bowers: What's interesting is, typically on our podcast, it's "advice for people getting into data". And for that, my biggest thing is, "Find the data where you are in your role now and start experimenting, start trying to use these new tools". Whether that looks like automating your Excel analysis, downloading a visualization tool, or experimenting more with an LLM, I think that's a great place to start. In terms of building on an already thriving career in data as we see more AI, I do think a piece of it is trying to keep up with some of these announcements, and understanding, when certain new technologies come out, when that could be transformative for your work. And instead of maybe shying away, like, "Oh, gosh, that tool, maybe it could replace my job in five years or something," leaning into, "How could I use some of the stuff I have now to show and demonstrate that I am automating and improving with AI, and be the person that people start coming to at your company when they have AI questions or use cases or ideas?". If you can get to more of that AI advisor role, whether unofficially or officially, I think that's huge. How about you? What's your advice?
[00:38:05] Richie Cotton: Yeah, no, I love that idea of getting to the point where other people ask you for advice, just by understanding use cases and having at least the first clue about what to do next, developing an intuition for what's sensible and what's not. For me, a lot of the stuff we talked about on the show today, the idea of having a mix of technical skills and business skills, and beyond that the soft skills, the communication skills, a sense of ownership, all this kind of stuff, that's a package that's really gonna help your career, I think. Beyond that, I guess the one thing we've not really talked about is the idea that everyone's going "AI-first," and it's all about process engineering. So understanding process analytics, that's the one area of analytics I think is dramatically underrated. No one tends to care about it, but it's the future.
[00:38:58] Megan Bowers: Yeah. That's awesome. Yeah.
[00:38:59] Conclusion and Farewell
---
[00:38:59] Megan Bowers: Thanks so much for this conversation. This was super fun.
[00:39:02] Richie Cotton: Absolutely. I really enjoyed chatting with you, Megan.
[00:39:06] Megan Bowers: Thanks for listening. To learn more about DataCamp and the DataFramed podcast, head over to our show notes on alteryx.com/podcast. And if you liked this episode, leave us a review. See you next time.