Data Science Mixer

Tune in for data science and cocktails.

From studying extreme environments to using data to answer business questions, Tessa Jones from Calligo shares how her science-focused mind has helped her succeed in data science. 






Cocktail Conversation




During the episode, Tessa mentioned she sometimes thinks "business scientists" is a better name for her team than "data scientists," because it reflects how they understand the whole business landscape as a larger ecosystem.


What do you think about that name? Do you think it better represents your work? Is there another option you'd like better?


Join the conversation by commenting below!




Episode Transcription

SUSAN: 00:00

Welcome to Data Science Mixer, a podcast featuring top experts in lively and informative conversations that will change the way you do data science. I'm Susan Currie Sivek, the data science journalist for the Alteryx Community. For today's episode, I talked with Tessa Jones.

TESSA: 00:16

I'm the Director of Data Science for Calligo, a company based out of London.

SUSAN: 00:23

Awesome. And where are you physically based?

TESSA: 00:25

I'm based out of Seattle.

SUSAN: 00:27

Okay. Very cool. Would you mind sharing with us which pronouns you use?

TESSA: 00:30

She, her.

SUSAN: 00:32

And are you possibly having a happy hour drink or snack as we're visiting today?

TESSA: 00:36

I'm having a chai tea right now. Yes, I'm very much a coffee in the morning and then tea in the mid-morning kind of person.

SUSAN: 00:45

Oh, yeah. Yeah. Yeah. Same here. I'm having a, I think it's called, Autumn Sunrise mix. So, yeah. Had to move on to the herbal tea though, so. Otherwise, this would have been a far too animated conversation. [laughter] Tessa has recently been on a new intercultural adventure with Calligo, which she'll share in a moment. But as you'll hear, it's the first of many amazing experiences she's had in her career. First, as a scientist exploring both earth and space, and then as a data scientist who emphasizes keeping the science in data science. She has great insights to share, including what it means to think of data science as existing within a larger ecosystem and how models might consider ROI alongside standard metrics. It's all part of how Tessa thinks about building not just machine learning models, but machine learning solutions. Let's hear a little bit more about that intercultural adventure. Great.

TESSA: 01:45

Actually, our company just got acquired by Calligo. So we're their first kind of venture into the United States, so.

SUSAN: 01:54

Oh, awesome. Very cool. How is that going with different cultures joining together?

TESSA: 01:59

Super fascinating. It's really interesting.

SUSAN: 02:01

Yeah. I bet. I bet.

TESSA: 02:02

Yeah. Yeah. It's really interesting. And I'm part of our internal JEDI (justice, equity, diversity, and inclusion) committee, and I'm passionate about that. And it's just been interesting to hear things that are relevant to them that I hadn't considered. Things like royal lines and things like that, that I think, "Of course, that makes sense." But it's not something I'm exposed to, so it's been interesting.

SUSAN: 02:24

Right. Right. How has that come up, if you can talk about it?

TESSA: 02:27

Yeah. No. It was mostly just a conversation between our two DEI kind of communities coming together. Understanding some of the things and the different topics of areas of possible discrimination. And just through the conversation, some of the people on the UK side were just talking about apparently there's some people with royal lines that work for Calligo. And it's just interesting to think that that's actually a differentiator in the company. Anyway. So just the interesting things to think about that are new to me.

SUSAN: 03:03

Yeah. Yeah. Not the usual for an American audience or company?

TESSA: 03:06

Right. Yeah. Exactly.

SUSAN: 03:09

That's so cool. So Tessa, one of the things that I noticed as I was looking over your LinkedIn and all that good stuff, I thought it was so interesting that you have worked in an underground lab, but also for a space science company. And I thought those were really cool extremes. And I'm just very curious to hear about how that all progressed and how that all eventually led you to data science?

TESSA: 03:33

Yeah. Yeah. Great. Well, actually, a lot of it came out of the research I was doing as a grad student, the underground laboratory work. Because I was working on research related to thermodynamics in aqueous geochemistry. So basically, modeling groundwater flow and thermodynamics. Dissolution of minerals and modeling those things in extreme environments. Which once you get to be a mile or two underground, those are kind of extreme environments. And so that's where a lot of the focus of my research was. And working with biologists to study the impact of the water on extremophiles, microbes that live in extreme environments, like high-pressure, high-temperature environments. I worked there for six years, I think. I spent every day underground walking around in dark tunnels. And it was a pretty fascinating world to be exposed to.

SUSAN: 04:38

So do you park outside and go down an elevator every day into the tunnel system and go to your lab from there?

TESSA: 04:45

Kind of. It's not quite that sophisticated. It's not an elevator. So actually, the lab was put into an old abandoned gold mine that had been operating for over 100 years. 126 years, I think. And so basically what they did is they just leveraged the fact that they had this mine that went 2 miles deep and it had 300 miles worth of drift or tunneling throughout. And there's essentially two very large vertical openings that go all the way down. And they have shaft elevators. They're not your classical elevators. They're a lot more clunky and--

TESSA: 05:28

Yeah, yeah, yeah. For one of them, you have to wear the harnesses to make sure you're not going to fall hundreds of feet and things like that. And so it was a really fun, interesting experience on multiple levels. Yeah. And then after that-- so I hopped to a couple of different universities during my grad experience. And my first one I had an adviser where I focused more on space science. So studying Mars mostly. And so she had connected with me-- I was wanting to move on to something different. She had connected me with a company in California that collaborates with NASA and JPL on missions. And so I got a job with them and ended up working on the Curiosity mission, where a lot of my job there was to help build some of the programming around automating the process. Basically, the daily life cycle of the rover. And a lot of my job there was helping to automate it with a much larger team and then sort of helping to be a liaison between the technical team and the geologists, so.

SUSAN: 06:44

Wow. Very cool. So were you really excited to see Perseverance recently?

TESSA: 06:47

Yes. That was so fun. Yes, very exciting. Felt very reminiscent. And it's a pretty neat thing to be a witness to, yeah. It was pretty wild. I think the wildest part of that experience for me was definitely trying to live on Mars time. Because when the rovers first land, everyone who's kind of contributing to the operation of the rover, you have to live relative to Mars time, which is 45 minutes different than Earth time, roughly. And so your days shift every day. So after roughly two weeks, your middle of the day becomes your middle of the night. And then two weeks later, you're back to where you started. But it's constantly rotating. So it was kind of an interesting thing. So, psychological challenge. [laughter]

SUSAN: 07:38

Yeah. Yeah. So first, the underground psychological challenge and then the time warp psychological challenge. It's so interesting. So working across time zones not a big deal for you, I imagine, after--

TESSA: 07:48

Yeah. Not really. Sure. Yeah.

SUSAN: 07:52

So tell me a little bit about your current role and how you ended up doing what you do now?

TESSA: 07:58

Yeah. Well, from there I worked for a company called [SIMAR?], where I was more of an individual contributor on a data science team. And this was really my first exposure to the business world and kind of understanding where my background kind of met with the business world. And I was very much the technical individual contributor just building models. I had a lot of buffer and support from people to kind of mediate the direction and strategy and things like that. And that was really fun. I did that for a couple of years. But then I wanted to move into consulting because consulting is really interesting. Because you get to have exposure to lots of different problems and lots of different companies, and I just thought it sounded really interesting. And actually, that's when I moved more into a strategic role with a more holistic view of data science as it lives within businesses. So sometimes I'm an individual contributor, but oftentimes I'm more like strategic oversight for data science and machine learning projects. Or even developing strategies on how to optimize the process of going into a company, understanding their needs, helping guide them through their needs, and then to deliver on them. We're in this weird time where it's a very common problem that people are faced with in the business world, I think, so.

SUSAN: 09:33

Yeah. Yeah. Absolutely. I'd love to come back to that in a little bit and talk more about how you do that because that does sound like an interesting and frequent challenge.

TESSA: 09:41

Yeah. I'm pretty passionate about that challenge. It's a fascinating one. It's an interesting nut to crack.

SUSAN: 09:47

Yeah. Definitely. Definitely. But just to kind of wrap-up on your background a little bit, one thing that you mentioned to me in a previous conversation that we may have had prior to this was that you were very interested in bringing or maintaining the importance of the science of data science. And I wonder if you could tell me a little bit about what that means to you?

TESSA: 10:07

Yeah. I mean, for me, it means a few different things. So when I think about science, like when I did research and was fully engaged in academia and the scientific exploration, one of the most important parts of it is finding your fundamental question. What is it we actually care about? And then being able to address those questions and answer them and actually move through the process of exploring that in a way that supports future development in that whole area of work. And piecing all that together is actually probably one of the more difficult components of the whole scientific process, in my mind. If you think about a ladder and at the bottom of the ladder, it's these very technical concepts of math or even knowledge about processes or how things work or very specific technical things, and then being able to go all the way up the ladder to, okay, what is actually the whole system, ecosystem, that it fits into? And what are the fundamental questions relative to that? That skill is probably the most important skill. It's the science in data science. And I think as we've progressed through how data science as a market and as a human experience is evolving, the science, I think, has been the one piece of it that's been a little bit wavering. People are confused about what it means. And I've often wanted to refer to my team more as business scientists rather than data scientists, because then it's like you think about different kinds of scientists, right? In another world, I would have considered myself maybe a chemist. A certain kind of scientist studying chemical interactions and what have you. And I really think it's critical to think about businesses as that. It's your field of exploration. And so thinking of it more in terms of the business being the ecosystem that you're studying and trying to improve. That's kind of what I mean by the science of data science. I think that that piece of it is pretty critical.

SUSAN: 12:23

Yeah. Yeah. I think there's so many interesting points in there. But I especially love the sort of metaphor of looking at data science within this larger ecosystem. Maybe we could talk a little bit about some of the tools that you use in your work and how you approach that?

TESSA: 12:41

For sure. Yeah. Relative to what we were just talking about, at one point, myself and another colleague, Hunter Barrett, wrote a book called Science Plus Data, and it's mostly intended for-- it's largely targeted for more non-technical people. But it kind of does get into the weeds of what is the scientific process. And in mapping that out, for me, what it did is it empowered me to differentiate between what part of that scientific process should be heavily leaning on machines and what part of it should we keep as a human process? And actually, finding the best of both of those worlds. And that's very much my philosophy, in general. And I think there are several different tools that are being developed out there that do that in various degrees. And I would say that the lowest hanging fruit is automating some of the very technical aspects of it, right? Making it relatively easy to build a decision tree model or a classification model of something and making just that very technical process easier, so that the person doesn't have to go in and code it every time. And that's been the lowest hanging fruit. And I think we're moving into a space where the technology is going to evolve more as a market. Where it's not really about replacing the data scientists. I think you need that highly skilled individual. But it's about optimizing the time of that individual so that you don't have to have a team of 10 PhD, very expensive, data scientists. You can actually reduce that down to just a couple if you're optimizing where their mind, attention, thoughtfulness is being put. And then also including it with a team that collaborates with the data engineering, data visualization side of things. And as a consultant, that's the most important thing to optimize. And we have to be able to insert all of that into whatever tools our customers have. The real benefit we've seen in Alteryx is that it can support lots of things. So the birth of Alteryx was really around data prep, right? 
And that's where its bread-and-butter was for a long time. It was very good at that. And it really democratized data in a really cool way. The first thing I loved about it is the fact that I could build something, whether it be in Alteryx or not, depending on where they were at in their data science machine learning journey and where we were trying to go. And I could just plop it into Alteryx into some sort of workflow that someone else has already built. And so in that way, it is very easy for me as a consultant to come in and either build it within Alteryx if the problem set is right for it or build it outside of Alteryx and just plop it in. And that's been a huge advantage for us when we're working with customers that are leveraging Alteryx, so.

SUSAN: 15:54

Yeah. That makes a lot of sense. And as we were talking about earlier, that issue of coming into that organization, looking at their process, trying to figure out the best ways to fit within it.

TESSA: 16:06

Yeah. Yeah. We're moving more into a model of, "We will be one of those data science heads for you and we will work with the technology however we feel appropriate in an ongoing fashion." One of the difficulties I've seen in companies that leverage data scientists is that they'll have a need and they hire one or many data scientists and they're excited and they're running and they get it going. And then there's this downfall in need. And it's this highly volatile sort of process. And it can end up meaning that your data scientists aren't optimally utilized because they kind of tend to have these lull times, and they're trying to figure out how to make things useful for the business. That's at least what I've heard from several different business owners. Obviously, that's not always the case. But, yeah.

SUSAN: 17:08

I can see how it would be like, "Well, we developed this new thing. Now what? Where do we go with that? And then how do we deal with the consequences of implementing that new thing?" Because that becomes a whole nother question, it would seem.

TESSA: 17:19

Definitely, yeah. Yeah. And a lot of times when you build something it kind of depends on the use case. But if you build it and you deploy it and it's implemented, you have to give it a little time before you can even determine--

SUSAN: 17:31

Right. Did this do anything? What is it doing?

TESSA: 17:34

Yeah. Yeah. Like, "What are the effects of this? Did we do what we thought we did? Did it have the effect we wanted it to have?" Yeah. You have to kind of let it live a little bit of a life before you can really determine that.

SUSAN: 17:46

Let my model live. Cool. Very interesting. So one word that has come up a number of times in our conversation so far is automation. And I'm curious, what kind of role is automation playing in the work that you're doing right now?

TESSA: 18:01

Yeah. Super good question. For me, it kind of goes back to what I was talking about earlier around the steps of the data science process and understanding what is automatable. And so for me, an ideal place to be, which there is no ideal, really, but we'll pretend there is for a second.

SUSAN: 18:22

We can dream. We can dream. [laughter]

TESSA: 18:23

Yeah. [laughter] Is to optimally automate what can be automatable to support the human input of the development of something or the maintenance of something. And it's just a complicated problem. So maybe you go into a business, and you have a problem statement or a problem you're trying to tackle. And the first thing to do is really just explore the data after you've defined your question. And you really know kind of what you're trying to do, you have to go through this whole process of exploring the data. And some of that is going to be incredibly repeatable. There's some things you just always do when you're looking at data. You're always going to look at correlations and distributions and things like that. There's just certain things you're going to always look at. And I think the more you can sort of automate that process and then the more time the scientist is spending just reviewing the output and looking at, "Okay. Well, what does this mean for what I'm trying to accomplish?" And that's where the focus of their time can be.
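The "things you always look at" step Tessa describes — distributions, correlations, missing values — is exactly the kind of repeatable exploration that automates well. A minimal sketch in Python; the dataframe and column names here are hypothetical illustrations, not from the episode:

```python
import pandas as pd

def explore(df: pd.DataFrame) -> dict:
    """Automate the checks you 'always do' so the scientist can
    spend time interpreting the output instead of producing it."""
    numeric = df.select_dtypes("number")
    return {
        "shape": df.shape,                          # rows, columns
        "missing": df.isna().sum().to_dict(),       # nulls per column
        "distributions": numeric.describe().to_dict(),  # count/mean/std/quartiles
        "correlations": numeric.corr().to_dict(),   # pairwise Pearson r
    }

# Hypothetical example data
df = pd.DataFrame({"tenure": [1, 3, 5, 7], "salary": [50, 60, 65, 80]})
report = explore(df)
```

The point is not the four lines of pandas; it's that once this runs on every new dataset automatically, the human's first task becomes reviewing the report, not writing it.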

TESSA: 19:30

And then there's, of course, the huge nut to crack around deployment, right? Deployment and monitoring. I feel like that's one of the bigger things that can be what I would call probably semiautomated. Because it's going to be different for all systems and all end users. But as much as you can optimize or automate that, the better off we'll be. Because you want to be able to spend your time focusing on bringing value to the business. That is the critical thinking. Which, of course, deployment is critical thinking. I don't mean to imply that it's not because certainly, it is. There's absolutely technical critical hurdles that are not necessarily easy all the time, but there's the potential for them to be automated in ways that some of the critical thinking around how will my end-user use these results, how will they interpret these results, how can I explain how this object works in a way that makes it usable for them? Those types of critical thinking, I mean, we're just never going to automate that. I don't think we should. Because then also that's the piece where we start getting into the ethical conversation. You're not going to be able to hold yourself ethically compliant if you're not allowing room and space for you to actually look and observe that I think.

SUSAN: 20:55

Yeah. Automating ethical judgments is probably not too feasible.

TESSA: 21:00

Yeah. Well, and I've thought about ways where you could probably semiautomate-- I like to say semiautomate because it doesn't commit to complete automation. We've been kind of talking about this a lot lately, now that we're acquired by a UK company that has a lot more restrictions and mindfulness around ethics and privacy with data. And we've been thinking about, "Well, what are the ways in which we can mine for information automatically that allows us to go through that creative process?" And not that we would depend on it for the development of the explainability or the ethical assessment, but what information can we bring to the surface that supports that in an automated way?

SUSAN: 21:46

Right. Right. Can you tell me a little bit about what kinds of information specifically then you're looking at for that process?

TESSA: 21:56

For that process? Well, I mean, it's a big nut for us right now that we've just sort of started, and so this will not be a complete answer by any means. But a lot of it is, like, common fields that will automatically be flagged for consideration. And there's lots of fields like this. One example: we were building an attrition model. This was a couple of years ago. And we were building an attrition model for a company. And we ingested all the data just automatically and started doing our exploration and started building out some boosted models to predict attrition and realized that one of the features that was having a big impact on the gain of the model was actually race. And so obviously that's going to be a point of contention. You're going to have to sit there and consider-- well, in this case, we just threw it out because that was the responsible thing to do. But there are certain fields where you kind of escalate it. Like race, gender, birthday, things like that. PII-type data. And so I think that's one of the things that I'm trying to learn, actually. And fortunately, the company that just acquired us, they have a fantastic privacy and ethics organization wholly committed to that. So I'm looking forward to learning a lot more about that whole arena.
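The escalation Tessa describes — certain fields always get surfaced for human review before modeling — can be semiautomated with a simple watch list. A minimal sketch; the field names and the `SENSITIVE` set are illustrative assumptions, not Calligo's actual list:

```python
# Hypothetical watch list of fields to escalate for ethical review
# before they are allowed into a model (per the race/gender/birthday
# example above). A real list would be maintained by a privacy team.
SENSITIVE = {"race", "gender", "birthday", "date_of_birth", "ssn"}

def flag_sensitive(columns):
    """Return the ingested columns that match the watch list,
    so a human decides whether to drop or justify each one."""
    return sorted(c for c in columns if c.lower() in SENSITIVE)

flags = flag_sensitive(["tenure", "Race", "salary", "gender"])
```

Note this only *surfaces* the fields; the decision to drop them, as the team did with race in the attrition model, stays with a person.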

SUSAN: 23:25

So one of the other things that we touched on earlier, but maybe we could come back to is in your consulting role, coming into companies, identifying the kinds of things that they need to become more data-oriented to use their data more effectively. Do you have any tips, guidelines, advice for people who are facing that kind of process themselves?

TESSA: 23:46

I definitely have thoughts on this. Trying to decide how to condense them down. The ladder analogy I used earlier, I think it's a skill set that's critical to not only develop, but be aware of where you're at. And really, one of the more critical pieces is to, first, understand who you're talking to. So have curiosity about the person themselves. Because really, at the end of the day, you need to be able to understand how they're going to be able to relate to whatever it is you're going to develop. So I've had lots of different stakeholders, some of which don't want to know anything about the technical development. Some of them want to make sure they understand how all the business works. Some of them want to learn as we go. And I think it's really critical, first, to understand that because it really builds the relationship between you and the stakeholder in a way where you're just going to end up with a better product, but you're also just going to have a better overall experience, I think. So I would say that's the number one thing to do. And then--

SUSAN: 24:55

So human connection before technical depth?

TESSA: 24:59

Yeah. Yeah. And then this is where I think the scientific process really comes into play. Because then after that, I think what you do is you kind of go into a little bit of learning mode. Which means also asking lots of questions around the overall ecosystem of what they're dealing with. I think it's really important as a data scientist to not just accept what they tell you they need. You have to push in farther. Because, oftentimes, especially if you're-- well, almost all the time, you're talking to someone who doesn't necessarily and shouldn't necessarily understand the nitty-gritty of what data science is most useful for them. Sometimes they have a really great idea and you can run with it, but I think it's important to also understand the greater landscape and to push in and make sure you're kind of getting all the pieces that you possibly can. Because what that ultimately ends up doing is making sure that you're asking the right fundamental questions. And I think in order to be a good business scientist, you need to have that whole landscape.

TESSA: 26:04

And then the third thing I'll say I think is critical in those conversations is-- so the ladder analogy I used earlier, and I say ladder because it's what I use in my own head as I'm doing this. The bottom rung is probably something so detailed, as to a specific data point or something like that. And then the top rung is something more like, what is your business driving towards? Are you driving to reduce cost of your repair cycles, for example? What's your primary objective? How are you measured as an organization as a whole? So if you think about that ladder, as you're having these conversations, I think you need to be able to go up and down that ladder a lot. For example, if you have a manufacturing company, and when you're manufacturing things, there's the whole QA cycle. Where it's like, "Okay. Errors come up. How do you fix it and how do you optimize the repair cycle of actually fixing the thing?" Which actually has huge implications for your cost of your whole operation of product.

TESSA: 27:12

And the stakeholder might come at you with a question of, "Okay. I want to optimize this." And as a data scientist, assuming the stakeholder isn't going to care about all the nitty-gritty of, "Okay. What models are we creating?" All the technical kind of nuance. But you need to be thinking about it, right? You need to be thinking about, "Okay. Well, what are all the different kinds of models that we might be using for that? What kind of data might we need?" And you actually sort of have to be thinking about that simultaneously as talking to somebody. Which can be an incredibly difficult thing to do, but it's kind of what you have to do. Because then that's where you go to the bottom of the ladder and then you come back up with your questions to understand, "Okay. This is kind of what I'm thinking about. Understanding, okay, how does it actually affect the top rungs of the ladder?" And I don't know if the analogy really works, but it works for me.

SUSAN: 28:03

It sounds like a mental workout going up and down the ladder. [laughter] But yeah, no, I think that does work. I mean, the idea of having to think at those different scales, at those different business levels, different technical levels, and interpersonal levels. I mean, there's a lot going on there.

TESSA: 28:20

There is. Yeah. And I think the key element of that third kind of thing to keep in mind is having self-awareness around your ladder. [laughter] Because I think oftentimes data scientists tend to-- I mean, I've fallen prey to this where I get really excited at the bottom rungs because it's the cool practical part of things. And I kind of can find myself hovering over the bottom rungs and then realizing that I look up, and oh, my gosh, everybody's eyes are glossed over. Nobody really knows what I'm talking about. And then it's like the conversation doesn't end up being as useful, so.

SUSAN: 28:59

Sure. Sure. Well, let's geek out on technical things for a moment, then, so that you can enjoy those bottom rungs of the ladder. What are some interesting developments, things that you are most excited about in data science right now? Things you're intrigued by? Things you want to experiment with? What's interesting to you right now?

TESSA: 29:18

I think one of the things that-- well, there's a couple of things I'm really interested in. In a more technical space, I think we've been really excited about how to embed ROI into the actual models themselves. So a lot of times, there's different ways to think about ROI, right? If you think about ROI as-- if you're a business person, you say, "Okay. We built these models and then here's the return that we got based on that." Like, "This is how much cost savings we had after it was implemented." But there's actually ways where you can take the business logic, business understanding of how the cost cycles work, or how the revenue gain works. And you can actually embed it into the learning process of the machines themselves.

TESSA: 30:12

And I think that is a really cool technical problem. Someone on my team, actually, is Blake Rutherford. And he's kind of the brainchild of some of the original things that have been popping up in this arena. What Blake has started to develop, and we're developing as a team, is taking it to the next step where, yes, error matters, but maybe what matters more is optimizing for the business logic we're trying to optimize. The end goal isn't necessarily-- the business stakeholder isn't saying, "I want the most optimized model, the most accurate model." Maybe that's it. But really, at the end of the day, that's not really it. It's, "I want a model that's optimized for my business objective."

SUSAN: 30:56

Right. Right. Precision is nice, but we like profit. [laughter]

TESSA: 31:00

Right. Yeah. Exactly. Exactly. Yeah. Yeah. Yeah. So, for example, in the manufacturing example that we were just talking about, you build a model. Classically, it would be optimized based on the model's error. It's honing in on its optimal state based on error, right? And instead, you optimize it based on how much it costs overall. So the recommendation might actually have you replacing a few different parts, but overall, it ends up being less expensive. Things like that. And so that's kind of the exciting thing that I'm most excited about. And really appreciative of the brilliant team that I have that's coming up with these cool ideas, so.
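One simple way to "optimize based on how much it costs overall" rather than on error is to choose the model's decision threshold by total business cost. This is only a sketch of the general idea, not the method Tessa's team built; the $500 false-negative and $40 false-positive costs, labels, and scores are all hypothetical:

```python
import numpy as np

# Hypothetical costs: missing a defective part (false negative) is far
# more expensive than an unnecessary repair (false positive).
COST_FN, COST_FP = 500.0, 40.0

def business_cost(y_true, y_prob, threshold):
    """Total dollar cost of acting on predictions at a given threshold."""
    y_pred = (y_prob >= threshold).astype(int)
    fn = np.sum((y_true == 1) & (y_pred == 0))  # missed defects
    fp = np.sum((y_true == 0) & (y_pred == 1))  # needless repairs
    return fn * COST_FN + fp * COST_FP

def best_threshold(y_true, y_prob):
    """Pick the threshold that minimizes cost, not classification error."""
    grid = np.linspace(0.05, 0.95, 19)
    return min(grid, key=lambda t: business_cost(y_true, y_prob, t))

# Toy labels (1 = defective) and model scores
y_true = np.array([0, 0, 0, 1, 1])
y_prob = np.array([0.1, 0.3, 0.45, 0.4, 0.9])
t = best_threshold(y_true, y_prob)
```

The asymmetric costs pull the threshold well below 0.5: the model tolerates extra repairs to avoid expensive missed defects, which is exactly the "replace a few more parts, spend less overall" trade-off described above. Embedding the logic directly into training, as a custom loss, is the deeper version of the same idea.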

SUSAN: 31:47

Yeah. Yeah. That does sound very cool. Very exciting. So we have a little time left. So we have a recurring segment on the podcast called The Alternative Hypothesis. We just are curious about with each of our guests, what is something that people think is true about being a data scientist or about data science that you actually think is incorrect and why?

TESSA: 32:12

Wow. That's a good one. I have maybe five different answers to this because I think there's so many misconceptions about data science. And not necessarily misconceptions. Just different than the way I think of it, I guess, is probably a more accurate way of putting it. One of the main misconceptions is that the main skill of a data scientist is to be able to code machine learning algorithms as opposed to build machine learning solutions for a business outcome. Those are two different things. I would say that's one. And then I'll go to the other side of the spectrum, where I've heard a lot of people say that a data scientist is somebody who is really good at pulling data in from disparate sources and doing cool visuals that make it meaningful to the business stakeholder. And I've seen data scientists kind of defined in these two end cases. And I guess I feel like it's more of a merging of the two is how I've kind of experienced it.

SUSAN: 33:25

Yeah. Tell me more about that first point, the difference that you see between coding machine learning models and coming up with machine learning solutions.

TESSA: 33:34

Yeah. And I don't know a lot about the data science programs that exist in universities. I have to say that as a caveat. But just it seems like there's this overemphasis on the nuance of knowing the packages to run. Like, "Oh, this is the optimal package to use for building an XGBoost model or building a clustering model." And there's less emphasis on the nuances of when it matters or why it matters or the science. I consider that to be very mathematical coding oriented and then bringing in the science on top of that I think is the difficult piece. Because I think people are trying to, this is just my impression, stuff science-- being a scientist is kind of a-- it's a skill that I feel like some people have intuitively, but largely, it's learned through experience of the experimental process and things like that. And I think without that component to it, you end up building a lot of things that are really cool. They are. People build cool things and they're super fascinating. But ultimately, they don't always end up being super useful to the business because they're missing the scientific exploration that it applies to in the business science kind of analogy I used earlier.

SUSAN: 35:05

That question of needing to know the how and the why and the what for and what is the larger motivation and application that we're going to use for this really cool thing that we're making.

TESSA: 35:15

Yeah. Right. Yeah. Exactly. Honestly, and the really sad part of it for me that I've seen as an outcome is scientists or machine learning experts, coding computer science experts who build these things and they are so cool, and then, unfortunately, they're not always appreciated because the connection between those worlds isn't being made. And it's sad on both sides, honestly. [laughter]

SUSAN: 35:40

Right. Right. Everybody's strengths are kind of not being used to their best capacity. Yeah. Wow. Is there a recent project that has taught you something new that you're excited about?

TESSA: 35:53

Well, it's interesting, because the last couple of projects that I have been on have been largely focused on more general consulting. Like consulting around, how do you manage business needs and the development of analytical tools? Everything from visual tools, to machine learning tools, to basic statistical models. The whole array. And how do you actually manage that when you are a BI organization? And I've had a couple of exposures to that throughout my career, both internally and externally. But as of late, I've had a couple of projects that are committed to that. Where it's just helping guide through how you put that together. And it's not as interesting technically or whatever, but it is critical to everything.

TESSA: 36:57

And I think that I'm a pretty passionate person, and I love the things that I create. Typically, I have a lot of excitement about them. I have excitement for the things that my team creates. And knowing that, in order to properly facilitate it, it needs to live within an ecosystem that's healthy. And so it's been really fascinating for me to work with some high-level BI leaders who are asking themselves this question. Like, "How do you actually structure not only a team but a process and the data to allow for this?" Like, "Who owns different things? What does it look like to build it? Who's responsible? What's the right process, and what are the right skill sets for the different stages of the process?" And I get really excited about that, too, because I think it's just critical to all of this. So that's been a really fascinating thing to dive into. And the problem is, or the answer, quote-unquote, is that it's different for everyone, unfortunately. I mean, I think in my naivete, when Hunter and I wrote this Science Plus Data book, we had structured, oh, the ideal team. Which at the time, it's a good structure, but--

SUSAN: 38:20

Sure. Starting point.

TESSA: 38:21

--I've grown in my own working hypotheses around these things. And I don't think there's a one-size-fits-all. I think it depends on a lot of things, so.

SUSAN: 38:30

Yeah. You really get into all those subtleties and particular contextual issues as you dive in more, I'm sure?

TESSA: 38:36

Right. Yeah. Exactly.

SUSAN: 38:38

And it must really call upon your interpersonal and diplomatic skills too, as you're helping people figure all this out?

TESSA: 38:45

Oh, definitely. Yes. Yes. Because it requires that. Well, it requires first investigating, understanding, and knowing. And then, of course, there are the difficult pieces of, "This person's really skilled in these ways and can contribute in these ways, but they really need support in this way." And it's difficult. As humans, it's hard for us to always be humble and be like, "Oh, yes. This is the part where I need a partner. And if actually we're together as a unit, it will work better this way." Or pushing back on a company, saying, "Look, no one's being held accountable to anything. So that's why nothing's actually happening." Those types of pushbacks can be really kind of tense situations.

SUSAN: 39:28

I bet. I bet. Interesting. Well, I imagine having worked in the underground lab situation and having been in those collaborative settings, you're probably very well equipped for navigating those kinds of conversations and dealing with folks and those issues, I can just imagine.

TESSA: 39:47

Yeah. Well, and it was funny, too, in the underground setting, because the folks who were in charge of facilitating or maintaining the actual facility were all the old miners. So they were fourth-, fifth-, sixth-generation miners. And just a super interesting whole culture to be exposed to. And they have a very different kind of social dynamic, especially when clashed with the science communities. It was kind of a fun social experiment, honestly.

SUSAN: 40:18

Yeah. I bet. Wow. That's so interesting. So you come into contact with all these different companies. You've had this interesting science background. Now, you've got potential royalty on your hands. I mean, just really the whole spectrum. That's very cool. Awesome. All right.

TESSA: 40:35

Well, this has been great. I really appreciate you reaching out. This has been a really fun conversation. Appreciate it.

SUSAN: 40:40

Yeah. Absolutely. Absolutely. Thank you so much for joining us today. And we're so glad that you were a guest with us here on Data Science Mixer.

TESSA: 40:47

Yeah. Thank you.

SUSAN: 40:49

Thanks for listening to our Data Science Mixer chat with Tessa Jones. Join us on the Alteryx Community for this week's Cocktail Conversation to share your thoughts. Tessa said she sometimes thinks of "business scientists" as a better name for her team than "data scientists," because it reflects how they understand the whole business landscape. That larger ecosystem. What do you think about that name? Do you think it better represents your work? Is there another option you'd like better? Share your thoughts and ideas by leaving a comment directly on the episode page, or post on social media with the hashtag #DataScienceMixer and tag Alteryx. Cheers.


This episode of Data Science Mixer was produced by Susan Currie Sivek (@SusanCS) and Maddie Johannsen (@MaddieJ).
Special thanks to Ian Stonehouse for the theme music track, and @TaraM for our album artwork.