Data Science Mixer

Tune in for data science and cocktails.

In this crossover episode, we flash back to some of our favorite moments from Alter Everything and Data Science Mixer.

 

This episode has also been published on our Alter Everything Podcast. 

 


Episode Transcription

MADDIE 00:00

[music] Welcome to this special crossover episode of Alter Everything and Data Science Mixer. I'm Maddie Johannsen, and I'm joined by our host of Data Science Mixer, Susan Currie Sivek. Susan and I are going to chat through some of the standout moments on both podcasts in 2021 to either refresh your memory or get you excited to check out some moments you might have missed. Let's get started. And I think the biggest event that happened in 2021 for me, at least, was the launch of Data Science Mixer itself.

SUSAN 00:33

Yeah. That was really awesome. We brainstormed names and concepts, and eventually, we came up with Top Shelf Data Science as the original name, which is a little-known fact here. But then, we found out that "top shelf" was potentially NSFW, you could say, [laughter] for our friends in the UK. So we were just going for the top-shelf concept, as in cocktails; like the best kind of alcohol kept on the top shelf, just like Alteryx is awesome for doing data science at a high level. But yeah. [laughter]

MADDIE 01:07

I know. Yeah. That was such a surprise, finding out that "top shelf" as a concept in the UK is inappropriate. And it was such a chaotic time to quickly pivot and change to Data Science Mixer because I think at that point, we had the feed established, the artwork was finished, and we even had the trailer and the first episode out. So yeah. [laughter]

SUSAN 01:29

Yeah. Yeah. That was crazy. I actually still have the Top Shelf Data Science wallpaper on my desktop. It's like this limited-edition secret thing, but. [laughter] Oh, well. I do love the Data Science Mixer name, and it's been fun to ask our guests to bring a special drink with them to recordings, although I think we've only had one or two who actually brought alcohol so far.

MADDIE 01:48

Yeah. I think I need to start scheduling the episode recordings at the end of the day, so folks are more likely to let loose with their drink choices and treat themselves. I mean, even if it's just an upgraded kombucha at the end of the day, you're probably more likely to be in that happy hour mood.

SUSAN 02:06

Yeah. Definitely. Hmm, upgraded kombucha sounds really good. [laughter] Yeah. Yeah. But it's been a lot of fun, and I've also really enjoyed having our recurring segment, The Alternative Hypothesis, where I ask guests something that is thought to be true in the data science world but that they know from their experiences to be false.

MADDIE 02:24

I love that segment, and we've gotten so many interesting responses. And then, also, for people who might not be on the Alteryx community, Susan has sometimes posted a roundup of those responses, so definitely check that out. But this is actually a cool opportunity to ask you, Susan: What's your alternative hypothesis about data science, or even maybe being a podcaster?

SUSAN 02:52

Oh yeah. I would say, I think people hosting, or who are getting interviewed on podcasts, I think we tend to think that they're just super well-spoken, and everything comes out perfectly formed. And guests always seem a little surprised when I tell them how we can fix things after the fact with editing. So maybe I shouldn't reveal that, but not all of our sentences are perfectly formed, and there's actually a few more ums and ahs than people know.

MADDIE 03:19

Totally. Yeah. It really puts people at ease, including myself, as a person who uses a ton of filler words. And I can be a little awkward at times, especially when trying to make the segues and transitions from question to question sound really natural. So yeah, it's nice that it's not recorded live. And plus, it's been really fascinating during the editing process to hear the different speaking styles of all of our different guests. And you, Susan, you do a great job of putting our guests at ease and digging in to find some fascinating nuggets from our awesome--

SUSAN 03:51

Thanks.

MADDIE 03:51

--guests. So with that, let's jump into some of our favorite moments. So for me-- I'll kick us off. A standout for me this year was the bonus episode that we did for Data in the Sandbox miniseries that listeners can find on our Alter Everything podcast feed. And in that miniseries, you break down some analytics and data science concepts into really basic terms, and I play the role of the layman, so these episodes are great for kids and beginners. And in the bonus episode we released at the beginning of this year, we talked about machine learning and recommendation engines. And as a little behind-the-scenes tidbit for our listeners, Susan actually wrote and produced all of the episodes from the series. So yeah, it was just such a blast to work together on those.

SUSAN 04:43

Yep. That was a ton of fun. [music] So the algorithms are using your data. It's that data again like we've been talking about all through our conversations; just numbers that represent your behavior and your choices. So the algorithms crunch through all those numbers, and they make decisions. They say, "Well, Maddie watched five episodes of Show A, but only one episode of Show B, so let's recommend for her more shows like Show A."

MADDIE 05:14

And the more data the algorithms have, the better they work?

SUSAN 05:18

Yeah. For sure. We try to find exactly the right kinds of data that will help the computers make the best decisions.

MADDIE 05:24

That's crazy. Teaching computers with data. Too cool. So when the TV suggests a show for me, it's actually learned what I like and what I don't like.

SUSAN 05:36

Yeah. It's making its best guesses about that, at least. And just to make it a little more complicated, the recommendations that you get, those are often based not just on your data, but also other people's data. The streaming service or the shopping website, they have tons of data about what people like to watch or tend to buy. With all that data, the algorithms can learn - or, we would say, be trained - to make pretty good recommendations for you. That's why they might sometimes seem like they're in your head.
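For anyone who wants to see the "training on your data plus everyone else's data" idea in action, here is a minimal sketch of item-based collaborative filtering, one simple way recommendation engines can work. The viewers, shows, and watch counts are all made up for illustration, and real streaming services use far more elaborate models than this.

```python
# A minimal sketch of item-based collaborative filtering.
# The viewers, shows, and watch counts are invented for illustration only.
import numpy as np

shows = ["Show A", "Show B", "Show C", "Show D"]

# Rows = viewers, columns = shows; values = episodes watched.
watch_counts = np.array([
    [5, 1, 0, 0],   # first viewer: lots of Show A, a little Show B
    [4, 0, 3, 0],
    [0, 2, 0, 4],
    [5, 0, 4, 1],
], dtype=float)

def show_similarity(i, j):
    """Cosine similarity between two shows, based on who watches them."""
    a, b = watch_counts[:, i], watch_counts[:, j]
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend(viewer, k=2):
    """Score each unwatched show by its similarity to the shows the viewer already watches."""
    row = watch_counts[viewer]
    scores = {}
    for j, show in enumerate(shows):
        if row[j] > 0:
            continue  # skip shows the viewer has already seen
        scores[show] = sum(row[i] * show_similarity(i, j)
                           for i in range(len(shows)) if row[i] > 0)
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend(0))  # recommendations for the first viewer
```

The scoring step is the whole idea in miniature: shows that tend to be watched by the same people as your favorites float to the top, which is why recommendations can sometimes feel like the algorithm is "in your head."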

MADDIE 06:08

So let me see if I get it. Artificial intelligence, AI, is when computers seem like they're smart because they make decisions or learn or act kind of like humans do.

SUSAN 06:20

Mm-hmm.

MADDIE 06:21

And machine learning is part of helping them do that. So the data is what the algorithms use to get trained to do whatever it is they do.

SUSAN 06:31

That's right. Awesome. And it's important to remember that although all this stuff seems really high-tech and complicated, it's actually humans building all of it. We design the systems that collect data, we make the systems that learn from the data, and we create the algorithms that do things like make recommendations based on the data. [music] Yeah. So that's fun to revisit that. A highlight for me was from Margot Gerritsen's episode of Data Science Mixer. She was our first episode, and she was also the first person we invited as a guest, and she was so kind and so excited to join us. And her insights, they just felt so relevant to the craziness of launching Data Science Mixer. [music] I think it's so interesting because I can imagine some of our data scientists out there listening to this and thinking about, "Wow, being able to move from field to field. That's such an incredible aspect of this profession, being able to apply your knowledge in so many different areas." But I do feel like it takes a certain kind of confidence and curiosity to do that. How do you muster that within yourself, to explore these different areas and feel confident doing that?

MARGOT 07:44

Yeah. Well, [laughter] I don't feel very confident doing it, so it's not that I have confidence. At some point, you do it often enough that you think, "Okay. I'm going to panic for a while now, [laughter] but I've survived many times before, so it's probably going to be okay." But I always think about it as-- I think of it as diving into cold and deep water. Like you're in a mountain lake in the summer, and you know that water is going to be very cold, and yet you dive in. And then, you sort of sink or swim, and it's going to be okay. So typically, I go in because I really like learning, and so that drives me. And then, usually, I have a couple of months of sheer panic where I think, "What am I doing? I'm such a fake and a total imposter," and it feels very uncomfortable.

MARGOT 08:39

But then, I remind myself, "But I'm learning so much." You're on this learning curve, and it's super steep, but how fun it is to be learning for work. And then, after a few months, you start to understand things a little bit better. And I'm blatantly honest most of the time. When I was younger, it was a little bit harder to be so honest about my lack of knowledge. But I try to say now, "Hey, teach me." Or I get a group of courageous students around me who are willing to learn with me. I just recently dove into a new project on transportation modeling where we're interested in full decommissioning of internal combustion engine vehicles, and that is new for me. And I have four students in this group, and they know this is new for me. It's new for them. And so, we're exploring this together, and it's super fun. Also hard for the students. But I think it's really, really good to sort of learn to be comfortable with the uncomfortable because that's what research really is about. And if you're comfortable most of the time, I don't think you're really [music] learning all that much.

MADDIE 09:56

Each year, our advocacy program celebrates the amazing work from our users through the Alteryx Excellence Awards. Here's award winner Andy Bate. [music]

ANDY 10:06

In terms of the actual training pathway, 100% the Community. 100% of that skill is through that community, and I have absolutely no hesitation in saying that it's well-deserved that the community got that award. There's no other platform that does it like the Alteryx community, and I'm a huge advocate. And I know I'm known as the Andy Alteryx at Brookson, and as soon as someone mentions Alteryx, they mention Andy or something like that. But there's a genuine reason: there's a world of expertise out there. There's so much knowledge. And I've been doing this since 2016; I'm still learning.

ANDY 10:51

The weekly challenges? Amazing. You go in and you do something. You go, "I've absolutely nailed that. That is the best challenge I've ever done." You submit it, and someone's done it in two [tools?]. And you went, "How've you done that? [laughter] I didn't even think of doing it that way." And that's exactly what the training-- what I try and do, is to give them very, very small basics. And I've built - with the help of others as well, so it's a royal [laughter] "I" - we built Brookson Bespoke weekly challenges. So we're using Brookson data, but they have to parse it, transform it, summarize it into an output that we already show. But it gives them then that business knowledge of building it up, and that seems to have worked up to now. And I'm also a huge advocate of you get thrown into the deep end, that there's no point in sitting around and showing spreadsheets and PowerPoints and how to use it. Get hands-on. Get dirty when it comes to the application. You will make mistakes. That's not a problem. We're not afraid of mistakes. I make plenty daily. [laughter] So we're a huge advocate of you learn from your mistakes, you own up to your mistakes, and you move forward. And it's brilliant. [music] Absolutely brilliant.

SUSAN 12:17

The Data Science Mixer podcast also made a special appearance at Inspire 2021 with a video interview featuring data visualization expert [music] Alberto Cairo.

ALBERTO 12:27

--should never oversimplify the information that it presents. This is a big problem in the world where I come from, the world of journalism. We journalists sometimes tend to oversimplify the stories that we present to people. We just show, for example, a median or an average, when we should be showing the entire distribution of the data because the data is very skewed, for example, right? So in my classes, and also in my books, I explain the distinction between simplification, on one hand, all right-- which I don't think that is the goal of visualization. The goal of visualization is not to simplify. The goal of visualization is to clarify, which is completely different because when people talk about simplification, what we have in mind is usually reduction, right? Removing detail so the important information rises up, right? It pops up so you can see it immediately. My friend Nigel Holmes, who's also a famous-- is a famous infographics designer, has this idea that again, our goal should not be to simplify, our goal should be to clarify. So sometimes, in order to clarify, you need to reduce the amount of information that you show. But sometimes, in order to clarify, you need to increase the amount of information that you show in order to put the information that you're presenting into the right context. Now, how to decide what amount of data, what amount of detail, what amount of information to show? There're really not clear-cut rules. Every visualization is different. You need to take into account the nature of the data, the nature of the story that you're trying to tell, the nature of the audience that you are designing the visualization for. There are many, many factors that we need to weigh in order to come up with the right level of detail. Not too much detail, but not too little detail, [music] either.

SUSAN 14:22

And from Heather Lynch's episode about data science work studying penguins and climate change.

HEATHER 14:29

So we have this section of our website called Be A Penguin Detective. And so, what we do is, we teach people on the website what to look for when they're looking for penguin guano. And they can just go to Google Earth and load in the file that shows where all the penguin colonies are that we know about. And if they find one that we don't know about, then we'll track it down. So it's funny you should ask because just yesterday, I was going through all the leads - the tip line, I sort of think of it as the tip line - and writing back to people that had written us from all over the world, and responding to them. So in some cases, I can look at that and say, "Okay. It looks like penguin guano, but it's actually algae that grows in the snow." In other cases though, there's a colony, for example, that I think is a new, unknown gentoo penguin colony that we need to go investigate. So that's really exciting. I think they really did find it-- there were actually two people that wrote me about that location; they independently found it.

SUSAN 15:25

That's so cool.

HEATHER 15:26

We've had citizen scientists finding new emperor colonies. We had a woman who was recovering from knee surgery who spent, I think, five weeks looking for penguins and helped me understand how emperor penguin colonies were moving through-- when the sea ice moves, the emperor penguin colonies moved with the sea ice, and there's some really interesting dynamics there. So I've met people from all over the world that I've communicated with, and entire classrooms. I went to a school close to New York City where the fifth grade had dedicated the whole year to this kind of penguin project. And so, then I could go and talk to them about that, and they were really into it because I'd spent so much time looking for penguins [music] in satellite imagery. So it was a lot of fun.

MADDIE 16:08

I learned about the use of analytics in sports when I chatted with my colleagues Will Davis and Luke Minors. They taught me a lot about the Euros and European football in general, which I've since come to love even more with my new Ted Lasso obsession. [laughter] [music] Here's Luke.

LUKE 16:25

Yeah. So actually, it's a funny one because it is a term that appeared in the media around football probably a couple of years ago now, but was never really explained to the general public. And it's taken a while for people to kind of understand it, but it became a really good talking point for Will and I on our webinar. "xG" essentially stands for expected goals, and it's a metric which essentially determines the likelihood that a particular player would score at a particular point in a football match, based on whatever metrics you might want to include in there.

LUKE 17:04

So if you think about, on a really basic level, the most simple version of xG is where they are on the pitch. So how far away from the goal are they when they kick the ball, and what angle are they at? So how much of the goal can they actually see? And that then gives you a percentage likelihood of the ball going in the net on that shot. And then, as the model gets more and more complicated for xG - and this is the same with pretty much any model - you can add loads more factors in. So are they using their best foot? So are they using their left foot or their right foot, and which one are they better at shooting with? Are there defenders in the way that they have to avoid kicking the ball at? Are there-- I can't even think of another metric at [laughter] this point. But yeah, you can add all these different things in, and this was like a starting point for us when we tried to talk about doing predictive models because it's easily translatable into a business sense. Because if you were looking at a customer, and if you were doing some analysis around your customers and looking at whether they might buy a product from you or whether they might sign up to a particular promotion or something, there are metrics around that that would persuade them to do these things. And this could be demographic information, or whether they have historical purchases of similar products, or things like that, that you can then include to help you determine [music] what might happen in the future.
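For readers curious what a bare-bones xG model could look like, here is a sketch that fits a logistic regression on just the two factors Luke starts with, shot distance and shooting angle. The shot data is synthetic and the resulting probabilities are meaningless for real football; the point is only to show how a "percentage chance the ball goes in" falls out of a model like this.

```python
# Toy expected-goals (xG) model: probability a shot is scored, given only
# distance to goal and visible goal angle. All shot data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

distance = rng.uniform(2, 35, n)   # metres from goal
angle = rng.uniform(5, 90, n)      # degrees of goal the shooter can "see"
X = np.column_stack([distance, angle])

# Fabricated ground truth: closer shots with wider angles score more often.
prob_goal = 1 / (1 + np.exp(0.25 * distance - 0.05 * angle))
scored = rng.random(n) < prob_goal

model = LogisticRegression().fit(X, scored)

# xG for a single shot taken 11 metres out with a 40-degree angle.
shot = np.array([[11.0, 40.0]])
print(f"xG: {model.predict_proba(shot)[0, 1]:.2f}")
```

Adding the extra factors Luke mentions, preferred foot, defenders in the way, and so on, just means adding more columns to X, which is exactly the jump he describes from a simple xG model to a customer-propensity model built on demographics and purchase history.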

SUSAN 18:41

Spotlighting underrepresented voices is super important to me and the community team in general. So we wanted to make sure that from the start of Data Science Mixer, we'd seek out diverse voices and perspectives. Here's Vukosi Marivate on his work in NLP with [music] low-resource African languages.

VUKOSI 19:00

There's understanding the history of the languages in themselves. If I'm South African-- so I come from a place where for a long period of time, our languages were literally seen as being second-rate, our local languages. So there was not much development in the universities. The languages are not even used for university teaching, and they're not developed, kind of, in that way. Other languages were chosen for the country, given our history of apartheid, to say "Yes. We're going to develop only these ones." So now, if you're trying to play catch up, the amount of money that it takes to get back to that point becomes something of, "Hmm. The money that we have currently as a country, or the whole economy, do we spend it on developing these other, let's say, nine other languages that are in the country which are official? [laughter] Or we spend it on other things in the country? Hey, we have poverty. We have all these other things."

VUKOSI 19:54

So you can now see that's a second issue that comes along in there. I'm working with Tswana right now because that's my mother's language. My father speaks Tsonga; I also know that. But there is not that much data on Tsonga, as much as Tswana, so I thought, "Okay. Let me start here and deal with that one while I'm working on that." And that's one of the other things, that you might have had these books, but they're not digitized. So a lot of the research that we do in the group tends to take into account as well, "Hey, I don't have that much data. How do we deal with that? How do we build augmentation methods? How do we cleverly extend this data?" Or, "How do we find innovative ways to get more data? How do we tune our models to work, even though there's not enough data, and then build on? And how do we also help other people build up capability in actually getting more language data into their spaces?" So you'll find that, yeah, with a lot of the things we've been doing over the last few years, it's been on that. Whether it's releasing, I think, Python libraries-- we're about to release the Masakhane web tool, which will be like a research-based translation service, almost like Microsoft Translate or Google Translate. It's just that it will now be solely for African languages. And--

SUSAN 21:06

That's very cool.

VUKOSI 21:06

--we can translate at masakhane.io because the Masakhane project came along as this big collaborative research project. And the first task we took on as a community was then on translation. And last-- this was like two years ago. But last year, we then got a grant - my research group - from Mozilla Open Source.

SUSAN 21:28

Congratulations.

VUKOSI 21:30

[inaudible] want to then build like the front end, to take the models and make them available. And from there, the reason it is a research project, sure, we're not prime time yet. So you will see mistakes. But it also allows for people to give feedback on the translation. And then from there, these can then go back to the researchers to then improve the [music] translation model.

SUSAN 21:51

And I also enjoyed Danielle Lyles' perspective on using data science in higher education to encourage inclusivity.

DANIELLE 21:59

We are definitely always thinking about diversity and inclusion. It's really cool working with-- everyone that I work with cares about students and all the types of students that there are. So yeah, we're trying to build right now as diverse a class as we can. And Buffalo Trace is helping with that. Because if somebody is trying to get women engineering students, for example, they can select them all and prioritize their resources to recruit the ones that are most likely to come, and hopefully help us get more of them. And you can do that with any student subgroup. And the one thing that I've always done, I've done a lot of exploratory data analysis where people ask me-- for example, I looked at attrition and graduation of ACOd students. And if you haven't heard the term, they are students who didn't get into the college that they applied for. So we say, "Hey, you could still come here, but you can be in our program for exploratory studies. And you could try to transfer into the college that you wanted, or maybe we'll help you find something that you like better, that's better for you."

DANIELLE 23:03

So I looked at attrition and graduation for them: when do they leave? How many of them leave? Where do they go? Do they go to another university or community college? Does it seem like they just drop out of higher ed? Which subgroups have the highest rates of attrition? And when I did that-- I always look at everything, but I also pay attention to first-generation students in all of my analyses. So every time I do an analysis, there's this theme, and it's always true in the data, that if you control for first-gen status, ethnicity doesn't have that much of an effect on retention. So it's really about the first-generation students. And it's got the leadership very interested in, "What are we doing with them, and how can we better support them?" And even then, it's hard to get the data on them. So she's working on pulling together all of the data from all the different places to even answer the question, "How many of our first-gen students are in special programming?"
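As a rough illustration of what "controlling for first-gen status" means in an analysis like Danielle describes, here is a sketch using a logistic regression on an entirely fabricated student dataset. The variable names, effect sizes, and any pattern in the output are invented to show the mechanics only; they say nothing about any real institution's students.

```python
# Sketch: "controlling for" first-generation status when modeling retention.
# The student records are synthetic, generated only to show the mechanics.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000

students = pd.DataFrame({
    "first_gen": rng.integers(0, 2, n),                 # 1 = first-generation student
    "ethnicity": rng.choice(["A", "B", "C"], size=n),   # placeholder categories
})

# Fabricated data-generating process: retention depends only on first-gen status.
p_retained = 0.85 - 0.15 * students["first_gen"]
students["retained"] = (rng.random(n) < p_retained).astype(int)

# Include BOTH predictors; with first_gen in the model, the ethnicity
# coefficients land near zero, which is the "controlling for" idea.
model = smf.logit("retained ~ first_gen + C(ethnicity)", data=students).fit(disp=False)
print(model.params)
```

In a real analysis the data would come from institutional records rather than a random number generator, but the mechanic is the same: once first-generation status is in the model, you can see how much explanatory power is left for the other variables.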

SUSAN 24:06

Right. And Danielle, do I remember correctly that you were a first-gen college student?

DANIELLE 24:12

Yes. I was-- [laughter] I am-- or was a first-generation college student, so always looking out for them. And also, I look out for [music] everything, but I make sure I don't forget to check all that stuff.

MADDIE 24:25

On Alter Everything, we spoke with several of our Aces throughout the year. All amazing conversations. And we love to mix it up with guest hosts, so I thought I'd invite an actual Ace [music] to host an episode: Kenda Sanderson.

KENDA 24:40

Often, when I tell people that I'm an actuary, they usually just sum it up by saying "math". [laughter] And sometimes, I forget that it's not a very widely-known profession because we do it every day. But in the real world, it's not something that is maybe as common or as well-known as maybe a teacher or something like that, that everyone knows what a teacher is. So while it's true that we have some math tricks up our sleeves, data in general plays a big role in what we do as well. And when you think of careers and data, many people may think of maybe a consultant or a data engineer or a data scientist. But as actuaries, and obviously, users of Alteryx, our use of data [music] is growing.

SUSAN 25:29

In the Data Science Mixer episode with John Thompson, we talked about creative ways to encourage and motivate [music] data science teams.

JOHN 25:38

So if you're working on something and it blows up and it doesn't work, you, as this kind of person; this dynamic, strategic, intelligent, engaged person, immediately switch to something else to engage your mind. So that project goes on the back burner. It goes into your subconscious. When you go out on a bike ride or take a shower or eat ice cream with your kids or whatever, you are going to solve that problem. And that may come in a day later, two days later, a week later, whatever it is. So I stopped having these crisis conversations. And in our weekly meetings, we started having-- first, you hear, "Hey, I'm working on Project A," and then you wouldn't hear anything about Project A for a while. And they'd start talking about Project B, and you intuitively knew that something didn't work out [laughter] in Project A. But then a week later, then they'd come to the team meeting, and they'd be exuberant. And they'd be like, "Hey, I figured out what the problem is with Project A, and I'm moving on. And I got new data, and I'm trying a different algorithm and a different approach, and it looks great. And I'm excited." So it was a way to give data scientists autonomy, responsibility, and the ability to time slice between projects that made them feel successful all the time, even though they were experimenting and failing. But they still had other things to work on and focus on rather than just the lack of success in [music] that attempt.

SUSAN 27:09

Most importantly, we made sure to make things fun. Here's Mathias Clasen on horror movies and fear and resilience.

MATHIAS 27:17

The immediate thought that comes to mind-- and this is probably my bias because my background and training are in the humanities, where many people have a kind of instinctive kneejerk reaction against quantification. And so, I come from a tradition in which quantitative research is viewed as reductionist; something that really strips away the richness and the beauty of whatever phenomenon you're trying to investigate. I think that's wrong. I mean, I think the only way to really get at mechanism, to really get at the-- I guess the causal underpinnings of a phenomenon, even an aesthetic phenomenon like horror movies, is through reduction and quantification. And you can study such a phenomenon using data from surveys and video recordings and heart rate data and so on, without stripping away any of its richness or, indeed, paradoxical beauty.

MADDIE 28:16

On Alter Everything, I spoke with Dan Schneider, the main subject of the Netflix docuseries The Pharmacist, who has worked to fight the opioid epidemic.

DAN 28:28

You mentioned that I used data, and I did video recordings and I took notes, and I did things. I was not consciously using data, I was doing whatever it took. In hindsight, looking back at it, I was. And that kind of data collection and record-keeping and being able to refer back to it - and this was before the days of computers, exactly - it definitely helped me to not only solve my son's murder, but it taught me the skills I needed, later on, to shut down the next doctor. And it also has taught me ways to interact with the public. To make change, you have to have-- it's one thing to tell a story, and that's important, but you have to have some statistics. You have to have some data to back you up, and this is where I believe your company is-- and the people in your field, y'all have an important role to play. Now, again, about tips in the community. Obviously, I was in the right place. I think God kind of put me there. After I saw my son's murder, I wanted to go on a mission. I was mainly going to focus on educating parents and students. But then, I had this situation that the police weren't taking care of, and I had knowledge that almost nobody else had. And so, I took it as the right place and the right time.

DAN 29:50

But that leads back to the other part of this. It's what I have found now, and it's still the case, is you got three kind of people in the world. You got people that make things happen, and you got people that watch what happened, and then you got people that say, "What the hell happened?" Okay? And so, my word is, whatever field you're in and you specialize in, if you're on the alert and you're seeking and you're trying to enrich your knowledge, okay, it's there. Many times, I hate to say it, either consciously or subconsciously, we just don't want to get involved. We don't want to stand up and maybe take a little risk or maybe stand out. Okay? We have to do just the opposite. One person can make a difference. Usually not by himself, but that person can spark others. Okay? And so, absolutely, whatever your field is in or whatever you're in, pay attention to what's going on when you see something's wrong. Okay? Don't just look the other way and say it's somebody else's responsibility. Take it upon yourself. I kind of learned that, what I call, hard-found wisdom, and I hope others don't have to go through the same type of crisis or tragedy to be woken. Okay? My job right now is to wake the people who haven't went through that tragedy. But we all have a propensity to look the other way, not get involved, take the easy way out. You're never going to accomplish anything special, you're never going to be the type of person you could be, to be a leader and an expert in that field. So that's my message: "Go for it. You can make a difference." [music]

MADDIE 31:42

All of these highlights have been amazing. Just chatting about them, it's been so fun to reflect on all of these amazing moments, and there are more that we didn't get a chance to include in this episode. But yeah, so many great tidbits.

SUSAN 31:56

Absolutely.

MADDIE 31:57

And so, as we head into the new year, I'm so excited to plan more episodes for the podcast. And Susan, you actually have some awesome plans for the new year as well.

SUSAN 32:08

Yeah. It looks like in the new year, I'm going to be pursuing a new opportunity outside of Alteryx. So this is, sadly, my last episode; the last time you and I get to record together. But it's been wonderful to be able to meet our many amazing guests and to connect with our listeners out there. So yeah, this has been a lot of fun.

MADDIE 32:25

Totally. Yeah. You've been great, and it was so much fun to collaborate with you. Honestly, this whole Data Science Mixer launch experience couldn't have been done without you. So yeah, you were amazing as a host, and yeah, I'm so excited to continue to follow you. And where can our listeners keep up with you too, just to kind of see what's next for you?

SUSAN 32:45

Yeah. They can definitely find me on LinkedIn or on Twitter, where I'm just Susan Sivek.

MADDIE 32:50

Awesome. Cool. Thanks, Susan. And for any of our fans of Data Science Mixer, be sure to subscribe to Alter Everything on your favorite podcast feed, as well as the Alteryx Community, where we will keep you posted on anything that's upcoming and that we want you to know about. And we'd also love to hear from you. So if there's anything that you'd love to hear in upcoming episodes, [music] be sure to comment on this episode page to get the conversation going, at community.alteryx.com/podcast.

 


This episode was produced by Maddie Johannsen (@MaddieJ) and Susan Currie Sivek (@SusanCS).
Special thanks to @andyuttley for the theme music track, and @TaraM for our album artwork.
