Join us as we sit down with Christina Stathopoulos, founder of Dare to Data and former Google and Waze data strategist, to discuss the unique challenges and opportunities for women in data science and AI. In this episode, you'll learn how data bias and AI algorithms can impact women and minority groups, why diversity in tech teams is crucial, and how inclusive design can lead to better, fairer technology. Christina shares her personal journey as a woman in data, offers actionable advice for overcoming imposter syndrome, and highlights the importance of education and allyship in building a more inclusive future for data and AI.
Ep 193
===
[00:00:00] Introduction
---
[00:00:07] Megan Bowers: Welcome to Alter Everything, a podcast about data science and analytics culture. I'm Megan Bowers, and today I'm talking with Christina Stathopoulos, founder of Dare to Data. In this episode, we chat about our experiences as women in the data field, how data collection and AI bias can affect women and other minority groups, and how we should approach AI tooling to increase inclusivity.
Let's get started.
Hey, Christina, it's great to have you on our show today. Thanks so much for joining. Could you give a quick introduction to yourself for our listeners?
[00:00:43] Christina's Background
---
[00:00:43] Christina Stathopoulos: Of course. So, my name is Christina Stathopoulos. I come from the data and AI world. I've been in this space for over a decade now. Most recently, I worked at Google and Waze, leading data strategy and data projects with some of our largest advertising clients.
But I left at the end of 2022 to do my own thing. I call myself a datapreneur. I founded a company called Dare to Data, and we're on a mission to help individuals and corporations take the next step in their data and AI journey. So now I spend all of my time on training, upskilling, and product evangelism.
I'm also a professor, and have been for quite a few years at a couple of universities. I love education and love helping others, so I'm very involved in the field, helping others upskill or reskill when it comes to everything data and AI.
[00:01:35] Megan Bowers: Very cool. I'm really excited to learn from you and your super diverse background and all the areas you are involved in.
And for this episode, I'd love to start out by talking about the intersection of women and data collection, data analysis. In what ways do you see women uniquely impacted by data bias?
[00:01:58] Data Bias and Women
---
[00:01:58] Christina Stathopoulos: Yeah. And before we get into it, by the way, I'm super excited to be here for this episode. I think we're going to get to talk about some very interesting things.
So let's get right to it. So you asked about the intersection of women and data collection, data analysis. Right? I think that women, but really, all minorities are impacted by data bias. You have to consider that all of the technology, particularly technology with AI, with machine learning, it's surrounding us today.
And it learns from data, right? And if that data is biased, the machine in turn will make biased decisions. And when I usually talk about this topic, the first thing that I start out by doing is asking my class, "Close your eyes, and I want you to think, who are the people who are creating these technologies? Who is developing, deploying these machine learning models, these AI systems?"
Think about who they are. If you close your eyes and you think about that, the audience that usually comes to mind is a very specific group of people, right? There's usually not much diversity in this space. It's very male-driven. Of course, it's usually people working at big tech companies. They come from a mid-to-high socioeconomic background. There's very little cultural or racial diversity. So you have this very specific group of people working on this technology that has a lot of power over us. And the problem with this is that all of us have unconscious biases. So I have unconscious biases. You do. All of our listeners do. But when you bring a group of people together who are very similar, they are going to share similar unconscious biases. And that's the problem. We've got these, this very similar group working together to develop the technology, and they're not meaning to—it's not intentional—but they're cooking their unconscious biases into the systems, and because of that, we generate things like bias against women or bias against different races and cultures and so on.
One thing that I recommend to everyone: there's this amazing book that I read when it first came out, and it's now a bestseller, called Invisible Women by Caroline Criado Perez. I highly recommend that both men and women check it out. It really opens your eyes to all of these problems of data bias, particularly how they affect women. It explores the "gender data gap": how data collection and analysis have been overwhelmingly focused on men for all of history, making women practically invisible in certain spheres of life. It covers all of these different sectors like healthcare, transportation, urban planning, technology, et cetera.
So for example, it talks about how healthcare, medical research, and treatment often fail to account for the biological and physiological differences between men and women. They treat men as the default, and thus there's a higher chance of misdiagnosing or mistreating female patients. And it gives similar examples in other fields, like technology being designed with male users in mind, with default models typically based on male characteristics and male preferences.
Before we jumped on this, you were telling me about a story as well that exactly this happened with technology, right, with you.
[00:05:10] Megan Bowers: I saw this firsthand on the data collection side. For our podcast program, we have used an AI voice editor, AI-assisted audio editing, and our producer noticed that, at least in the first version of this software, it did a lot better on male voices than on my voice or on any of the female guests we had. It was just more refined, better performance on the male voices. And so that made me think, "Oh wow, this is probably an example of where the data collection and analysis were focused more on men." Obviously there are differences between men's and women's voices, so that was a firsthand experience that made me think about this issue.
[00:05:58] Christina Stathopoulos: For sure. Yeah, I use AI to help edit my videos as well, but I'm just filming myself, so I don't have anything to compare to. I would wonder if it's a common problem. This is exactly the kind of thing you would find in that book, Invisible Women: the training defaulting to male voices rather than accounting for both male and female, given the differences between them.
[00:06:21] Megan Bowers: Yeah, definitely. But I wonder too about the analysis. Is it compounded if only one gender is analyzing the data too? Like, is there stuff that can happen there on top of a not-fully-representative data sample?
[00:06:37] Christina Stathopoulos: I think so. Really, at every point in the data life cycle, if you don't have those diverse perspectives, diverse people working with the data, then biases can pop up at any moment, not intentionally but unintentionally.
And it's very hard to notice them, because they're unconscious and not something you're purposely meaning to do. I'm not blaming anybody for doing this necessarily. The point is that we need these diverse teams, because if you bring in more diverse perspectives, if you bring in women and men, then they will notice these things before a system or an application or an analysis like this comes to market.
[00:07:17] Other Types of AI Bias
---
[00:07:17] Megan Bowers: So then thinking beyond gender, what are some other ways that AI can be biased?
[00:07:23] Christina Stathopoulos: AI and data can be biased in literally infinite ways. Cultural and racial, like I just mentioned. It can be ageism as well. It can even be little things like right-handed versus left-handed. Any sort of marginalized minority group can be biased against within these systems.
So to give you a really good example, when YouTube launched their iOS app video upload feature, this was many years ago, they found that 10% of the videos were being uploaded upside down, and they could not figure out why. So do you know why it was? They realized that the app had been designed for right-handed users only.
So supposedly, phones are usually rotated 180 degrees when they're held in a user's left hand. I'm right-handed, so I can't really imagine it, but left-handed users hold the phone and film a different way, and the designers hadn't taken into account left-handed users, who make up maybe 10% of the user base.
So this is showing you, this is not just about gender. This can be practically anything. And then another example outside of gender: there was a recent lawsuit opened against a big software company. I don't wanna name names, but they are being sued, with victims claiming that their hiring algorithms are, or were, discriminating against anyone over the age of 40.
Both of these cases, whether it's left-handed versus right-handed or ageism, raise an important point: striving for diversity is not just about checking off DEI goals. It is the right thing to do, but beyond that, as these two examples show, a company needs to be proactive in combating risks like bias so that they can prevent these types of things from happening.
If they tackle this proactively, bringing these systems to market without these mistakes, that would really be part of responsible AI, being more responsible with their AI. Then they can avoid, or at least minimize, the risk trifecta: reputational risk, regulatory risk, and operational risk.
You can get all sorts of risks from these three bubbles when you're not proactively tackling things like bias.
[00:09:44] Megan Bowers: That's really interesting. I like the risk trifecta you mentioned. I think a lot of listeners and a lot of companies care about all of those risks, and sometimes they don't know what they don't know when it comes to bias. Being proactive would certainly be the best option to avoid those risks.
[00:10:06] Christina Stathopoulos: I can add another thing I thought about, because we're talking about moving beyond gender. Another book that I can recommend, because I'm a big book fan, is Unmasking AI by Dr. Joy Buolamwini. She has done groundbreaking work out of MIT to raise awareness around racial and gender bias in AI services.
And from what I understand, she was originally inspired by a problem she had with facial recognition applications: they couldn't recognize her face. She's Ghanaian American. She has very beautiful, rich, dark skin, and when she used this facial recognition software, it wouldn't recognize her face.
But if she had one of her lighter-skinned friends try the system, it had no problem recognizing them. So she uncovered all of this racial bias hidden inside the software, which shows the applications had not been trained or tested on users with darker skin.
That's another example, looking more at race now, of how bias can pop up within technology. I think all of these cases I'm explaining can create some serious ethical dilemmas, because we're literally further marginalizing already marginalized groups. And if we don't take accountability for it, if we just overlook it and blame it on the machine, which tends to happen a lot, then we're not gonna get to the root of that problem and fix it.
[00:11:38] Megan Bowers: And I feel like some people look at AI as a black box, and some models can be, but we have to do due diligence to dive in and figure out why it's making those decisions. That makes me think about how important it is for AI models to be able to explain how they got to their conclusions.
We had an episode on that a while ago with a professor that was super interesting. But AI explainability seems like it's just gonna be more and more important as we move into automating and pulling AI into more and more critical decisions. It's gotta be able to explain how it got there.
Otherwise, we might risk missing some of these things until groups have already been marginalized.
[00:12:22] Christina Stathopoulos: Exactly.
[00:12:23] Megan Bowers: Yeah.
[00:12:24] Christina Stathopoulos: I always say we wanna break the black boxes down into glass boxes so we can see inside of them.
[00:12:29] Megan Bowers: I really like that. So shifting a little bit, I'd love to hear about your experience being a woman in the data field.
[00:12:38] Personal Experiences as Women in Data
---
[00:12:38] Megan Bowers: How has that changed your career journey?
[00:12:41] Christina Stathopoulos: Yeah, I think, like many other women in the field, I've many times been the only woman in the room. It started early on, when I began an engineering degree in my bachelor's, and it has followed me throughout my career. To be honest, it can be very intimidating, especially when you're starting your career. As you get older, more mature, more developed in your career, and you have more confidence in yourself, you can push through.
But early on in your career, it can be quite tough when you're the only female in the room. It can feel harder to get your voice heard at the table. This has been a pattern throughout my career, across different teams and different companies. More often than not, I'm one of a few women, or many times the only woman on the team.
Now, I will say that I have had some fantastic male workmates. I'm not complaining about them, because they were amazing. I remember, for example, my last team at Waze. I was the only female on the team. It was a data team, all males, I think ten males and myself, but they were all amazing.
It was a fantastic team to be on, so I didn't feel intimidated by any means, but you have to get used to being the only woman around. Also, in my experience, I've never had a female manager in my entire career. I've only had male managers, and I've had good managers and bad managers, both sides of the coin.
But it would've been nice, I feel, to have a female manager or a female role model in that sense to help guide my career. I had to lean on my male managers, who were fine and really supported my career, but it also would've been nice to have that female representation early on. So the message I wanna get across is that the gap still exists, and it can be very intimidating, especially for women just starting out in the field.
But my advice would be to push through, because it's totally worth it. And of course, women can do anything that men can; there's no difference. A motto that I've always lived by, whether it's correct or not, is "fake it till you make it." So even when I was presented with things I had no idea how to do, I jumped on them, took them, and then figured out how to do them.
I'll fake it till I make it. And that means maybe also faking confidence, because being the only female in the room can pull down your confidence or make you more hesitant. You don't wanna do that. You don't wanna be invisible in your work, so you're gonna have to fake that confidence to pull yourself up, to push through imposter syndrome and things like that, so you can have your voice in the field.
And I think as well, try to find female role models. If you don't have them in your company or your manager, then look elsewhere. Look at, I don't know, a professor that you had, or anyone you might know: connections who can help guide you, women who have already built a career in the field.
[00:15:29] Megan Bowers: I love that.
[00:15:29] Christina Stathopoulos: What about you? You're also a woman in data, so have you had similar experiences, or...
[00:15:36] Megan Bowers: I've had a very interesting career path so far, but I actually did start out in engineering. And when I did some internships in manufacturing and industrial engineering, that was super, super male dominated.
I was really lucky at my first internship to have a female mentor. She was my "buddy," quote unquote, just a few years older, a mechanical engineer, and she was really awesome and great to look up to. I felt like I had that kind of buddy throughout the process. But then, at my second internship, I look around the huddle and I'm like, ah, not only am I the only female, but I'm young.
I look young. And there was a lot of imposter syndrome, I think, and a lot of uncertainty, also mixed with not being sure if that was what I wanted to do. But yeah, I think your advice about faking some of that confidence is spot on. Trying to push through the imposter syndrome is important, but it's a little easier said than done.
But I think it is powerful to stick with it, stay in that career, and then be able to hopefully guide other women and other minorities through it as well.
[00:16:47] Christina Stathopoulos: I can totally relate to your experience.
[00:16:50] Megan Bowers: So then, we've talked about our personal experiences, but thinking more broadly about all women, even outside the data field.
[00:16:58] Women and AI Adoption Trends
---
[00:16:58] Megan Bowers: How should we be navigating AI adoption? Are you noticing any trends right now in women adopting AI?
[00:17:06] Christina Stathopoulos: I do want to highlight a couple of studies that I've come across recently. Some recent studies are showing that women are more hesitant to adopt AI technologies, especially generative AI, compared to their male counterparts.
In many cases, they're finding that women are just more skeptical. They're questioning whether it's ethical to use these types of tools. One statistic I came across was that only around 27% of ChatGPT app downloads come from women. So if this trend of lagging adoption continues, I think it could contribute to widening the gender gap when it comes to pay and job opportunities.
Then another study that I came across was by the UN, the United Nations. They found that AI is increasingly being used in roles that are traditionally held by women more than by men, particularly in high-income countries. AI itself is not inherently designed to target women or women's jobs at all, but those jobs happen to be highly susceptible to automation by AI.
So things like administrative assistants, secretaries, customer service, data entry, these types of roles, which are traditionally held by a lot of women, are being impacted in full force by AI now, by automation. And I think this is worrisome if it's not dealt with, especially when we see that trend of men being quicker to adopt the technology than women.
I think we need to change that. And of course, we need to be more on top of the technology, to make these tools more ethical and more responsible, so they align with what female users would expect of them.
[00:18:56] Megan Bowers: It does make me a little nervous about these female-dominated roles shifting towards automation. So it's kind of an alarming study in my mind.
[00:19:04] Christina Stathopoulos: I haven't seen studies on it, but I wonder if there are also other roles that women traditionally hold that are very difficult for AI to ever automate, things like nursing. There might be some aspects that could be affected by AI, but the human side of it, the empathy that women are very good at, the people-to-people connection, these types of things,
I don't see AI being able to come in and replace. So there's the bad side that we talked about, data entry, secretaries, et cetera. But then I look at the other side, and I'm thinking about all of these other roles that women hold that, at least in my opinion, seem to be pretty protected from AI just because of the people element.
And that's something that women are very good at: caring for others, the people-to-people connection. So on one side, women might be protected. But on the other side, the roles that are at high risk are the ones that need to be dealt with, and I think we need to think about some mass reskilling to figure out where these parts of the population can be placed.
They can't just be left out, especially if it ends up being, like I said, more traditionally female-held roles, because that will of course have a much harder impact on the female part of the population.
[00:20:19] Creating More Inclusive AI
---
[00:20:19] Megan Bowers: Definitely. So I'm wondering how women and allies can support more inclusive AI, whether that's more inclusive tooling and access, whether that's more inclusive data sets.
What are some of the things that have come up for you on that?
[00:20:36] Christina Stathopoulos: There's lots of different things that we can do. Talking about all of this bias, keep in mind that it's not a very easy thing to tackle, first of all, but there are different things we can do to help minimize it. So first of all, inclusive design and development of these tools, always asking, "Who is missing?"
During product design, during data collection, auditing for bias, testing with diverse users, like the example I gave of facial recognition software. I had a student in one of my classes. She was from China and living in Spain. She came with a company from China that does facial recognition software and robotics, I believe, and they were expanding to the European market.
They have very good facial recognition algorithms in China; they're the leading company for this type of technology there. But as they expanded across Europe, they found that their algorithms were not very accurate, not performing well at all. Why is that? Because the models had not been trained on European faces.
The technology was developed originally on the Chinese population, and they hadn't spent as much time training on European faces, which also have a lot of diversity, because it could be Spain, it could be Germany, it could be, I don't know, Norway. So they had to rewind and restart some training on the European population before they could launch effectively in the European market.
So I thought that was an interesting use case.
[00:22:07] Megan Bowers: At that point, it's not just the right thing to be inclusive. You're also gonna miss out on business if your product isn't designed for this whole subset of users that you're trying to bring it to. So that's really interesting.
[00:22:21] Christina Stathopoulos: It literally matters for your business. If you want to be successful with your business, you need to consider all of the people that you want to cater to, all the diverse people.
So I think inclusive design and development is really important, as well as amplifying diverse voices: promoting work by women, promoting work by marginalized groups. Another thing I would say is investing in education, and also accountability. Education around how data and AI tools can be used effectively on the job.
Low-barrier learning opportunities, beginner-friendly workshops, sessions for employees to learn how they can use AI on the job. You also need to create safe learning spaces. This might mean creating a women-led group or another minority group, a safe space for learning, for sharing experiences, and for reducing intimidation.
Like we shared our experiences of being one woman amongst many men at work, sometimes it can be intimidating, so if you're brought into a room with others similar to you, you won't be as afraid to speak up, share experiences, and learn and grow together.
[00:23:31] Megan Bowers: Yeah, totally. I'll share too, like this podcast episode came out of one of those shared spaces at Alteryx where we have our women and allies group.
They do a great job of having open forums where folks can come and ask questions in more of a safe, small space dedicated to that. And in one of the discussions, it came up that we wanna do some more outreach on this, help women feel excited about data and tech and entry into the field, and answer some of the questions that people brought up.
So that was part of what prompted this episode. It's kind of a full-circle thing right there. But yeah, I really like your point about low-barrier learning too; giving options that work for everybody is important.
[00:24:19] Christina Stathopoulos: It'll make a big difference. And I was gonna say, you mentioned something about trying to get women to be more excited to work in data and AI.
I do want to encourage listeners, male and female, whoever you are: you should be excited to work in data and AI, because this is a very fast-moving field. There's lots going on, and you shouldn't feel intimidated. Many of us are a minority in some way or another. It doesn't have to be because you're female.
It could be because of your race, your culture, your age, whatever it may be. But all of us deserve a spot at the table. We deserve to have our voices heard, and I think there's not a more important field to be in today. Data and AI is where it's at, and it's where the future is at, and we need more diverse perspectives coming in and being a part of this, to make sure we have the future that we all hope for.
[00:25:06] Closing
---
[00:25:06] Megan Bowers: I think that's a perfect place to wrap up. So if our listeners wanna find you, your content, where can they follow you?
[00:25:14] Christina Stathopoulos: They can find me on LinkedIn. Just search for Christina Stathopoulos and you can follow me there. I share daily content on all things data and AI.
[00:25:23] Megan Bowers: Awesome. Well, it's been great to have you, Christina.
Thank you so much for joining and sharing your perspective.
[00:25:28] Christina Stathopoulos: Of course. Thank you for having me.
[00:25:31] Megan Bowers: Thanks for listening. To learn more about the topics in today's episode, including Christina's book recommendations, head over to our show notes at alteryx.com/podcast, and if you liked this episode, leave us a review.
See you next time.