
“We are on the hook right now. We are responsible. We’re responsible for awareness AND action!”

Vicki Krajewski, Executive Director of Talking Talent Online

The use of AI (Artificial Intelligence) in the workplace is becoming increasingly common. It's everywhere and in everything. Businesses adopting these advancing technologies may have the right intentions, to improve efficiency and inclusion, but there are heightened concerns that AI may actually be having a detrimental impact, especially with regard to DEI (Diversity, Equity, & Inclusion).

Vicki Krajewski is a digital product designer, former teacher, and writer who wants to help change our current trajectory towards online division and addiction. She worked in educational publishing through the industry's shift from stuffing CD-ROMs into the backs of giant textbooks to the clumsy concoction of the first 'eBooks' and online learning applications, and has since helped build digital learning practices for companies big and small around the world. Talking Talent is proud to have had her as Executive Director of Talking Talent Online since 2017.

Vicki joins us on this episode for an eye-opening discussion about AI’s growing influence on our work and personal lives. We’ll also explore ways businesses and individuals can do their part to ensure that new technologies are benefiting ALL people.

Want to learn more? Vicki has shared a list of resources!

Watch the interview

Enjoyed the episode? Don’t forget to subscribe on your favorite player and leave us a review.

Or read on for the transcript

Queing (she/her/queen): Welcome back to Voices. This conversation that we are having today, it's going to make you a bit uncomfortable. It's very eye-opening and it's very needed, though. We're talking about how we use artificial intelligence on an everyday basis in our personal lives and in our professional lives. We do this sometimes not even realizing the impact it's having, not even realizing that we might be contributing to the oppression of people, including ourselves. We're going to get into that in just a moment, but let me tell you about our guest. Her name is Vicki Krajewski, and her bio, I just love it, so I'm going to read it to you, okay? I know sometimes we don't like to be read to, but you've got to hear this.

Vicki is a digital product designer, a former teacher, and a writer who helps to change our current route towards online division and addiction. She's worked in educational publishing through the switch from stuffing CD-ROMs into the backs of giant textbooks (remember that?) all the way through the clumsy concoction of the first eBooks and online learning applications, and she has since helped build digital learning practices for companies both big and small all around the world. She's also been with us. We're very fortunate to have her at Talking Talent as the Executive Director of Talking Talent Online since 2017. Please welcome my colleague and, I'm going to say, friend, because I got to visit her, and we just enjoy talking together: Vicki Krajewski.

Vicki Krajewski: Thank you so much, Queing. Thank you, and indeed friend. Seeing people in person in the time of COVID is a rare thing. So, I feel that’s pretty special.

Queing (she/her/queen): You know, I have a personal question to ask you, especially as we head into this topic. Do you ever feel like a person holding a sign saying the end is near, the end is near, with, I don't know, weird stuff on it, in the middle of a busy intersection or something, and everybody's walking by saying, okay, I think her name is Vicki? She's always talking about… Because some of the things that you're saying, it's almost like, really? And we've all known those people who take the tape or the Post-it note and put it over the camera, or you may hear people say, Alexa might be listening, and some of that stuff we find out to be true. But as a person who is so passionate about this, when it seems so obvious to you but doesn't even seem to be a flag on the play for most people, do you and the folks in your field sometimes feel like you're othered?

Vicki Krajewski: This is such a great question. I've just lived my life that way in some aspects, and so I'm willing to take that role in any given situation, and I actually walk into that role in any given situation, because I'm a creative person. I see a different perspective, but also personally, being LGBT and an immigrant and having lived around the world, I have these experiences that do give me a perspective that is other than the majority a lot of the time. I'm good with that. It's really interesting. In our prep, we were talking about Safiya Noble, the author of the book "Algorithms of Oppression." I was just reading an article from her yesterday where she was saying that exact same thing. She also had a kind of non-linear path in her career, which is what she was describing. She had to take care of both of her parents, and she did not finish her Ph.D. until she was in her forties.

Her Ph.D. was about peeling back the lid on what these algorithms are doing, and she talked about how, during her entire study, even just putting the thesis together and saying, this is what I want to look at, she kept running into that wall. She kept running into people saying, it's math, it cannot be biased because it's numbers, and she said, that's false. But thank God that she persisted, because "Algorithms of Oppression" is one of the most amazing books you can read. It details her work, but in a really accessible way, a way that can be accessed by the public. You don't need to be scared of that language. You don't need to leave it to a techy, and I think that's part of the problem: we think that's for somebody else to sort out. We are the frogs in the water that keeps getting hotter and hotter. We're living this every day, and AI is being introduced at such speed. We wake up and we are living AI. It's on our phone, it's in our water. It determines the systems that deliver utilities, and whether I can sign up, whether I can get a lease, whether I can get a job, whether I am considered a criminal or a risk. If I go to the airport, am I a risk? That's done by AI. There's another really wonderful leader in this space named Sasha Costanza-Chock, who is transgender, and who will be flagged at the airport every time she goes, because once she gets the scan, her body doesn't conform to the norms, and so they go, suspicious.

Queing (she/her/queen): Interesting.

Vicki Krajewski: And it goes to the very definition of, Are you a person or not?

Queing (she/her/queen): Before we get even further into it, I want to say to you, the viewer: this is exactly what we're talking about. Sometimes when these topics are presented as they relate to technology, digital ethics and things like that, they're presented in a way that says you should be alarmed, you should be afraid. That's not what we want to do today. Well, first of all, part of that is what we want to do: we want to open your eyes to the fact that this is something you need to be aware of, but then also let you know that there are ways that you can begin to turn this in a direction, at least based on your own usage and the way that your organization is using AI and technology, that's going to be beneficial, that's going to be ethical, and that's going to also be humane. Now, there are some things that we are doing every day, Vicki, that seem very helpful. They help us do our jobs quicker and more efficiently, and as purposeful as these things are, they are at the same time, although it's not very obvious, a bit harmful. Can you talk about what some of those tools are, especially ones that we are using in the workplace?

Vicki Krajewski: Yeah. Yeah, and I think that's such a brilliant point, and thank you for making it, because I will go spout all these things, but it isn't Armageddon and doom. It is about awareness first and then action second. This is the situation. The weird part is the detachment, because it all happens under the bonnet, under the hood. I've lived in the UK and the US, so I'll say it both ways. It happens in this kind of box that we're not looking at. So, to open that box and say, what's going on in there, when, as you're saying, it's really easy to just push the button and continue on. Right. I would invite people to stop and reflect, and that's something that Talking Talent does in every context of our coaching and the work we do. The work we are so fortunate to do with our clients and with coachees is to say, let's stop, let's have a pause, let's think. Let's look at what's happening here and then think about what do I want to happen, and how do I get there and make that happen instead? So, yeah. Yeah. I'm sorry, I went off track and you had asked a…

Queing (she/her/queen): No, that's okay. You know, I was thinking about the things that we use every day. So, most of the people who are tracking with us right now, we're on LinkedIn every day. We're catching up on articles that way, following thought leaders in our industry. We might refer to Twitter. We might catch up on Facebook or Instagram, excuse me, not Facebook, Meta. The last time we talked, the Metaverse was on the horizon; now it's a thing, all that kind of stuff. Teams, we're doing this, we're Zooming, doing all that kind of stuff. So, we're using these things, but we're not thinking that we might be contributing to oppressive algorithms, for example. Or maybe, depending on where you live, you might be thinking about warmer temperatures. So, you might think, okay, let me go on Airbnb and see what's available to me in a warmer climate, not realizing that with all of these different tools that we're using, we're creating a little bit of a monster. But I'm going to let Vicki talk about it.

Vicki Krajewski: Yeah, I see your question. I see your question, and part of the issue is that the problematic designs and approaches that are embedded have been created by Facebook, when it was Facebook, back in the day of 2021 and whatever, have been created by Google, have been created by Twitter, and by features that they've introduced, quite recently, really. We got emojis and the like button in 2009, but now we can't imagine life, even a work meeting, without emojis. Work texts have a big happy face on them and those things. These changes happen so exponentially, and even though you can say, oh, well, that's Facebook and Facebook is evil, I don't want to go there, because it's the design and it's the premise. It's almost like a belief that we have that we need to question: what do I think this technology is able to do for me that it's actually not? That's where we pause and ask questions.

So, where are we reliant on technology, and then where are the risks inherent in that technology? I was putting together an application, a kind of awards application webinar, and I was talking about the work that we did and how we designed Talking Talent Online, and I was trying to express why it was important that we designed it in a different way, and I couldn't get my brain to do the presentation. I was like, what's going on here? I stopped and I realized I had Teams messages popping up with little red bells and notifications. Inside of Teams, I had multiple streams going, and alongside that, I had Zoom, and then I had emails, and then I had my phone, and I have a WhatsApp group on my phone. When I stopped and looked, I was like, well, no wonder I cannot function right now. I actually can't function.

I had to turn everything off, and then I said, great, I can use that in my narrative, and I created a really horribly busy slide that was flashing on 10 different levels, and it sort of illustrated the point. But the researched fact is that your standard installation, the build a general, average company gives you when it hands you a computer and configures it, is designed to interrupt you every 40 seconds. Every 40 seconds, and we're so bathed in it. We just take a computer and we go, go, go, go, go. Then if we're feeling like nothing's working, we just work longer hours and grind it out. We don't stop, because in that system we're not encouraged to stop. You're never encouraged to stop. You've got to take that for yourself and do it and go, wait, why am I working 20 hours a day? That's not sustainable. You're not going to get the best me and my best output. But why is that happening? You will find that technology and those designs are a big contributor, because there's no way you can be interrupted every 40 seconds and function.

Queing (she/her/queen): Can you imagine? That is it. I put my phone on do not disturb for our conversation, but as you were talking, I was thinking about how, if a little popup comes over here in the top right corner, yes, I can continue my conversation with you, but my mind is then also shifting over here, like, I wonder what that's about, and then another one. I see my email thing is illuminating over here too, and then I see the little red circle with seven in it, seven unread messages on Teams. It's like all of these things are vying for your attention. Yes, you could just turn your notifications off, but see, that is not the point that we're making here. One of the things that Vicki opened my eyes to is that it's not so much the things that exist right now in their current form that we need to control, just turn your notifications off, or that there should be something regulating the way that these things are affecting us on a psychological level. What I'm learning from you is that the issue is in the design. I made myself a note to bring this up with you: the issue is that there's no regulation of the design itself. We use the term the wild-wild-west out here. It's literally, like you said, Mark Zuckerberg, a kid in his dorm room, or the people who created Zoom, some gamers hanging out, and they do this thing. There were no checkpoints to make sure, like, okay, we have this. Oh, we can't do that. Okay, let's change that. No. So, it just is what it is, and then we're out here using it, not realizing that we have created something that interrupts us every 40 seconds and has that addictive thing. You mentioned the influence of Twitter on things like LinkedIn and other office tools.

Vicki Krajewski: Yeah. Yeah. Excuse me. You will find that. I was beginning to make that point but didn't get all the way there. So, the big, massive tech companies, and there are a handful, a tiny handful of them, and you can name them, Google, Twitter, Facebook. Those giant, giant companies are designed into an economic system where they make money based on keeping your attention. I work with an organization called the Center for Humane Technology, and there are other organizations, but they've been at the forefront of creating public awareness about this. You might have seen the film on Netflix called "The Social Dilemma," and if you watched…

Queing (she/her/queen): I’ve watched it. Yeah.

Vicki Krajewski: Yeah. If you want a quick primer, that's a good place to go and sit and go, aha, I get the picture. But those companies have set a mold. They have cast a die, because everybody got really used to that, and then they were like, I like the like button. I love how that makes me feel. But their intention, and this came out in the Facebook Files, which were unraveling at the end of last year. That was the Wall Street Journal reporting what happened in 2021 with the Facebook whistleblower Frances Haugen and the files that she leaked to the press, which wound up in a congressional hearing. They knowingly designed for addiction, and then the very unfortunate thing is that everybody loves it, because it's like, oh, here's some heroin. It's literally…

Queing (she/her/queen): Right. You know that gives me chills just to think that they did this on purpose, knowingly.

Vicki Krajewski: However, these are the facts, and the thing is, let's understand the environment that we're in so we can operate better inside of it. So, they've done that. The problem is, well, if that was just in Facebook, in social media, you could say, okay, I'm going to call my mom instead of posting a photo for her. I'm going to send her a postcard. You could do that. But the problem is, all of those things have been so, ooh, I love it, it makes me feel, it gives me a rush, and when you're working in a corporate environment, when I'm working in tech, in corporate, I have all of my stakeholders, all of my buyers going, what's the engagement rate? That very question comes from an addiction model. You're thinking, we want our user to be a rat who just keeps clicking, and then asking me what the engagement rate is. So, this is part of how we need to shift the conversation, and it's about understanding and being reflective. What do I actually want? What's the outcome that I want? Really, we get so spun up on how many clicks, how many clicks, and all you have is a rat in a cage clicking. That doesn't accomplish anything. It doesn't show anything. So, we need to refocus ourselves, because if you are asking for those things, you're asking, where is the feed? Oh, there's not enough content. I need endless content.

These are all the design principles. So, it's about going back and understanding which of those design principles come from an addiction model and saying, can we do this a different way, a better way? That's what we had to do. I was tasked with creating a coaching platform for Talking Talent, which is a coaching company. The whole mission of the company is wellbeing and inclusion. So, they were like, well, can you just buy something off the shelf? I tried so hard, because obviously it's harder to build something than buy something. I looked, I looked really hard, and I said, I really want to find the right product. It didn't exist, because all of those practices have become so embedded, because they're in demand. I want this, I'm looking, I want a learning management system for my company. I work in the L&D, learning and development, department, or organizational development, or HR. I want learning management and we need that. So, I'm going to look, and what's a good system? I'm not a techy specialist, so I'm going to go off the top of my head: what do people like? What do people use? What's the popular thing? What has good reviews? And then I'm going to buy that, and then we end up replicating the systems, literally the algorithms of oppression that Safiya Noble talks about. So, yeah, yeah.

Queing (she/her/queen): I'd actually like to, and we won't name any names, but I'd like to show folks what that looks like when you bring that into your organization, not realizing that that system is already biased coming in, even though you may have the best intentions as an HR leader, as a DEI professional. The very tool you're using might itself, even at factory settings, be showing you something within your organization that you don't want to see.

Vicki Krajewski: I mean, we know that algorithms are biased. There's a researcher from the University of Michigan, and you just have to go talk to any skilled data scientist, and you will hear the same thing from any of them if you ask this question: is an algorithm biased? Yes. The problem is, you won't see the bias and how it's operating until it's in situ, and then you can go back and fix it. But we are too early. We are pushing things into play that haven't been in a sandbox long enough, because there's just this huge incentive to innovate and deliver and deliver. I mean, whiplash from the change in technology. I was just talking about 2009; that's when we got the like button and the emoji and things like that.

Whiplash from that speed, and now we are in a place of self-driving cars, where the algorithm in a car has seen a person in a wheelchair, or a person with a bicycle, and decided that that is a vehicle and treated them that way, and there was actually a fatality. It was, I believe, the first fatality involved in a self-driving car incident. The person was pushing a bike, and the algorithm went, that's a vehicle; a vehicle will continue moving forward like this. No, a person pushing a bike acts in a different way, and the car smacked into the person, and unfortunately, that person died. The consequence is that the algorithm didn't even identify a person as a person, and then we're talking about, all right, anyone in a wheelchair is not a person, in the Metaverse, in the tech world, in my self-driving car. I don't want to name a brand because there are a lot of brands; I shouldn't even call out Tesla. It is a market thing. It's a culture thing. It's an economic thing. I'm not pinpointing one brand, but this is why it's so very important.

When you look at training, when you look at opportunities inside of corporate: who's getting the opportunities, who's called out as succeeding, and who is identified as not succeeding? When you look at future talent programs: where did that come from? Where did it come from, and where in that process are we relying on technology, and where in that technology are there inequities, and where is that hidden? We have to, as individuals within this system, ask that question, because technology has gone too quick and regulation hasn't kept up with the innovation. It's just gone too quick. It's a different model. The models that we used to keep publishing safe were slander and libel, those two things. Then there was a bill passed that said, no, we do not regard a media platform as publishing; we're going to hold them to a different standard. This is in the Communications Decency Act, which was passed in 1996 in the United States, I'm talking about.

Queing (she/her/queen): Okay.

Vicki Krajewski: And there they said, no, you're just a moderator, and therefore you are not responsible for anything that people put in your system, because that's them. That has led to the inability to even hold people to a standard of truth, actual baseline truth, or to account for harassment. All these things happen online legally now…

Queing (she/her/queen): Right.

Vicki Krajewski: …and we were never in that situation. Yeah.

Queing (she/her/queen): So, in other words, social media is not considered media.

Vicki Krajewski: Yeah, yeah, they're considered a moderator, a platform, and it falls into a different regulatory milieu. So, I'm not in Congress, I'm not in parliament. I don't make those decisions, and that is a weedy situation, because you have to protect free speech. I mean, that is not clear-cut. It is not like, bang the gavel and we are done here. It's a difficult, difficult situation. But our situation is, we have to say, okay, this is the world. How do I make the best choices within this environment? If my aim is empowerment, if my aim is fairness, is equity, is leadership, what do I do? What do I do? And there are things you can do. There are things you can do as an individual right now. There are things that are so important to bring to the table, if you're at that table inside your company, to say, this is not a place where we can let computers do that yet. Or, if we are going to post this job on LinkedIn, we need to go to a diversity job board, at least one, if not five, because otherwise we're going into the same pool. There are algorithms driving who we see, and we are replicating what we have the responsibility to recognize right now in this world as systemic oppression. We have that responsibility. It's easy to put the blinkers on, put the blinders on, and say, I'm just doing my job. It's not.

Queing (she/her/queen): Or the computer did it. I want to look even further into this thing where you talk about employers and the language that they use, not just the boards that they put their job description on. You've talked about how employers may not even realize that the way they describe what they're looking for, the language they're putting in there, is going to dictate who comes up. There may be diverse talent that they wouldn't even see based on the words they put in there. It's going to show you certain folks and it's going to shut out other folks based on even the words you put in that job description. Can you talk about that a little bit more too?

Vicki Krajewski: I mean, that's about an algorithm-driven process where it's a machine going keyword, keyword, and there's so much jargon. I mean, I work in tech, and it's notoriously one of the areas that are unbalanced if we talk about inclusivity and gender and race, and there are, analogously, women-in-tech groups and things like that. Black in AI, that's a group that Timnit Gebru co-founded, who left Google last year, I believe it was, under circumstances of having raised some of these issues and feeling very pushed out. There are groups now that are working to raise awareness, which is great, but I think for every person in a company, it's our responsibility to check in and listen on those things, and to say, okay, yeah, even in just writing a job description, that's the whole question of inclusive language and being reflective about what words I am putting in. What am I teaching the machine, or am I just part of this machine? Am I just rolling and not thinking about it? Let's not ourselves become a machine; let's stop and think and make those choices mindfully. A minimal sketch of what that keyword matching looks like follows below.
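
To make that concrete, here is a minimal, purely hypothetical sketch of the kind of keyword matching Vicki is describing. Every name and keyword below is invented for illustration; real applicant-tracking systems are far more elaborate, but the failure mode is the same: candidates who describe the same skills in different words never surface.

```python
# Hypothetical illustration: a naive keyword screener.
# The jargon below stands in for terms lifted from a job description.
KEYWORDS = {"rockstar", "ninja", "growth-hacking", "disruptive"}

def score_resume(resume_text: str) -> int:
    """Count how many job-description keywords appear in the resume."""
    words = {w.strip(".,!?").lower() for w in resume_text.split()}
    return len(KEYWORDS & words)

resumes = {
    "Candidate A": "Growth-hacking rockstar and disruptive ninja.",
    "Candidate B": "Led a team that doubled sign-ups in a year.",
}

# Candidate B may be the stronger hire but scores zero: same skills,
# different vocabulary, so the screener never surfaces them.
for name, text in resumes.items():
    print(name, score_resume(text))
```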

Queing (she/her/queen): I like this question you pose, and pardon me if I mentioned it already, but it sticks with me. You said we have to ask ourselves the question: where is it okay to trust the computer? Maybe you need to explicitly share that with us. So, it's okay for your calendar or whatever the case may be, but then there are other things that we cannot rely on AI to do. To share a personal example: I've been seeing a lot of ads for a new technology called Jarvis. Not sure if you're familiar with this, and I'm very interested in it as a writer, but Jarvis is an AI writing assistant. Basically, what you would do is give Jarvis an idea. So, this past Monday was the celebration of Dr. Martin Luther King Jr.'s birthday and his legacy. A couple of weeks ago, I knew I wanted to write something about leadership lessons that we can glean from Dr. King. I guess I've got queen on my mind because I'm a queen, but leadership lessons that we can glean from Dr. King's quotes. So, I'm like, okay, let's see what Jarvis has got. I'm going to do this free trial. You can sign up, and I probably shouldn't be mentioning this by name, but this is nothing against Jarvis. I'm just using this as an example.

Vicki Krajewski: There are, to be fair, a million pieces of software in this language-recognition and language-modification space.

Queing (she/her/queen): Oh, okay. Yeah, yeah, yeah. So, what happens is, with this AI writing assistant, it's not so much that it writes for you; it's meant to give writers good prompts and share something with you. Dr. King said so many transformational things, so I'm thinking I could save some time by asking it to pull quotes specific to leadership, versus his quotes on relationships and love and community and all that. So, I put in: quotes by Dr. King about leadership, and it takes that input and comes out with these things. Of the quotes that it pulled, I really liked them. I thought that they were good, but being a writer, I'm a researcher by nature. I was looking at them thinking, okay, I'm familiar with this.

I'm familiar with this one. Wow, I didn't realize he said that. So, I looked these quotes up to double-check, and not all of them were from Dr. King. They were great quotes about leadership, but I thought, what if I was someone who just trusted that this thing was going to do what it said? It reminded me that this is a technology meant to assist me. It's not meant to actually do the work. It doesn't have a mind. So, what it did was pull some quotes by Dr. King, and it also pulled some quotes that were in line with the types of things that Dr. King would say.

Vicki Krajewski: Yeah.

Queing (she/her/queen): That’s just a small example, but there are times when we do rely on technology to do things that we should be using our mind and heart to do instead, and I think that’s an issue when we just leave it to the robot to do.

Vicki Krajewski: That’s so beautifully put and it’s an excellent example and I love that as the rudder. How do I decide if tech is okay here or if I need to show up? Because the seduction of tech is that I push a button and I go away for a long time.

Queing (she/her/queen): Right.

Vicki Krajewski: I was working in an agency, and I would have people come to me wanting a website, and they would then want that website to run itself and be their business, and I'm like, you still need people. You can't just push a button and then walk away. You need humans, and yeah, it sounds so much like the Wizard of Oz, but the Tin Man doesn't have a heart. It sounds so ridiculous, but what are we entrusting to an algorithm when we need to be reflecting on it ourselves? Because we know that algorithms are biased, we know that there's misinformation online. So, anything that is learning online is going to just indiscriminately…

Queing (she/her/queen): Oh, yes.

Vicki Krajewski: …take that misinformation…

Queing (she/her/queen): Got it.

Vicki Krajewski: …and barf it back up at you. I mean, bots are everywhere now, like the chatbot, the thing that pops up, sort of ubiquitous on every website. The first chatbot that Microsoft introduced, that was in 2016, I believe, was meant to talk to people. It became racist. They had to take it offline, and that whole thing unfolded in 24 hours. It was becoming abusive and racist.

Queing (she/her/queen): In 24 hours!

Vicki Krajewski: They took it offline within 24 hours because it became abusive and racist so quickly from what it was crawling and taking in. There is no discernment. So anything that requires discernment, like driving a car and deciding, is that a mailbox or is that a human?

Queing (she/her/queen): That’s what I mean by are we creating the monster because we can’t say, well, oh look what it did when it doesn’t have…

Vicki Krajewski: No.

Queing (she/her/queen): …a mind. I was sharing with Vicki before we got on, y'all, about, how should I call it, an online company that helps people find lodging in different places in the world, and they found out the hard way that their system, the technology they were using, was incredibly biased. So, names like Krajewski, names like Jones, were not coming up at the top and were not being accommodated in the same way that names like Stewart might be. Names like Emily versus Erica. Names like Maximillian versus Montana, or something. I read an article about what's happening with that case, and you think. I was asking Vicki, how does that happen? Because the actual owner of the property is not going through and thinking, Jerome Johnson, no. Erica Jenkins, no. Vicki Krajewski, no. It has become its own thing, and it's deciding for you, almost.

Vicki Krajewski: That's the scary thing. I mean, when I signed up for Spotify, the music service, it's another example. I thought, oh, I love this, and this is genius, and it's amazing, because I'm getting all this music that I didn't know about, and I like it all. In about three months, I was like, I am in a rabbit hole, and I'm only seeing music like the songs that I already have, and everything sounds the same. Think about that metaphor. That's how algorithms work: you liked that, so I'm going to give you more of that. Think about that in the context of inclusion.
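
As a purely illustrative sketch (invented names, not any real service's code), the "you liked that, so here's more of that" loop Vicki describes can be this small, which is why it narrows so quickly:

```python
import random

# Hypothetical sketch of a "more of what you liked" recommendation loop.
# taste is the system's model of the listener: one weight per genre.
taste = {"jazz": 1.0, "folk": 1.0, "hip-hop": 1.0, "classical": 1.0}

def recommend() -> str:
    """Sample a genre in proportion to accumulated positive signals."""
    genres, weights = zip(*taste.items())
    return random.choices(genres, weights=weights)[0]

for _ in range(200):
    genre = recommend()
    if genre == "jazz":      # the listener happens to like jazz, and every
        taste[genre] += 1.0  # "like" makes jazz more likely to come up next

print(taste)  # jazz's weight dwarfs the rest: the rabbit hole in action
```

Swap genres for job applicants who resemble past hires, and the same feedback loop becomes a hiring filter, which is exactly the transposition Vicki asks us to make next.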

Queing (she/her/queen): Yes.

Vicki Krajewski: Think about that in the context of inclusion, and now transpose it to a job selection process, or to, I'm going to choose who I'm renting my apartment to.

Queing (she/her/queen): But we can’t let ourselves off the hook. That’s the part that disturbs me because it would be too…

Vicki Krajewski: Yeah, yeah, yeah.

Queing (she/her/queen): Right. It would be too easy to say, well, I didn't discriminate against them. I never saw their application, or by the time that I did, I had already accepted Mary Catherine.

Vicki Krajewski: Absolutely.

Queing (she/her/queen): At what point do we then look? So now what has to happen? Now more technology has to come in to make you do something.

Vicki Krajewski: No. That's the favorite: fix the tech with more tech.

Queing (she/her/queen): So, it's like, why couldn't we have just been more discerning? I don't know what the answer is, because at the same time we're thinking about convenience. What I was sharing with Vicki, and I'm repeating it for people who may not be familiar: this particular company, which you've probably figured out, what they're doing now is just going by initials. So, it's just going to say QJ, it's just going to say VK, and that way you won't be able to know who the person is on the other end. I'm thinking, is this what we have come to, that we have to just be letters and numbers to keep humans from doing human stuff against us?

Vicki Krajewski: Yeah.

Queing (she/her/queen): We have to let the robots decide. I'm thinking about kids who don't want to solve a problem on their own, so they go to an adult. She won't let me play. They won't let me see the thing. So, we're going to the robot and saying, they won't let me stay here. Make them let me stay here.

Vicki Krajewski: Yeah.

Queing (she/her/queen): They won't let me get an interview; make them let me have an interview. And that is a cry that does need to be addressed. That is not okay. But I almost feel like we're running to the robots to help us solve these things. So, thank God for the work that you're doing, and the work of, you'll have to remind me of the name, the Center of Humane Technology?

Vicki Krajewski: Center for Humane Technology, and there are so many other organizations right now, and it's starting in the UK too; there are government organizations. UNICEF, at the end of last year, came out with guidance to say technology is not aligned with our treaties, our international human rights treaties, the Declaration of Human Rights…

Queing (she/her/queen): It’s huge.

Vicki Krajewski: …to protect children's rights. So, UNICEF has just come out with a policy document, and there are six principles in there that are meant as guidance. So, there's change and it's coming. But you're so right, and I'm so happy that you flagged that we are on the hook, because we are responsible. We are responsible for awareness and action in this context. If I'm sat at that table, I need to inform myself if I'm choosing a software. Not even choosing a software: if I'm using software at work, and all of us use it, what's happening there? Then what can I do to mitigate, ameliorate the harms that are inherent? How can I step outside? How can I add a human decision point in this process? How can I…

Queing (she/her/queen): Human decision point, I love that.

Vicki Krajewski: How can I use other systems outside of this? How can I intervene? Where do I? When do I? We can all ask ourselves those questions. We shouldn't be disempowered in this situation at all; the very opposite. I hope people come away from listening to this going, there is so much I can do here that is going to turn that tide on inclusion. There are so many things, so many choices that I can make every day, for myself, for my kids, for my team. There are things that I can do. There are choices I can make, and it might take longer. It might take longer, but we are at a point in history where we need to decide, well, what are our values? Is speed the value? Because if you want to run into the wall really quickly, if you want to go off the cliff really quickly, what is your outcome?

So, speed cannot be the only value. Profit cannot be the only value. We need to reflect on that, come to the table and say, I'm going to interrupt this and it's going to take 10 minutes longer, and we need to feel empowered to say that, because we can see the consequences mounting. When the UN is involved, when the US Congress is having hearings, when the research just keeps rolling in about harms to your wellbeing, to inclusion, to physical life and limb. I mean, we're not talking about a tiny little thing here that it's okay to let somebody else think about. This is for every one of us, every one of us. Yeah.

Queing (she/her/queen): Thank you so much for that part, Vicki. Yeah. We can be interrupters. We are not subject to this. This is not, what do you call those? This is not "Black Mirror," which I love, by the way. This isn't some sci-fi thriller where we all have to go live off the grid in the woods somewhere and only use Morse code to communicate. This is not that situation. We are not subject to AI. We're not subject to it being off the rails in some places. We all have a part to play, and there are things that we can do bit by bit. I like that you said there has to be a human decision point. We can be that interrupter.

Vicki Krajewski: Yeah. Yeah. You've got 100 opportunities every day to make a choice around this, and it's just about jumping in and learning. If you want a starting point, watch "The Social Dilemma." It is such a good starting point to get your feet in the water and get a grounded understanding of what's happening and what the dynamics are. Then the Center for Humane Technology's website is another really good public awareness resource. But there's so much other work going on, so many fantastic people in this area, in what is an emerging new practice of ethical technology and design justice: what am I making, who am I making it for, and therefore who has to be at the table making it with me? All those things are so important, and they come down to every one of us, and we can make that difference.

Queing (she/her/queen): Well, thank you so much for being here with us today, or this evening, whenever you're watching this, and we will see you next time. Thank you so much, Vicki. I appreciate you.

Vicki Krajewski: An absolute pleasure, and thank you. Thank you.

Sign up here and we'll let you know when the next episode is live.

Find out more about our work in diversity, equity, and inclusivity.

Follow us on LinkedIn and Twitter

Listen here

Episode #22

Is AI (Artificial Intelligence) Helping or Harming DEI?