WEBVTT

00:00:00.375 --> 00:00:01.376
Shelley, welcome.

00:00:01.376 --> 00:00:04.379
Is the rise of disruptive technologies, particularly

00:00:04.379 --> 00:00:08.008
AI, impacting on gender equality?

00:00:08.091 --> 00:00:10.969
Well, you know, listen, I think that there's bias in tech,

00:00:10.969 --> 00:00:12.470
and especially in AI.

00:00:12.470 --> 00:00:13.388
And I think one of the things

00:00:13.388 --> 00:00:17.475
that we have to work really hard on is how do we give more

00:00:17.559 --> 00:00:18.351
visibility?

00:00:18.351 --> 00:00:19.978
We have to make the invisible visible.

00:00:19.978 --> 00:00:21.146
And right now,

00:00:21.146 --> 00:00:24.774
one of the things that we're working on is the algorithm for equality.

00:00:24.858 --> 00:00:28.278
And, you know, we coined that phrase, because we definitely have to get

00:00:28.278 --> 00:00:33.408
some of the junk out of the trunk and put more women in AI

00:00:33.408 --> 00:00:37.162
and flood the internet with more profiles of women.

00:00:37.245 --> 00:00:40.206
It may sound like a really obvious question, but why does it matter?

00:00:40.206 --> 00:00:42.792
Why is that important? Why do you need to do that?

00:00:42.876 --> 00:00:44.627
Well, there's so many great

00:00:44.627 --> 00:00:49.132
women in tech and we're just not profiling them enough.

00:00:49.132 --> 00:00:52.594
And I think we've been sharing the voice of men

00:00:52.635 --> 00:00:57.098
so often that the women are just not rising to the top.

00:00:57.098 --> 00:01:01.227
And if we don't put more profiles of women, especially, look at Wikipedia.

00:01:01.227 --> 00:01:06.566
There are so few women in Wikipedia and so few stories shared that

00:01:06.566 --> 00:01:10.111
if you search CEO, you get white men

00:01:10.195 --> 00:01:13.740
when there are so few women CEOs.

00:01:13.740 --> 00:01:17.994
So we really have to tell more stories and we have to put them out there

00:01:17.994 --> 00:01:21.623
more and profile them more so that there's not the bias

00:01:21.623 --> 00:01:23.958
in AI. We'll get onto that in a second.

00:01:23.958 --> 00:01:25.335
But are we getting better?

00:01:25.335 --> 00:01:27.378
It's a very different scene from when I was young

00:01:27.378 --> 00:01:29.631
and thinking about what I wanted to do for a career.

00:01:30.090 --> 00:01:31.716
We have a long way to go.

00:01:31.716 --> 00:01:34.803
We do have a long way to go, and I think that we need

00:01:34.803 --> 00:01:37.806
to put more women in

00:01:38.014 --> 00:01:42.143
and get to equal before we can share equal voice.

00:01:42.185 --> 00:01:45.897
So let's talk about the impact that having that kind of gender disparity

00:01:45.897 --> 00:01:48.900
has on the actual tech, the creation of the tech,

00:01:48.942 --> 00:01:51.903
because it surely has an influence on bias?

00:01:51.903 --> 00:01:54.572
Well, bias in, bias out.

00:01:54.572 --> 00:01:56.741
And even think about a lot of tech development.

00:01:56.741 --> 00:01:58.993
I mean, you're not wearing high heels today.

00:01:58.993 --> 00:01:59.828
Neither am I.

00:01:59.828 --> 00:02:01.538
But I mean, think about escalators.

00:02:01.538 --> 00:02:03.706
I don't think those were made

00:02:03.706 --> 00:02:05.375
by women.

00:02:05.375 --> 00:02:08.294
I mean, my heels get caught all the time. You know how dangerous that is?

00:02:08.294 --> 00:02:10.421
Or think about airbags in cars.

00:02:10.421 --> 00:02:12.715
Number one fatality is women.
00:02:12.715 --> 00:02:15.718
I mean, they were made by men and tested on male dummies.

00:02:15.844 --> 00:02:17.470
Or think about seatbelts.

00:02:17.470 --> 00:02:21.349
I mean, when you put that seatbelt on, is that comfortable on your boobs?

00:02:21.599 --> 00:02:22.934
Not comfortable. No.

00:02:22.934 --> 00:02:25.895
You know, so, I mean, everything that... We need

00:02:25.895 --> 00:02:30.483
diversity in innovation on every level, because,

00:02:30.525 --> 00:02:34.529
I think that's why we talk about diversity and innovation, not just to have

00:02:34.779 --> 00:02:38.616
diversity for diversity's sake, but diversity is good for business.

00:02:38.700 --> 00:02:43.204
So what should leaders be doing then to further gender equality?

00:02:43.288 --> 00:02:44.080
We should be

00:02:44.080 --> 00:02:48.251
hiring for diversity at every single level.

00:02:48.251 --> 00:02:51.880
And it's not just diversity because it's the right thing to

00:02:52.046 --> 00:02:54.424
do, diversity is good for business.

00:02:54.424 --> 00:02:56.593
Shelley, thank you so much. Thank you so much.

00:02:56.593 --> 00:02:57.468
That was fun.