WEBVTT

1
00:00:00.125 --> 00:00:03.378
I'm Manveen Rana here in Davos, and I'm joined by Jessica, who's

2
00:00:03.378 --> 00:00:06.589
been looking at AI, which is one of the biggest themes here this

3
00:00:06.589 --> 00:00:06.840
year.

4
00:00:07.674 --> 00:00:11.928
Tell us, in particular, what is the thinking in terms of what AI

5
00:00:11.928 --> 00:00:14.014
might mean for gender diversity?

6
00:00:14.514 --> 00:00:17.934
Yeah, there's been a lot of talk about the representation of

7
00:00:17.934 --> 00:00:20.687
women among the people who are developing, coding, and

8
00:00:20.687 --> 00:00:21.479
regulating AI.

9
00:00:21.479 --> 00:00:25.525
I actually think there's a whole other impact of AI that is not

10
00:00:25.525 --> 00:00:29.112
talked about enough, which is: what does it do to female

11
00:00:29.112 --> 00:00:30.030
representation?

12
00:00:30.363 --> 00:00:34.117
If you think of your virtual experience, all the avatars, all

13
00:00:34.117 --> 00:00:37.704
of the voices you're interacting with, they're all female.

14
00:00:38.121 --> 00:00:39.539
They're overwhelmingly female.

15
00:00:39.748 --> 00:00:42.250
And when you do user tests, that is what people want.

16
00:00:42.250 --> 00:00:46.421
They actually want to interact with female agents in a massive

17
00:00:46.421 --> 00:00:48.923
way, digitally, and at the same time,

18
00:00:48.923 --> 00:00:51.718
there's a very negative effect of that from a diversity

19
00:00:51.718 --> 00:00:52.260
standpoint.

20
00:00:52.469 --> 00:00:56.347
Basically, everybody's ordering females around all the time.

21
00:00:57.432 --> 00:01:00.685
How will that impact the way people think about the role of

22
00:01:00.685 --> 00:01:01.686
females in society?

23
00:01:02.103 --> 00:01:05.190
That is a theme that is totally absent from the discussion, and

24
00:01:05.190 --> 00:01:06.900
I wish it were much more prevalent.

25
00:01:06.941 --> 00:01:07.776
You're right,

26
00:01:07.776 --> 00:01:10.028
I don't think people think about it, but it's Alexa this, it's

27
00:01:10.028 --> 00:01:10.403
Alexa that.

28
00:01:10.653 --> 00:01:11.780
What do you want to see?

29
00:01:11.780 --> 00:01:13.615
Should there be more male voices

30
00:01:13.615 --> 00:01:14.449
when you're asking for help?

31
00:01:15.033 --> 00:01:16.242
Well, there are two ways to go about that.

32
00:01:16.242 --> 00:01:19.746
You can do more male voices, or you can try to aim for more

33
00:01:19.746 --> 00:01:23.124
robotic voices, and it's actually a real trust question.

34
00:01:23.124 --> 00:01:27.045
Do you want the voices that are there, and the embodiment of the

35
00:01:27.045 --> 00:01:30.757
avatars that are used, to look human-like, or to be what they

36
00:01:30.757 --> 00:01:34.344
are: robots, so that you can identify when you're actually

37
00:01:34.344 --> 00:01:36.304
interacting with a human or not?

38
00:01:36.846 --> 00:01:39.641
That is the type of question I think corporates should be

39
00:01:39.641 --> 00:01:40.517
asking themselves.

40
00:01:40.809 --> 00:01:44.687
They're part of what I call the micro-ethical decisions of AI.

41
00:01:45.146 --> 00:01:48.817
And today, the reality in most corporations is that they're taken in a

42
00:01:48.817 --> 00:01:52.153
very decentralized fashion, whereas I think they should be

43
00:01:52.153 --> 00:01:54.155
central to corporate strategy.

44
00:01:54.572 --> 00:01:56.616
Jessica, that's absolutely fascinating.

45
00:01:56.616 --> 00:01:58.118
Thanks so much for joining us here.
46
00:01:58.243 --> 00:01:58.827
My pleasure.

47
00:01:58.827 --> 00:01:59.702
Thank you for having me.