WEBVTT

00:00:00.480 --> 00:00:01.280
Benji, welcome.

00:00:01.840 --> 00:00:05.120
Benji, are telcos ready for the next generation of technologies?

00:00:06.000 --> 00:00:07.080
I believe we're over-ready.

00:00:07.080 --> 00:00:08.480
We're overcapitalized.

00:00:09.160 --> 00:00:13.520
I also see this era of AI. I mean everything and anything today has

00:00:13.520 --> 00:00:14.440
been about AI.

00:00:14.880 --> 00:00:19.000
And what I find quite interesting is that it's almost a repeat of the past.

00:00:19.000 --> 00:00:23.520
So before we had the hyperscalers, had the OTTs, we had WhatsApp.

00:00:23.800 --> 00:00:27.080
And it feels a little bit of a repeat of the past that we underestimate how ready

00:00:27.080 --> 00:00:30.600
we are for the future. And we should stay humble but at the same

00:00:30.600 --> 00:00:32.280
time actually take the lead on this.

00:00:32.280 --> 00:00:34.520
So there's AI for telco and telco for AI.

00:00:34.880 --> 00:00:38.280
So I think we are ready, and we need to get ahead of the curve.

00:00:38.280 --> 00:00:41.640
And more concerning is a little bit on the geopolitical side,

00:00:41.760 --> 00:00:45.040
so that I would say is where we might be a little bit less prepared.

00:00:45.680 --> 00:00:47.680
There's a shift there to be sure.

00:00:48.560 --> 00:00:53.840
What is the smartest way to navigate that shift though--to being ready to really

00:00:53.840 --> 00:00:55.360
fully taking advantage?

00:00:55.760 --> 00:01:01.120
I believe it's also about being humane in the way that we approach the future.

00:01:01.480 --> 00:01:03.120
It is just data, right?

00:01:03.120 --> 00:01:06.320
At the end of the day, without data, without networks,

00:01:06.320 --> 00:01:11.040
there is no proliferation of future digital dexterity.

00:01:11.040 --> 00:01:12.440
And there, keeping it humane,

00:01:12.440 --> 00:01:14.160
I think is going to be really important.

00:01:14.160 --> 00:01:20.280
So how far do you also embed in your DNA as a company the ethics that comes with

00:01:20.280 --> 00:01:23.600
the next future? How your leaders embrace it?

00:01:23.600 --> 00:01:27.160
How your leaders also become vulnerable when they don't understand something?

00:01:27.160 --> 00:01:33.440
How far they augment themselves with an agentic AI partner next to them?

00:01:33.440 --> 00:01:36.080
And also how they challenge your employees?

00:01:36.080 --> 00:01:40.280
How I also see my teams right now, and I've worked with some of the most

00:01:40.280 --> 00:01:44.800
intelligent and smart people possible, I believe.

00:01:44.800 --> 00:01:48.120
And it's also because I do the selection, of course, of who I hire, of course.

00:01:48.120 --> 00:01:53.240
Of course. Being ex-BCG, I've been trained well,

00:01:53.240 --> 00:01:55.400
but I believe there it's really, really important also how you challenge

00:01:55.400 --> 00:01:59.440
your team members to say, "Hey, I hear you telling me information,

00:01:59.440 --> 00:02:01.480
and you're giving me great trains,

00:02:01.480 --> 00:02:05.320
but are you just repeating what something . . . what a machine told you,

00:02:05.320 --> 00:02:09.000
or are you actually digesting it and actually giving me your advice?

00:02:09.000 --> 00:02:11.040
How are you augmenting it?"

00:02:11.040 --> 00:02:13.320
So that's going to be the interesting part, I think.

00:02:13.680 --> 00:02:15.080
Talking of advice,

00:02:15.080 --> 00:02:19.040
what one piece of advice would you give to telco leaders for 2025?

00:02:19.160 --> 00:02:20.920
Emotion is not a bad thing.

00:02:22.200 --> 00:02:24.120
Emotion is what keeps us human.

00:02:24.600 --> 00:02:27.360
Emotion is what differentiates us from machines.

00:02:27.720 --> 00:02:29.800
And emotion is what makes a leader great.

00:02:30.080 --> 00:02:33.960
So that empathy, that emotion--keep it protected,

00:02:34.160 --> 00:02:35.480
and then I think we'll be good.

00:02:35.920 --> 00:02:37.720
Benji, thank you so much for your time, a pleasure.

00:02:37.960 --> 00:02:38.120
Yeah,

00:02:38.480 --> 00:02:38.880
thank you.