WEBVTT

00:00:00.240 --> 00:00:01.680
Sophia, Karen, welcome.

00:00:02.000 --> 00:00:02.480
Sophia,

00:00:03.120 --> 00:00:06.400
we've heard so much about agentic AI but...

00:00:06.400 --> 00:00:07.920
I mean, first of all, what is it?

00:00:07.920 --> 00:00:11.040
And second, what is perhaps the biggest misconception around it?

00:00:11.320 --> 00:00:12.600
Yeah, that's a great question.

00:00:12.640 --> 00:00:15.560
And, in fact, we just wrote a report about it.

00:00:15.560 --> 00:00:19.800
It's called "Agentic AI, Finance, and the Do-It-for-Me Economy."

00:00:20.000 --> 00:00:23.960
So in terms of what it is, we have had AI in terms of machine learning.

00:00:24.360 --> 00:00:27.600
The quickest way to describe that is: analyze-it-for-me technology.

00:00:28.240 --> 00:00:29.120
And then we had GenAI,

00:00:29.120 --> 00:00:31.760
which is create-it-for-me technology.

00:00:32.000 --> 00:00:35.920
Then we have agentic AI, which is do-it-for-me technology,

00:00:35.920 --> 00:00:37.000
action it for me.

00:00:37.120 --> 00:00:39.920
What we say in the report is that it turbocharges

00:00:39.920 --> 00:00:43.240
the do-it-for-me economy by taking action.

00:00:43.440 --> 00:00:44.720
And this is the misconception.

00:00:44.720 --> 00:00:47.360
The misconception is that it is fully autonomous.

00:00:47.720 --> 00:00:48.520
It is not.

00:00:48.760 --> 00:00:51.840
Of course, it operates with a degree of autonomy,

00:00:51.840 --> 00:00:56.040
but you're always going to need humans for input, for programming,

00:00:56.040 --> 00:00:57.160
and for oversight.

00:00:58.240 --> 00:01:00.960
Karen, what are perhaps the pitfalls that

00:01:00.960 --> 00:01:05.000
companies need to look out for when implementing agentic AI?

00:01:05.560 --> 00:01:09.720
Well, first, exactly as Sophia was saying, because there is a need for some human

00:01:09.720 --> 00:01:10.840
control at some point,

00:01:10.840 --> 00:01:12.040
the question is about the balance.

00:01:12.040 --> 00:01:16.120
How do you find the right balance between the autonomy that you want

00:01:16.120 --> 00:01:17.680
to give to the machine,

00:01:17.680 --> 00:01:20.560
versus what type of human control do you want to have?

00:01:20.920 --> 00:01:25.240
And that applies very differently depending on what type of agent you're talking about.

00:01:25.240 --> 00:01:29.000
I like talking about the internal use cases and then the consumer-facing use

00:01:29.000 --> 00:01:32.880
cases because they're very different. When it comes to internal use cases,

00:01:32.880 --> 00:01:35.840
actually, this is already starting, and this is where we're seeing a lot of

00:01:35.840 --> 00:01:37.600
companies actually learning to do it.

00:01:37.600 --> 00:01:41.480
For example, to write code for their software.

00:01:41.480 --> 00:01:44.360
And this is really where, if you have the right supervision by a

00:01:44.360 --> 00:01:46.640
human, it actually works already quite well.

00:01:46.640 --> 00:01:50.880
But when you think about the consumer-facing use cases, like, for example,

00:01:50.880 --> 00:01:56.400
having a customer being able to delegate decisions on their purchase,

00:01:56.400 --> 00:01:58.840
for example, on the customer journey, this is very new.

00:01:58.840 --> 00:02:03.080
And this is really where there will be a lot of pitfalls that most companies

00:02:03.080 --> 00:02:05.840
haven't encountered yet because that's very new.
00:02:05.840 --> 00:02:08.640
But the question will be somehow, you know,

00:02:08.640 --> 00:02:15.720
how much delegation do consumers want to give to their AI agent?

00:02:15.720 --> 00:02:17.240
So that's going to be very interesting.

00:02:17.480 --> 00:02:22.600
So what piece of advice would you give companies that are looking to capture

00:02:22.600 --> 00:02:23.680
value from this?

00:02:24.240 --> 00:02:27.160
Well, I think companies need to be really

00:02:27.160 --> 00:02:32.920
thoughtful in terms of how they present agentic AI and the value that it creates

00:02:32.920 --> 00:02:34.280
to their employees.

00:02:34.480 --> 00:02:38.520
Because to my earlier point, we said we always need to have the human

00:02:38.520 --> 00:02:42.840
in the loop or above the loop, and you need to bring the humans with you.

00:02:43.160 --> 00:02:46.280
Now, a narrative that is framed around:

00:02:46.280 --> 00:02:51.160
you can use this as a tool to better serve your clients,

00:02:51.160 --> 00:02:55.320
to be the best for your clients, or to eliminate these workflow processes

00:02:55.320 --> 00:02:57.640
that cause bottlenecks.

00:02:57.640 --> 00:03:02.520
A narrative like that is likely to go down well and create a positive culture

00:03:02.520 --> 00:03:07.520
and bring employees with you from the beginning when you're thinking about the

00:03:07.520 --> 00:03:11.800
agentic AI use case, and not have your employees come in at the end.

00:03:12.040 --> 00:03:12.480
Yes.

00:03:12.520 --> 00:03:17.320
And maybe if I can add to this, because I think this idea of how do you

00:03:17.320 --> 00:03:23.600
convince your employees that AI is going to increase joy in your work and reduce toil, right,

00:03:23.600 --> 00:03:27.120
I think that's somehow the vision that needs to be created.

00:03:27.120 --> 00:03:30.200
But that's difficult because many people are going to be very worried that,

00:03:30.200 --> 00:03:31.960
of course, there will be some job changes.

00:03:31.960 --> 00:03:35.600
There will be some questions on, you know, what are the jobs that will not exist any

00:03:35.600 --> 00:03:38.600
longer, and what are the new ones that you're going to create?

00:03:38.600 --> 00:03:42.400
And this question about how do you use AI to focus,

00:03:42.400 --> 00:03:46.840
to have it focus on what the employees don't like doing,

00:03:46.840 --> 00:03:50.440
so that the employees can really do what they want, what they like doing,

00:03:50.440 --> 00:03:52.360
and how they create value in the organization.

00:03:52.360 --> 00:03:55.120
I think that's part of the transformation that needs to happen.

00:03:55.360 --> 00:03:57.280
Karen, Sophia, thank you so much for your time.

00:03:57.680 --> 00:03:58.120
Thank you.