WEBVTT

1
00:00:00.000 --> 00:00:01.600
Hi Sesh, thanks for joining us.

2
00:00:01.600 --> 00:00:03.400
When it comes to GenAI,

3
00:00:03.640 --> 00:00:05.720
there are tech challenges that business leaders face.

4
00:00:05.720 --> 00:00:06.720
What are they?

5
00:00:07.440 --> 00:00:09.320
I see many tech challenges.

6
00:00:09.320 --> 00:00:10.600
Let me just start with three of them.

7
00:00:10.600 --> 00:00:12.360
And there are others.

8
00:00:12.720 --> 00:00:15.680
I think the first is just data quality and availability.

9
00:00:15.680 --> 00:00:16.240
A lot of GenAI

10
00:00:16.240 --> 00:00:19.720
is driven by underlying data quality, availability, how you

11
00:00:19.720 --> 00:00:23.320
actually go to the right sources of data, how you develop the

12
00:00:23.320 --> 00:00:26.200
right pipelines, how you implement the right data

13
00:00:26.200 --> 00:00:26.840
governance.

14
00:00:27.440 --> 00:00:30.040
These are things that have to be put in place to be able to

15
00:00:30.040 --> 00:00:30.720
extract insight.

16
00:00:30.720 --> 00:00:33.160
I think that's one, and that's a real area of focus.

17
00:00:33.920 --> 00:00:38.200
I think a second thing is there are so many different failure

18
00:00:38.200 --> 00:00:39.840
modes when you implement

19
00:00:39.840 --> 00:00:41.880
GenAI solutions.

20
00:00:42.440 --> 00:00:46.640
Hallucinations, prompt injection and jailbreaking, degeneration

21
00:00:46.640 --> 00:00:50.200
of models, catastrophic forgetting of models, privacy

22
00:00:50.200 --> 00:00:54.320
violations, these are all things that have to be addressed.

23
00:00:54.320 --> 00:01:00.600
LLM bias, fair use of LLMs, all of these start to become

24
00:01:00.600 --> 00:01:02.800
challenges that you have to overcome when you're deploying

25
00:01:02.800 --> 00:01:04.160
at-scale solutions.

26
00:01:04.160 --> 00:01:05.480
That's the second thing that I see.
27
00:01:06.000 --> 00:01:09.640
The third thing is just human capital and scarcity of talent.

28
00:01:10.440 --> 00:01:13.840
We are clearly seeing most organizations having to upskill

29
00:01:13.840 --> 00:01:17.440
or reskill their people to be able to put these people on

30
00:01:17.440 --> 00:01:18.720
GenAI work.

31
00:01:18.720 --> 00:01:25.360
Based on the survey that we did recently, 46% of people in

32
00:01:25.360 --> 00:01:27.320
these organizations on average need to be upskilled over the

33
00:01:27.320 --> 00:01:28.600
next three years.

34
00:01:29.280 --> 00:01:32.800
Then there is of course a broader set of challenges, like

35
00:01:32.800 --> 00:01:35.600
legacy integrations; we talked about security.

36
00:01:36.600 --> 00:01:39.880
There is a lot that needs to be done around the explainability

37
00:01:39.880 --> 00:01:43.080
or interpretability of the results that these models give you.

38
00:01:43.560 --> 00:01:46.240
So there are a host of other things that need to be overcome.

39
00:01:46.440 --> 00:01:49.160
But the nice thing is we are starting to find solutions.

40
00:01:49.160 --> 00:01:53.440
We're able to actually address these issues and get solutions

41
00:01:53.440 --> 00:01:54.040
to scale.

42
00:01:54.320 --> 00:01:57.080
And are there opportunities, tech-driven opportunities, that

43
00:01:57.080 --> 00:01:59.120
business leaders are overlooking, I imagine?

44
00:02:00.840 --> 00:02:05.520
We did a recent survey of 1,400 executives.

45
00:02:06.480 --> 00:02:09.400
The big thing that came out of that is that only one in ten

46
00:02:09.400 --> 00:02:11.160
executives are actually pioneers.

47
00:02:11.440 --> 00:02:14.560
Nine out of ten are actually waiting and watching, which goes

48
00:02:14.560 --> 00:02:16.160
to the first question that you asked.

49
00:02:17.040 --> 00:02:19.800
And maybe the way I'll answer the question is, what are the things

50
00:02:19.800 --> 00:02:21.240
that the pioneers are doing well?
51
00:02:21.240 --> 00:02:23.960
The first thing is they're massively ambitious.

52
00:02:24.200 --> 00:02:26.080
So when they think about investing in GenAI

53
00:02:26.080 --> 00:02:31.720
for top line and productivity, their ambition is 1.3 to 1.4

54
00:02:31.720 --> 00:02:38.160
times higher than what we see in folks who are not ready to invest.

55
00:02:38.880 --> 00:02:41.400
I think the second is around upskilling their talent.

56
00:02:43.240 --> 00:02:47.040
There are close to 60% of these executives who

57
00:02:47.040 --> 00:02:50.400
think that their senior leadership is not ready to

58
00:02:50.400 --> 00:02:51.640
actually capitalize

59
00:02:51.640 --> 00:02:52.440
on GenAI solutions.

60
00:02:53.160 --> 00:02:54.920
So upskilling is a massive opportunity.

61
00:02:55.560 --> 00:02:58.720
A third one is really understanding the cost base of

62
00:02:58.720 --> 00:02:59.360
these GenAI

63
00:02:59.360 --> 00:03:02.200
solutions: what the cost is to actually implement the

64
00:03:02.200 --> 00:03:05.000
solutions, direct costs and indirect costs, and really

65
00:03:05.000 --> 00:03:08.680
understanding the detail of what these costs are going to be over time.

66
00:03:08.720 --> 00:03:09.680
That's the third one.

67
00:03:10.280 --> 00:03:13.760
I think the fourth is around partnerships and ecosystems.

68
00:03:14.480 --> 00:03:17.600
Many of these pioneers are investing in really deep

69
00:03:17.600 --> 00:03:20.280
partnerships, building out their ecosystems.

70
00:03:20.760 --> 00:03:23.080
And then the last one I would say is when you develop these

71
00:03:23.280 --> 00:03:26.200
GenAI solutions, you have to do it with responsible AI principles.

72
00:03:26.840 --> 00:03:31.400
We still see only about 17% of organizations that have

73
00:03:31.400 --> 00:03:34.120
responsible AI on the CEO's agenda.

74
00:03:34.480 --> 00:03:37.240
So these are the things that the winners are doing.
75
00:03:37.240 --> 00:03:41.080
I think the observers need to learn from this and start to

76
00:03:41.080 --> 00:03:43.320
really focus on these five things.

77
00:03:43.440 --> 00:03:44.840
It's a really overwhelming issue.

78
00:03:45.320 --> 00:03:46.880
Sesh, thank you for breaking it down for us.

79
00:03:47.080 --> 00:03:48.040
Thank you for having me.

80
00:03:48.040 --> 00:03:48.600
This is great.

81
00:03:49.040 --> 00:03:49.720
Good talking to you.