Vlad: So, Azeem, great to see you here in Davos. It's a little bit chilly.

Azeem: It's chilly, but it's wonderful to be here. Your company is warming me up.

Vlad: That's great to hear. But finally we meet in person. Tons of activity. You're my go-to AI guru.

Azeem: Thank you.

Vlad: What are you seeing out there in terms of '24? Any big surprises? Did anything change for you, comparing your incoming perceptions with what you're hearing as you look at '24?

Azeem: Well, the big surprise for me has been hearing directly from company bosses about how they're using AI, both at a dinner that you hosted and in the official annual meeting. I'm surprised at how far large companies are in deploying it, not just traditional AI but generative AI as well, and also at the sense that they've only just begun to do it. So that's big surprise number one. The second thing I was very pleased to see was that the World Economic Forum has been able to bring together representatives from all of the key foundation model makers and the key scientists, and get them in a room to start to talk about the practical, tangible issues of what comes next.

Vlad: Love it. And I love that the tech leaders are actually embracing that conversation. They're not shying away from it. Is that the same sentiment you feel?

Azeem: Yeah, I think they are definitely embracing it. We always have to be a bit careful with technology firms, because they always have their own agenda. Sometimes that agenda is a simple commercial agenda that you can look at, and sometimes it's also about some really fundamental beliefs. One of the points of distinction that is emerging is around the extent to which models should sit behind APIs, behind proprietary doors, versus the extent to which they should be open source and fine-tunable. What is better for industry? What is better for society at large? Because this is a powerful technology, and there is still, I think, enough creative disagreement about all those types of issues.
Vlad: OK, then that's your role: to keep us on our toes and make sure it's about the overall productivity of the workforce, etcetera. So how do you frame that conversation when you're talking about it? And is it real, or is it not real?

Azeem: Well, the technology is so general, and it can be applied in so many different ways. Its power is in its generality, and in a way, when you start to pull out specific use cases, you actually weaken the promise of generative AI. What I love to think about is the way my team has started to use this. We use it for desk research. We use it for analytical thinking. We use it for creative thinking. We use it to automate our marketing analysis. And where do you begin? Where do you start? When you talk to bosses of companies with 50,000, 100,000, 200,000 employees, they're in a similar situation, because every employee has got four or five different unique use cases that can come to bear. I'm quite optimistic that over the course of this year, where firms have been a little bit slower to roll things out so they can make sense of the technology, they will widen participation across their organisations. And then the question really will be: will those productivity gains result in real changes to the way business is done, or will it just create more clutter, the way e-mail has created more clutter in our inboxes?

Vlad: Interesting. But on that topic: what I love is that when you talk about AI, you're not just theorizing, right? You're actually doing it. Is it fair to say that you're using a lot of these tools in your day-to-day? You did not shrink your team, but it increased the productivity, and you could actually grow and scale faster?

Azeem: Yeah. So our team is small. We have a team of a handful of researchers, and when we started using ChatGPT, GPT-4, and Claude from Anthropic, we were actually able to delay two hires. And that's just because the team themselves is so much more productive.
The quality of their work is higher, they feel much more in control, and in some sense they also have more time on their hands. I mean, it's worked very, very well for us, but we are really canonical desk workers, knowledge workers who trade in ideas, and it happens that LLMs are really, really good for that. Of course, across a large business, very few people are normally involved in processes like that. But I saw a great example when I was in Germany just before coming into Davos, which was that a phone company had created a chatbot, a field service chatbot using an LLM, for technicians as they sat by streetside cabinets: what do we need to do with this yellow cable and the green cable and the red cable? It has actually sped up the work. It's meant that the field service agents are not sitting in the cold, waiting on a call centre, trying to get through to support. And that for me was a really great example.

Vlad: But can you apply that to a pharmaceutical company or an accounting firm?

Azeem: No, you have to have your own particular use case. It's very distinct.

Vlad: And it's not just productivity. It also allows people to have their jobs treated with respect, in a way, right? So that they are not sitting in the cold if they don't have to, etcetera.

Azeem: Yeah, I think it's a tremendously humanising technology. We in the UK are coming off the back of a tremendous IT scandal, the Post Office Horizon IT scandal, where a billing system rolled out across the post offices had bugs in it, and several hundred post office workers were wrongly sent to prison for so-called fraud. And the point is, when you look at the software, it was very, very dehumanised. It was the old-style screen with a terrible UI, easy to make mistakes. The beauty of generative AI is that we get to talk to this thing in our natural language. It talks back to us in our natural language. It's the most natural thing in the world, and we can express politeness to it and it can joke back to us.
A couple of weeks ago I finished a session with ChatGPT, where I was looking...

Vlad: A session where you were conversing with it?

Azeem: Oh yeah. I was doing some work on, you know, pharmaceutical technology transitions, and I got tired. At the end, I just whimsically said: you're a very clever little robot, thank you, and I'll come back tomorrow to talk about the moisture vaporators. "Moisture vaporators" is a phrase from the first Star Wars film; Luke Skywalker has to fix them. I had not mentioned Star Wars, and ChatGPT responded: thanks, we had a productive session, and may the Force be with you.

Vlad: Love it.

Azeem: It figured that out, and that is incredibly humanising. I can't think of any other software that would put a smile on my face in that way.

Vlad: Love it. But then how do I evolve that into the conversation on collective intelligence? Is that a first step towards it? How do you even think about collective intelligence?

Azeem: Yeah, I think this is a really important point. When science fiction writers and computer scientists were thinking about AI, they often imagined it as a singleton: there would be a single AI that would almost be, you know, like a god somewhere. But I think what we're increasingly seeing is that the way we'll move towards a society of AI is the same way that we've moved towards a society of databases. Our world runs on lots of databases that interact with each other, for better or for worse. And we are going to see a collective intelligence emerge, because every one of your clients, and BCG itself, will have hundreds of different AI systems, some of which are very deterministic and some of which are much more stochastic, like these LLMs. They'll be interacting with us, they'll be interacting with each other, and things will feel smarter. And from that will come this sense of collective intelligence.
It raises all sorts of really interesting questions about, you know, the governance of a distributed set of technologies whose behaviour is emergent, and questions of control and predictability as well, which I think we will start to figure out over the coming years. But my sense is that we're going to move to a syncretic collective intelligence. I call it the Society of AI.

Vlad: OK. So it's going to be a collection of these things that is going to give the perception of a collective intelligence. But it will be a lot of things, a lot of different things, all working together.

Azeem: And in fact, what I've just described is how human society works, right? We all have different capabilities, and then we sometimes cluster in groups that we call teams or work groups or companies, which also have their own identity at a slightly higher, more meta level. And I think those parallels are important, because they take away some of the anxiety people might feel about technologies like AI. They help us understand that we have actually got tools to make individual intelligences work together in healthy ways.

Vlad: Well, Azeem, thank you. Thank you for joining me today. And even more importantly, thank you for being that collective intelligence that is unpacking all of this for us continuously, keeping us on our toes, and also, you know, forcing our thinking and making sure we stay sharp. So thank you.

Azeem: Thank you so much, Vlad. My pleasure.