Interviewer: Thomas, welcome. Can you give me some examples of digital, AI, GenAI, that you think have the most use in health care?

Thomas: Broadly in health care, I'm convinced that AI will help us make better decisions. In the not-too-distant future, we will see the life of physicians change dramatically because of GenAI. We'll see validated GenAI assisting physicians and, I suspect before long, becoming more directly involved in decision-making and diagnostics, leaving the human interaction to the doctors.

That's one way. In the pharmaceutical industry, we are already seeing AI drive enormous acceleration in the processes, from a productivity point of view, but also from a scientific-insight and quality-by-design point of view. We already deploy AI broadly in our company, but I expect that to accelerate significantly in the next 6, 8, 12, 18 months. And that's just in R&D; we'll see the same in production and in other parts of the industry.

Interviewer: Where are you seeing the most success? Are there themes?

Thomas: It really depends on how success is defined. You can say we already see it in report writing: we no longer have human beings writing the study reports of our clinical trials; it's all done by AI. That's an example of very successful implementation and scaling, and we're expanding it into other parts of what we call the regulatory dossier.

Another example is in deployment and code generation: we will gradually see no need to start coding manually yourself. Image analysis is another very successful area; we know that AI is better at looking at images than radiologists, quite frankly, in many cases. And it makes adjudication processes much faster: in very large clinical trials, if you have endpoints that need to be adjudicated by physicians, we are building AI algorithms that can do that. And the list goes on. That's just in development.
In research, you see it in molecule design, target discovery, protein-folding predictions, which are very complex, and then simulations of clinical trials and simulations of processes. And then there are lots of other process-automation areas. So the list is really long, and success is defined in many ways: scientific insight is success, increased productivity is success, reduction of time spent is success. So I wouldn't pick one area.

Interviewer: We've been speaking about digital and AI, but what other trends and innovations do you see really shaping the year ahead?

Thomas: From a digital or AI point of view, it's the enormous development in generative AI that we keep seeing. Half a year ago, I would have thought we were plateauing a bit, but I can say that's not going to happen. We now see the GenAI models developing very rapidly into areas where you start seeing something closer to real intelligence: reasoning models that now really start behaving very differently. The speed of that is just amazing.

It's also a challenge for us in the regulated area: how do we keep the balance between the pace of model development and being able to validate it and say it is good enough, also vis-à-vis regulators and legislation?

Interviewer: How do you strike that balance?

Thomas: First and foremost, we strike that balance by having built a very good AI governance set-up. We do not just deploy everything into production; for experimentation, that is good enough. But if something goes into production, it has to go through our governance, which includes a lot of technological checks and balances, but also ethical and legal checks and balances. That's one way of doing it. But I don't have the full answer yet, because there isn't a manual for how to balance that.

Interviewer: Thomas, thank you so much.

Thomas: An absolute pleasure.

Interviewer: Likewise. Thank you.