Interviewer: This is the end of the first day of Davos, and I think one of the key themes of Davos today is clearly AI. You and I were here last year, when ChatGPT had just emerged and we could see the power of AI. But this year the phase has really changed: I see a lot of executives trying to deploy the power of AI. How do you see the emergence of AI in 2023, and how do you see the overall trend?

Guest: I think it's really fascinating that generative AI has caught the interest of the general public, not just experts. One of the keys was ChatGPT, and even its hallucinations: people got excited about it and observed all kinds of interesting phenomena. But this did not happen all of a sudden; there is a long line of underlying research behind it. In 2012 the AI community got deep learning, then we had GANs, generative adversarial networks, and then generative AI. So there has been continuous improvement: roughly every five years we have a quantum leap in AI technology. And that is still going on; I think this is really the beginning. People say this is going to be the high point now; I rather doubt that. I think we are witnessing the very early stage of a new industrial revolution. In the industrial revolution we got power from the steam engine and the internal combustion engine, and there was a dramatic change in the way we do things, in society, mobility, manufacturing, and all that. Now we are getting the power of intelligence, and we are at the very beginning of that. At this moment I would say generative AI is still at the stage of the steam engine; the internal combustion engine is yet to arrive. In the next five years we will probably see a couple of major quantum leaps in the technology. That is what we are witnessing.
Interviewer: Very insightful. You are a long-time champion of AI, so it is great to hear that we are still at the beginning of the big wave. My next question is: how do you see the impact of AI, particularly in the area of creation and innovation? It is often said that AI has power, but usually we imagine that power applied to productivity gains, to efficiency and effectiveness. Sony, as a company with both technology and entertainment, is trying to deploy the power of AI in the world of creation. Could you elaborate on that part?

Guest: Sony is a company for creators. In R&D we say that we are here for creators, and that is our go/no-go decision criterion for what kind of technology we want to build: is the technology beneficial to creators? Does it make the world a better place? Does it make people more creative? That is really the essence of Sony's R&D. And generative AI is really interesting: you give it a prompt and you get images; you give it a prompt and you get all kinds of sentences and so on. So it helps us create things. The issue is whether it is going to be helpful for the creators who could benefit from it, and whether it is going to make people even more creative, or just generate some image and that's it. I think creators are starting to use generative AI in very creative ways. People worry about whether creativity will be lost because of generative AI. I rather doubt that. My message would be: trust human creativity.

Interviewer: Got it.

Guest: We are very creative, and I think we should place great trust in how creative we can be.

Interviewer: My last question is about responsible AI. Technology has both a positive side and a negative side; we feel the awe, and also the fear, with AI as well. We see the great benefits of increased productivity and creation, as you mentioned, but at the same time we worry about the elimination of jobs and other risks. How do we set up guidelines to make AI responsible?
Guest: There are a few different aspects to that. Since generative AI requires an abundance of data, one question is whether the data and the copyrights of the rights holders are protected, or whether something the system generates is too close to someone's artwork. That kind of copyright issue is really critical, and all kinds of legal action are being taken right now to resolve it. So that is one part.

A different aspect is who benefits from generative AI. Because this is a data-driven machine learning system, its performance, what we get out of it, depends on what kind of data we use. So the capability to access generative AI does not mean that someone actually benefits from it. If someone asks about issues in a specific region, and the system is not trained on data related to that region, you only get a lousy answer. With most software, if you get access to it, you get the same function anywhere in the world; with generative AI, that is not the case. So to make sure everyone benefits from generative AI, we need to be more transparent and we need a fair balance in the kind of data we use. That is the second aspect: making sure no one is left behind.

The third one is the employment issue and other risk issues. That is what we call responsible deployment. Anyone who builds an AI system needs to check whether it complies with specific standards, ethical standards, or some sort of legal guideline, and then deploy it. It is a little bit like the pharma industry, where you have preclinical studies, then phase one, phase two, and phase three clinical trials, and only then go to market. I am not sure whether we will have an FDA-like regulatory body, but I think we need guidelines, and any decent company should be very conscious about how it ensures the quality and safety of the AI systems it builds and deploys.

Interviewer: Thank you very much.