WEBVTT 1 00:00:00.000 --> 00:00:02.200 2 00:00:02.200 --> 00:00:04.850 SAM RANSBOTHAM: How is AI helping one beauty company spot 3 00:00:04.850 --> 00:00:06.050 key trends? 4 00:00:06.050 --> 00:00:09.020 Find out on today's episode. 5 00:00:09.020 --> 00:00:12.000 Stéphane Lannuzel: I'm Stéphane Lannuzel from L'Oréal, and you 6 00:00:12.000 --> 00:00:15.550 are listening to Me, Myself, and AI. 7 00:00:15.550 --> 00:00:18.860 SAM RANSBOTHAM: Welcome to Me, Myself, and AI, 8 00:00:18.860 --> 00:00:21.870 a podcast on artificial intelligence in business. 9 00:00:21.870 --> 00:00:25.650 Each episode, we introduce you to someone innovating with AI. 10 00:00:25.650 --> 00:00:29.930 I'm Sam Ransbotham, professor of analytics at Boston College. 11 00:00:29.930 --> 00:00:33.470 I'm also the AI and business strategy guest editor 12 00:00:33.470 --> 00:00:35.140 at MIT Sloan Management Review. 13 00:00:35.140 --> 00:00:37.340 SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, 14 00:00:37.340 --> 00:00:41.360 senior partner with BCG, and I colead BCG's AI practice 15 00:00:41.360 --> 00:00:42.360 in North America. 16 00:00:42.360 --> 00:00:46.850 Together, MIT SMR and BCG have been researching and publishing 17 00:00:46.850 --> 00:00:49.780 on AI for six years, interviewing hundreds 18 00:00:49.780 --> 00:00:51.900 of practitioners and surveying thousands 19 00:00:51.900 --> 00:00:55.390 of companies on what it takes to build and to deploy and scale 20 00:00:55.390 --> 00:00:57.860 AI capabilities and really transform 21 00:00:57.860 --> 00:00:59.180 the way organizations operate. 22 00:00:59.180 --> 00:01:04.160 SAM RANSBOTHAM: Today, Shervin and I are talking with Stéphane 23 00:01:04.160 --> 00:01:07.390 Lannuzel, Beauty Tech program director at L'Oréal. 24 00:01:07.390 --> 00:01:08.770 Stéphane, thanks for joining us. 25 00:01:08.770 --> 00:01:09.270 Welcome. 26 00:01:09.270 --> 00:01:09.933 27 00:01:09.933 --> 00:01:11.350 SHERVIN KHODABANDEH: Hi, Stéphane. 28 00:01:11.350 --> 00:01:11.920 Stéphane Lannuzel: Hello. 29 00:01:11.920 --> 00:01:13.837 I'm really happy to be talking with you today. 30 00:01:13.837 --> 00:01:14.470 31 00:01:14.470 --> 00:01:15.970 SAM RANSBOTHAM: Let's get started. 32 00:01:15.970 --> 00:01:17.250 Stéphane, you're at L'Oréal. 33 00:01:17.250 --> 00:01:19.220 Can you tell us about your current role? 34 00:01:19.220 --> 00:01:22.090 What does a Beauty Tech program director do? 35 00:01:22.090 --> 00:01:24.570 Stéphane Lannuzel: I'm in charge of the beauty-tech 36 00:01:24.570 --> 00:01:27.310 transformation, which is a global transformation in all 37 00:01:27.310 --> 00:01:29.720 functions and in all geographies. 38 00:01:29.720 --> 00:01:33.760 And that adventure in beauty tech for me started in 2019 39 00:01:33.760 --> 00:01:37.190 with the vision of our CEO -- at that time, 40 00:01:37.190 --> 00:01:41.820 that was Jean-Paul Agon -- who was really visionary 41 00:01:41.820 --> 00:01:45.660 in seeing that tech would disrupt the beauty industry. 42 00:01:45.660 --> 00:01:51.130 And he really listened and talked with all the CEOs 43 00:01:51.130 --> 00:01:53.590 of the big tech companies, asking them the question 44 00:01:53.590 --> 00:01:57.420 of what tech would do to beauty -- how tech would impact beauty. 45 00:01:57.420 --> 00:01:59.970 And that's how we started on that journey, 46 00:01:59.970 --> 00:02:02.790 saying that we want to be the champion, the leader of beauty 47 00:02:02.790 --> 00:02:03.740 tech, to be the No.
48 00:02:03.740 --> 00:02:06.680 1, because at L'Oréal, we are the leader of the beauty 49 00:02:06.680 --> 00:02:09.360 industry, so we decided that we wanted to be the leader 50 00:02:09.360 --> 00:02:10.310 of beauty tech. 51 00:02:10.310 --> 00:02:13.510 And I started with that mission, being the Beauty Tech program 52 00:02:13.510 --> 00:02:16.420 director, with only these two words: beauty and tech. 53 00:02:16.420 --> 00:02:19.440 And basically, the simple motto of Beauty Tech 54 00:02:19.440 --> 00:02:23.160 is to invent the beauty of the future while transforming 55 00:02:23.160 --> 00:02:24.580 into a company of the future. 56 00:02:24.580 --> 00:02:27.950 So what I'm doing every day is inventing the beauty 57 00:02:27.950 --> 00:02:30.777 of the future and transforming L'Oréal into a company 58 00:02:30.777 --> 00:02:31.360 of the future. 59 00:02:31.360 --> 00:02:33.100 SAM RANSBOTHAM: What did you learn? 60 00:02:33.100 --> 00:02:34.660 How will technology affect beauty? 61 00:02:34.660 --> 00:02:36.275 What's going to be the big change? 62 00:02:36.275 --> 00:02:38.150 Stéphane Lannuzel: There are two big impacts. 63 00:02:38.150 --> 00:02:42.110 There is the development of services for our consumers, 64 00:02:42.110 --> 00:02:45.090 but there is also, "How can we leverage technology 65 00:02:45.090 --> 00:02:48.220 to make the life of our employees easier 66 00:02:48.220 --> 00:02:51.930 and make them faster and more creative and more nimble?" 67 00:02:51.930 --> 00:02:53.550 Let's start first with consumers. 68 00:02:53.550 --> 00:02:57.070 L'Oréal has been around for more than 110 years, 69 00:02:57.070 --> 00:03:00.890 and we produce 7 billion physical cosmetic products 70 00:03:00.890 --> 00:03:04.090 every year, but we are more and more into services, 71 00:03:04.090 --> 00:03:07.940 and basically tech is playing a huge part in developing beauty 72 00:03:07.940 --> 00:03:08.670 services. 73 00:03:08.670 --> 00:03:12.190 We have been working a lot on helping consumers, 74 00:03:12.190 --> 00:03:14.420 through services and through technology, 75 00:03:14.420 --> 00:03:17.190 to be able to find the right product for them. 76 00:03:17.190 --> 00:03:20.290 [For] example, we have developed a solution 77 00:03:20.290 --> 00:03:23.010 using AI and computer vision and augmented 78 00:03:23.010 --> 00:03:27.990 reality to be able to do virtual try-ons of makeup. 79 00:03:27.990 --> 00:03:30.700 Another example is a skin diagnostic 80 00:03:30.700 --> 00:03:32.610 that makes recommendations about 81 00:03:32.610 --> 00:03:36.010 your top concerns and the products 82 00:03:36.010 --> 00:03:38.410 that you should apply in your daily routine. 83 00:03:38.410 --> 00:03:43.370 So this is how technology is really transforming beauty, 84 00:03:43.370 --> 00:03:46.730 because we want to develop a beauty that is more and more 85 00:03:46.730 --> 00:03:50.192 inclusive -- a beauty that is more and more personalized. 86 00:03:50.192 --> 00:03:51.470 87 00:03:51.470 --> 00:03:53.760 SAM RANSBOTHAM: You've got a varied background 88 00:03:53.760 --> 00:03:57.915 in banking and consulting and luxury and consumer goods.
89 00:03:57.915 --> 00:03:59.290 Can you tell us a little bit more 90 00:03:59.290 --> 00:04:01.710 about what got you to your current role 91 00:04:01.710 --> 00:04:03.570 and, in particular, what got you interested 92 00:04:03.570 --> 00:04:06.055 in artificial intelligence and applying these technologies 93 00:04:06.055 --> 00:04:06.930 in your current role? 94 00:04:06.930 --> 00:04:11.310 Stéphane Lannuzel: I've been in the beauty industry for 15 95 00:04:11.310 --> 00:04:15.870 years now, so after consulting, I worked for many luxury 96 00:04:15.870 --> 00:04:17.700 or cosmetic goods companies. 97 00:04:17.700 --> 00:04:20.822 I decided to really be part of that industry, 98 00:04:20.822 --> 00:04:22.280 and I joined the cosmetic industry, 99 00:04:22.280 --> 00:04:25.470 first in Shiseido and then in L'Oréal. 100 00:04:25.470 --> 00:04:28.280 Technology was not that prominent 101 00:04:28.280 --> 00:04:30.800 when I graduated a long time ago, 102 00:04:30.800 --> 00:04:33.370 but what I've done throughout my career 103 00:04:33.370 --> 00:04:40.500 is really look at how we can make the organization evolve 104 00:04:40.500 --> 00:04:43.390 to be able to cope with new trends. 105 00:04:43.390 --> 00:04:46.490 And obviously, people play a major part in that -- 106 00:04:46.490 --> 00:04:49.940 the way you add new skills in your organization, 107 00:04:49.940 --> 00:04:52.260 the way you structure the organization. 108 00:04:52.260 --> 00:04:55.380 And more and more, technology is also playing a key role. 109 00:04:55.380 --> 00:04:58.790 I've always been very curious about how technology 110 00:04:58.790 --> 00:05:02.360 was evolving, trying to see in my industry, 111 00:05:02.360 --> 00:05:06.040 in the beauty industry, what could be the impact: What can I 112 00:05:06.040 --> 00:05:06.700 leverage? 113 00:05:06.700 --> 00:05:10.380 I think that's why I'm the head of Beauty Tech: 114 00:05:10.380 --> 00:05:14.410 because I'm really into being curious about technology, 115 00:05:14.410 --> 00:05:18.120 being a strong believer in transformation, 116 00:05:18.120 --> 00:05:23.430 and being a strong believer that organizations and people can 117 00:05:23.430 --> 00:05:27.710 adapt and can be even better if you give them the incentive 118 00:05:27.710 --> 00:05:30.970 and you upskill them in the use of technologies. 119 00:05:30.970 --> 00:05:34.150 I'm a very optimistic person, and I've always 120 00:05:34.150 --> 00:05:37.340 seen tech as a force for good -- for the people, 121 00:05:37.340 --> 00:05:40.920 for the planet -- and I'm really very curious 122 00:05:40.920 --> 00:05:42.980 about embracing new technology. 123 00:05:42.980 --> 00:05:44.980 SHERVIN KHODABANDEH: You mentioned upskilling. 124 00:05:44.980 --> 00:05:47.840 Do people at L'Oréal need to know anything about artificial 125 00:05:47.840 --> 00:05:49.140 intelligence? 126 00:05:49.140 --> 00:05:50.890 Stéphane Lannuzel: Yes, they need to know. 127 00:05:50.890 --> 00:05:53.950 And not only the data scientists or the few people 128 00:05:53.950 --> 00:05:57.650 that are really practitioners of artificial intelligence, 129 00:05:57.650 --> 00:05:58.510 but everybody. 130 00:05:58.510 --> 00:06:00.860 And here we've created the tech and data university 131 00:06:00.860 --> 00:06:04.680 for L'Oréal that is targeting [our] 88,000 employees.
132 00:06:04.680 --> 00:06:06.470 Obviously, we have different programs -- 133 00:06:06.470 --> 00:06:09.920 some programs [that are] more [about] acculturation, 134 00:06:09.920 --> 00:06:13.010 and some programs that are really hard-core. 135 00:06:13.010 --> 00:06:16.860 And we do get requests from general management: 136 00:06:16.860 --> 00:06:20.160 "Tell me about artificial intelligence 137 00:06:20.160 --> 00:06:21.900 and what I need to know." 138 00:06:21.900 --> 00:06:24.660 What I am saying is that we don't need everybody 139 00:06:24.660 --> 00:06:29.470 at L'Oréal to know how to code in Python or to select 140 00:06:29.470 --> 00:06:31.380 the right hyperparameters of models. 141 00:06:31.380 --> 00:06:33.580 But what they need to understand is 142 00:06:33.580 --> 00:06:36.510 what we can do with artificial intelligence, what 143 00:06:36.510 --> 00:06:40.790 we can't do, what we can expect, what we can't expect, in order 144 00:06:40.790 --> 00:06:44.190 to deal with these solutions that are making 145 00:06:44.190 --> 00:06:47.020 some recommendations; [and also] how we should use them, 146 00:06:47.020 --> 00:06:48.350 and what are the limitations. 147 00:06:48.350 --> 00:06:50.612 And I think every manager needs to develop 148 00:06:50.612 --> 00:06:51.570 some knowledge on that. 149 00:06:51.570 --> 00:06:53.660 And I'll give you one example. 150 00:06:53.660 --> 00:06:57.950 We are developing some solutions to help the people in the labs 151 00:06:57.950 --> 00:07:02.520 to do the formulation of all our products -- more precisely, 152 00:07:02.520 --> 00:07:06.560 to help them to predict the performance of the formula when 153 00:07:06.560 --> 00:07:10.280 you change one or a few ingredients, 154 00:07:10.280 --> 00:07:13.770 so that they don't have to formulate in the real world, 155 00:07:13.770 --> 00:07:15.660 make the test, and get the results. 156 00:07:15.660 --> 00:07:19.100 They can do that digitally using the algorithms. 157 00:07:19.100 --> 00:07:22.370 And the type of reaction that you get when you start working 158 00:07:22.370 --> 00:07:24.990 on that -- there are people that are saying, 159 00:07:24.990 --> 00:07:29.440 "I'm not going to help you train or validate the models 160 00:07:29.440 --> 00:07:34.223 on a solution that will probably impact my job in the future." 161 00:07:34.223 --> 00:07:35.640 And there are people that are also 162 00:07:35.640 --> 00:07:38.430 saying, "This is not working. 163 00:07:38.430 --> 00:07:41.180 I found one case where it's wrong. 164 00:07:41.180 --> 00:07:44.170 Even if it's right in 99% of the cases, 165 00:07:44.170 --> 00:07:47.400 I'm not going to use it, because I don't trust the system." 166 00:07:47.400 --> 00:07:49.920 And these obviously are two extreme cases, 167 00:07:49.920 --> 00:07:52.215 but that's where the middle management 168 00:07:52.215 --> 00:07:54.090 and the general management have a strong role 169 00:07:54.090 --> 00:07:58.110 to play to help with the democratization 170 00:07:58.110 --> 00:08:00.953 and to help people have the right interaction 171 00:08:00.953 --> 00:08:02.370 with these solutions that are part 172 00:08:02.370 --> 00:08:04.527 of the artificial intelligence that we develop.
173 00:08:04.527 --> 00:08:07.110 SAM RANSBOTHAM: In some of the research that Shervin and I are 174 00:08:07.110 --> 00:08:09.382 currently working on, this idea -- 175 00:08:09.382 --> 00:08:11.590 and what you're describing is how you would work with 176 00:08:11.590 --> 00:08:15.340 a coworker, not really how you'd work with a technology -- 177 00:08:15.340 --> 00:08:17.460 I think a theme that's starting to emerge 178 00:08:17.460 --> 00:08:20.080 is that, just as it's true that I've 179 00:08:20.080 --> 00:08:21.540 made a mistake or two in my past (I 180 00:08:21.540 --> 00:08:23.290 know that's hard for everyone to believe), 181 00:08:23.290 --> 00:08:25.430 I'm glad that my colleagues didn't immediately 182 00:08:25.430 --> 00:08:28.050 throw me out and say, "Oh, you're useless. 183 00:08:28.050 --> 00:08:28.950 You're pointless. 184 00:08:28.950 --> 00:08:30.600 Why would I ever work with you again?" 185 00:08:30.600 --> 00:08:33.830 And what you just said there was that same sort of perspective 186 00:08:33.830 --> 00:08:34.530 that we have. 187 00:08:34.530 --> 00:08:36.830 We have an expectation of a technology 188 00:08:36.830 --> 00:08:39.027 that we don't necessarily have of people. 189 00:08:39.027 --> 00:08:40.610 It seems like people are shifting more 190 00:08:40.610 --> 00:08:44.049 to think about a team being composed of humans 191 00:08:44.049 --> 00:08:45.750 and of machines. 192 00:08:45.750 --> 00:08:47.240 Stéphane Lannuzel: I fully agree. 193 00:08:47.240 --> 00:08:50.910 And I can tell you that in all the different solutions that we 194 00:08:50.910 --> 00:08:55.830 have developed with AI in them, I've always underestimated 195 00:08:55.830 --> 00:09:00.130 that aspect: People really still see it as a technology 196 00:09:00.130 --> 00:09:03.550 and not as a help to achieve a task. 197 00:09:03.550 --> 00:09:06.070 And it takes a lot of convincing, 198 00:09:06.070 --> 00:09:07.400 a lot of explanation. 199 00:09:07.400 --> 00:09:09.950 SHERVIN KHODABANDEH: Do you have any examples of AI projects 200 00:09:09.950 --> 00:09:12.210 or products that are being well received at L'Oréal? 201 00:09:12.210 --> 00:09:16.150 Stéphane Lannuzel: We are developing a solution to detect 202 00:09:16.150 --> 00:09:16.980 beauty trends. 203 00:09:16.980 --> 00:09:18.490 It's called TrendSpotter. 204 00:09:18.490 --> 00:09:20.540 When you look at what is happening 205 00:09:20.540 --> 00:09:23.190 in the academic world, the research 206 00:09:23.190 --> 00:09:26.830 world, the macro-influencer world -- 207 00:09:26.830 --> 00:09:28.510 we are reading the different posts 208 00:09:28.510 --> 00:09:31.640 that they make on social media, reading some papers, 209 00:09:31.640 --> 00:09:36.840 getting some ideas, and seeing some trends emerging 210 00:09:36.840 --> 00:09:39.390 on ingredients and on the whole thing. 211 00:09:39.390 --> 00:09:43.530 So those are the initial seeds of a new trend 212 00:09:43.530 --> 00:09:44.590 that is emerging. 213 00:09:44.590 --> 00:09:47.590 And then you are looking at all these trends 214 00:09:47.590 --> 00:09:50.180 that are then being amplified by, let's say, 215 00:09:50.180 --> 00:09:51.580 the general population. 216 00:09:51.580 --> 00:09:54.140 So without revealing all the secrets, 217 00:09:54.140 --> 00:09:57.540 it's by listening in on these different groups across 218 00:09:57.540 --> 00:10:01.550 different geographies -- in Asia and the U.S., in Europe -- 219 00:10:01.550 --> 00:10:03.050 that you see some trends emerging.
220 00:10:03.050 --> 00:10:03.845 221 00:10:03.845 --> 00:10:04.970 SAM RANSBOTHAM: Sign me up. 222 00:10:04.970 --> 00:10:06.640 I mean, can I get access to it? 223 00:10:06.640 --> 00:10:08.670 I certainly need all the help that I 224 00:10:08.670 --> 00:10:10.240 can get in that department. 225 00:10:10.240 --> 00:10:13.540 I like the idea there, though, because obviously that 226 00:10:13.540 --> 00:10:16.160 can aggregate a ton of information 227 00:10:16.160 --> 00:10:18.388 from lots of different sources and let you pick up 228 00:10:18.388 --> 00:10:19.930 on things early, because I'm guessing 229 00:10:19.930 --> 00:10:23.510 you face a time crunch, too, because to go 230 00:10:23.510 --> 00:10:25.930 from an idea for a product to a product 231 00:10:25.930 --> 00:10:27.590 isn't an instantaneous thing. 232 00:10:27.590 --> 00:10:29.250 And so the more lead time you can 233 00:10:29.250 --> 00:10:32.570 get on when those products are coming, 234 00:10:32.570 --> 00:10:34.910 the more likely you are to have them on the shelves 235 00:10:34.910 --> 00:10:36.150 when someone comes in. 236 00:10:36.150 --> 00:10:37.410 How does that process work? 237 00:10:37.410 --> 00:10:40.100 How do all those pieces come together? 238 00:10:40.100 --> 00:10:42.600 Stéphane Lannuzel: When we were working on that TrendSpotter 239 00:10:42.600 --> 00:10:47.810 solution, we were really not starting from the technology 240 00:10:47.810 --> 00:10:51.550 and the idea but starting from what you said, which is, 241 00:10:51.550 --> 00:10:52.920 what is the usage? 242 00:10:52.920 --> 00:10:55.820 What is the usage of knowing a trend? 243 00:10:55.820 --> 00:10:59.910 And basically, what we realized [is] there was one use -- 244 00:10:59.910 --> 00:11:02.310 the one that you mentioned -- which is, OK, 245 00:11:02.310 --> 00:11:06.570 how can we identify a trend that will then translate 246 00:11:06.570 --> 00:11:09.920 into a product that will be launched, as you rightly said, 247 00:11:09.920 --> 00:11:11.570 in 12 to 18 months? 248 00:11:11.570 --> 00:11:13.310 So basically, the obvious need [is] 249 00:11:13.310 --> 00:11:16.510 that we need, early on, to be able to detect 250 00:11:16.510 --> 00:11:18.560 some early-stage trends. 251 00:11:18.560 --> 00:11:22.220 But there was another use that we discovered: 252 00:11:22.220 --> 00:11:27.430 You can also use a new trend to be able to activate part 253 00:11:27.430 --> 00:11:31.470 of your existing portfolio, meaning that you see a trend 254 00:11:31.470 --> 00:11:35.460 popping [up] -- and we do have quite a wide range of products. 255 00:11:35.460 --> 00:11:37.870 So there are probably products that 256 00:11:37.870 --> 00:11:41.510 are corresponding to that trend, so then you will work on, OK, 257 00:11:41.510 --> 00:11:44.408 what can I do in terms of activating 258 00:11:44.408 --> 00:11:46.450 these products to be able to answer the trend? 259 00:11:46.450 --> 00:11:50.230 So what you can see is, trends have different horizons, 260 00:11:50.230 --> 00:11:52.820 and depending on the horizon, 261 00:11:52.820 --> 00:11:53.950 you can choose what you do. 262 00:11:53.950 --> 00:11:56.460 And that's what you discover when 263 00:11:56.460 --> 00:12:00.490 you do proper user research to understand their needs.
264 00:12:00.490 --> 00:12:03.000 That was a new expertise for us that we 265 00:12:03.000 --> 00:12:06.120 have acquired to make sure that in all the solutions 266 00:12:06.120 --> 00:12:09.870 and services that we develop, UX is really at the center. 267 00:12:09.870 --> 00:12:11.600 And there are some skills that we have 268 00:12:11.600 --> 00:12:13.490 developed and internalized. 269 00:12:13.490 --> 00:12:16.370 Basically, we were doing consumer research. 270 00:12:16.370 --> 00:12:20.510 We were doing some research on the design of the packaging, 271 00:12:20.510 --> 00:12:22.980 but now we are doing exactly the same work. 272 00:12:22.980 --> 00:12:26.810 We have different specialists on the digital services 273 00:12:26.810 --> 00:12:28.210 that we developed. 274 00:12:28.210 --> 00:12:31.520 And it's really key to develop services that 275 00:12:31.520 --> 00:12:33.130 make an impact for consumers. 276 00:12:33.130 --> 00:12:35.580 SHERVIN KHODABANDEH: That's fantastic. 277 00:12:35.580 --> 00:12:37.540 Sam, do you want to move to the five questions? 278 00:12:37.540 --> 00:12:39.860 SAM RANSBOTHAM: We have a segment 279 00:12:39.860 --> 00:12:42.500 where we ask you a series of rapid-fire questions. 280 00:12:42.500 --> 00:12:45.480 So just say the first thing that comes to the top of your mind. 281 00:12:45.480 --> 00:12:47.520 What's been your proudest moment with AI so far? 282 00:12:47.520 --> 00:12:52.230 Stéphane Lannuzel: It's always difficult to pick one. 283 00:12:52.230 --> 00:12:56.900 Very top of my mind, I think, is when 284 00:12:56.900 --> 00:13:00.080 we launched a solution that is helping 285 00:13:00.080 --> 00:13:02.880 to do a quick analysis of reviews and ratings 286 00:13:02.880 --> 00:13:07.910 and really to see how people can now leverage what consumers are 287 00:13:07.910 --> 00:13:10.330 saying about our products, and really leveraging 288 00:13:10.330 --> 00:13:13.060 that information at scale throughout the world, 289 00:13:13.060 --> 00:13:14.060 throughout the category. 290 00:13:14.060 --> 00:13:18.170 So for them, it was really mind-blowing to get access 291 00:13:18.170 --> 00:13:20.290 to that information at that scale. 292 00:13:20.290 --> 00:13:23.758 SAM RANSBOTHAM: What worries you about artificial intelligence? 293 00:13:23.758 --> 00:13:25.550 Stéphane Lannuzel: Always the same subject, 294 00:13:25.550 --> 00:13:30.110 which is about bias -- bias that creeps in without us seeing it. 295 00:13:30.110 --> 00:13:33.410 SAM RANSBOTHAM: What's your favorite activity that does not 296 00:13:33.410 --> 00:13:36.730 involve technology -- that has no technology? 297 00:13:36.730 --> 00:13:39.020 Stéphane Lannuzel: On the personal side 298 00:13:39.020 --> 00:13:40.340 or the professional side? 299 00:13:40.340 --> 00:13:41.200 SAM RANSBOTHAM: Personal side, yeah. 300 00:13:41.200 --> 00:13:42.825 Stéphane Lannuzel: I would say running, 301 00:13:42.825 --> 00:13:44.430 because it is my favorite activity. 302 00:13:44.430 --> 00:13:47.470 But it doesn't qualify as not involving technology, 303 00:13:47.470 --> 00:13:51.100 because I use a watch, and then I track my performance 304 00:13:51.100 --> 00:13:51.860 using technology. 305 00:13:51.860 --> 00:13:53.860 SAM RANSBOTHAM: I think that's a common trend -- 306 00:13:53.860 --> 00:13:56.027 that everyone finds that whatever they're doing that 307 00:13:56.027 --> 00:13:58.367 involves no technology actually does involve technology.
308 00:13:58.367 --> 00:13:59.825 Stéphane Lannuzel: It may, yes. 309 00:13:59.825 --> 00:13:59.942 310 00:13:59.942 --> 00:14:01.650 SAM RANSBOTHAM: What was the first career 311 00:14:01.650 --> 00:14:02.970 that you wanted as a child? 312 00:14:02.970 --> 00:14:04.678 What did you want to be when you grew up? 313 00:14:04.678 --> 00:14:07.260 Stéphane Lannuzel: I graduated as a civil engineer, 314 00:14:07.260 --> 00:14:10.860 and I wanted to build bridges, so that's what I started to do 315 00:14:10.860 --> 00:14:12.880 in the very beginning of my career. 316 00:14:12.880 --> 00:14:15.830 But I quickly moved away from it and worked, 317 00:14:15.830 --> 00:14:17.600 as you mentioned earlier, in the banking 318 00:14:17.600 --> 00:14:20.280 industry doing project finance for infrastructure projects. 319 00:14:20.280 --> 00:14:22.030 SAM RANSBOTHAM: What's your greatest 320 00:14:22.030 --> 00:14:24.720 wish for AI in the future? 321 00:14:24.720 --> 00:14:29.700 Stéphane Lannuzel: I think it's really helping us to improve 322 00:14:29.700 --> 00:14:34.420 the world in which we live and helping us solve the climate 323 00:14:34.420 --> 00:14:36.010 issue that we face. 324 00:14:36.010 --> 00:14:39.230 I'm a strong believer that only technology will 325 00:14:39.230 --> 00:14:42.960 help us find solutions to face the difficult situation 326 00:14:42.960 --> 00:14:43.743 in which we find ourselves. 327 00:14:43.743 --> 00:14:44.910 SAM RANSBOTHAM: Sounds good. 328 00:14:44.910 --> 00:14:46.752 Stéphane, thanks for a great discussion. 329 00:14:46.752 --> 00:14:48.210 We really enjoyed talking with you. 330 00:14:48.210 --> 00:14:50.738 Thanks. 331 00:14:50.738 --> 00:14:52.280 Join us next time, when Shervin and I 332 00:14:52.280 --> 00:14:54.660 speak with Teddy Bekele, chief technology 333 00:14:54.660 --> 00:14:55.965 officer of Land O'Lakes. 334 00:14:55.965 --> 00:14:58.592 335 00:14:58.592 --> 00:15:00.050 ALLISON RYDER: Thanks for listening 336 00:15:00.050 --> 00:15:01.560 to Me, Myself, and AI. 337 00:15:01.560 --> 00:15:04.010 We believe, like you, that the conversation 338 00:15:04.010 --> 00:15:06.230 about AI implementation doesn't start and stop 339 00:15:06.230 --> 00:15:07.347 with this podcast. 340 00:15:07.347 --> 00:15:09.180 That's why we've created a group on LinkedIn 341 00:15:09.180 --> 00:15:11.020 specifically for listeners like you. 342 00:15:11.020 --> 00:15:13.760 It's called AI for Leaders, and if you join us, 343 00:15:13.760 --> 00:15:15.790 you can chat with show creators and hosts, 344 00:15:15.790 --> 00:15:18.500 ask your own questions, share your insights, 345 00:15:18.500 --> 00:15:21.150 and gain access to valuable resources about AI 346 00:15:21.150 --> 00:15:23.990 implementation from MIT SMR and BCG. 347 00:15:23.990 --> 00:15:29.110 You can access it by visiting https://mitsmr.com/AIforLeaders. 348 00:15:29.110 --> 00:15:31.830 We'll put that link in the show notes, 349 00:15:31.830 --> 00:15:34.270 and we hope to see you there. 350 00:15:34.270 --> 00:15:39.000