SAM RANSBOTHAM: Many retailers are personalizing their product offerings, but few do it using AI products inspired by employees with limited technical background. Join us when we talk with Katia Walsh, chief global strategy and AI officer at Levi Strauss & Co., about how the company's AI boot camps upskill its workforce and inspire innovation.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Idea program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities across the organization and really transform the way organizations operate.

SAM RANSBOTHAM: Today, Shervin and I are excited to be talking with Katia Walsh, chief global strategy and AI officer at Levi Strauss & Co. Katia, thanks for taking the time to talk with us. Welcome.

SHERVIN KHODABANDEH: Thank you for joining us, Katia.

KATIA WALSH: My pleasure. Thanks for having me.

SAM RANSBOTHAM: Can you tell us about your current role? What are you doing for Levi Strauss now?

KATIA WALSH: I'm responsible for a fusion of strategy and artificial intelligence, and to tell you a little bit more about that, it's really building an integrated capability that connects emerging technologies, data, and AI in one holistic capability in service of our strategic goals.

SAM RANSBOTHAM: How long have you been doing that?

KATIA WALSH: I've been the chief strategy and AI officer for Levi Strauss & Co. for the last two and a half years.
I joined very, very soon before the pandemic began, so it has been an induction by fire, and it has also been a great opportunity to show the power of technology in times like this.

SAM RANSBOTHAM: Yes, it's a great time and a challenging time, I'm sure. You didn't start off, though, in this role. Tell us what was happening before two and a half years ago. I believe you actually started as a journalist in Bulgaria. Connect the dots between that and Levi Strauss.

KATIA WALSH: I did grow up in communist Bulgaria at a time when Levi's signified so much more than fashion or clothing; it was really the flag of freedom. It was about independence and democracy and the unattainable. I did start as a journalist for one of the very few independent publications in the country, and very early in my life, I learned the value and the power of information, and information is really data. Newspaper stories are data; there is a field now called data journalism. I didn't know that at the time, but I learned about the power of information, and I developed a real passion for it.

And then I had the opportunity to come to the United States on a full scholarship and continue my education, and that was in the heyday of the internet, so that's when I developed my second passion, for the power of technology to amplify the power of information or data. Then I continued my education. I got into academia and developed my third passion, for machine learning and statistics. I'm not an engineer, I'm a statistician, but through my education in statistics and machine learning, I developed this third passion, about the power of machine learning to help us drive desired outcomes.

SAM RANSBOTHAM: What are some of those outcomes you're trying to drive right now at Levi Strauss?

KATIA WALSH: This is an industry, whether it's apparel or fashion or retail in general, outside of the likes of Amazon.
This industry has been quite analog, manual, imprecise, and traditionally not the best citizen of the planet. So my mission at Levi's, together with the teams with which I partner and create and grow, is to help the company transform: What used to be analog is now digital, what used to be manual is now automated, and what used to be intuitive is now precise. We are driving the full digital transformation of the enterprise, but also a disruption of the entire industry. This is the biggest thing to happen to this whole industry since the first industrial revolution.

SHERVIN KHODABANDEH: That's really quite fascinating, Katia. Share a little bit more about some examples of that revolution that's happening in the industry, either at Levi's -- whatever you're free to share -- or anything outside, so that our audience can have a better sense.

KATIA WALSH: Yes, Shervin. I'm happy to share what we are doing at Levi Strauss & Co. There are three C's that we center on when we deploy these great capabilities around digitization, data, and AI. And the first C is always about connections with our consumers. Levi's as a company, in its 169 years, for most of that time had been a manufacturer. It had not had a direct connection with its consumers. But we recognize the importance of deepening the connections we have, not just with consumers but with our fans. What is great about Levi's -- an iconic brand like it -- is that our consumers are not just consumers; they are ardent fans that literally tattoo the brand on themselves. And so we want to deepen that connection, and we want to use everything technology has to offer to us to do so.

And of course you know what I'm talking about. It starts with some things like personalization, [which] everyone else is doing but we would like to think we are doing even more. One example of that is, we are completely personalizing the online experiences.
So when you go -- whether it's on the app or the e-commerce site -- what you see should be somewhat customized to your previous browsing behaviors, to your needs and desires, to everything we know, always shared with permission. So deepening the connections with our consumers is a big part of what we do.

Another C that we center on is, we want to make sure that we use technology, digitization, data, and AI to create smarter commerce. And this is where anything around internal efficiencies can be very helpful to the company. It may not immediately or directly touch the consumer, but it certainly has an impact on the consumer. One example of what we've been able to do during and after the recent pandemic crisis was pricing optimization. When we were facing the depth of the lockdowns, as a global company we certainly felt the impact of the pandemic all over the world. At one point, two-thirds of our stores had to be closed. On the one hand, being global gave us diversification: Where we had to be closed, in other parts of the world we were able to be open, and that gave us some learnings. On the other hand, we did feel the impact all over the world, and we used AI to determine the optimal price at which our products would sell anywhere in the world, through what channel, at what price, to which consumer. And that was very helpful, because we did not have to actually discount. A lot of our competitors did not have this incredibly powerful tool, and they had to discount as they were facing piles of inventory. But because of the strength of the Levi's brand, and because of the application of machine learning, we were able to predict that our products would sell at full price, so that helped the consumers get what they needed, but it also helped the financial margins of the company.
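To make the pricing idea concrete: one common approach, and only a guess at the spirit of what Walsh describes, is to fit a demand curve from historical price and sales data and then pick the price that maximizes predicted revenue. Below is a minimal Python sketch assuming a log-linear demand model; every number is invented for illustration, and nothing here reflects Levi's actual data or methodology.

```python
# Hedged sketch: fit a log-linear demand curve to hypothetical
# (price, units sold) history, then grid-search the price that
# maximizes predicted revenue. Illustrative only.
import numpy as np

prices = np.array([49.0, 59.0, 69.0, 79.0, 89.0])      # hypothetical prices
units = np.array([520.0, 430.0, 310.0, 240.0, 160.0])  # hypothetical sales

# Fit log(units) = a + b * price; b < 0 means demand falls as price rises.
b, a = np.polyfit(prices, np.log(units), 1)

def expected_revenue(p):
    """Predicted revenue at price p under the fitted demand curve."""
    return p * np.exp(a + b * p)

grid = np.linspace(40, 100, 601)  # candidate prices
best = grid[np.argmax(expected_revenue(grid))]
print(f"revenue-maximizing price: ${best:.2f}")
```

In practice such a model would be segmented by product, channel, and market ("through what channel, at what price, to which consumer"), but the optimization idea is the same.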
And then the third C that we are also making smarter, where we apply artificial intelligence, is creation, the very nature of what this company does. And one example is that we have recently started to use AI in the design process. We now use convolutional neural networks, for example, that process thousands and thousands of images -- for example, van Gogh's Starry Night, or David Hockney's artwork or Jasper Johns's artwork -- to create new designs. And we can now create trucker jackets, which is a legendary product that Levi's literally invented, but on that there is now van Gogh artwork, which we will be producing and selling to the world.
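The design work she describes sounds like neural style transfer, in which a convolutional network's activations separate an image's content from its texture. Here is a minimal sketch of that generic technique (in the style of Gatys et al.) in Python, assuming PyTorch and torchvision are installed; the file names are hypothetical, and this is an illustration, not Levi's actual pipeline.

```python
# Hedged sketch of CNN style transfer: render a product photo
# ("jacket.jpg") in the texture of a painting ("starry_night.jpg").
# File names are hypothetical; illustrative only.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
prep = transforms.Compose([transforms.Resize((256, 256)), transforms.ToTensor()])

def load(path):
    return prep(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

content, style = load("jacket.jpg"), load("starry_night.jpg")

# Frozen VGG19; tap a few ReLU outputs as content/style probes.
vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.to(device).eval()
for p in vgg.parameters():
    p.requires_grad_(False)
taps = {1, 6, 11, 20, 29}

def features(x):
    out = []
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in taps:
            out.append(x)
    return out

def gram(f):
    # Channel-correlation matrix: a compact "texture" summary.
    _, c, h, w = f.shape
    f = f.view(c, h * w)
    return (f @ f.t()) / (c * h * w)

style_targets = [gram(f) for f in features(style)]
content_target = features(content)[2].detach()  # mid-layer keeps layout

img = content.clone().requires_grad_(True)
opt = torch.optim.Adam([img], lr=0.02)
for _ in range(200):
    opt.zero_grad()
    feats = features(img)
    loss = F.mse_loss(feats[2], content_target)  # keep the jacket's structure
    for f, g in zip(feats, style_targets):       # match the painting's texture
        loss = loss + 1e4 * F.mse_loss(gram(f), g)
    loss.backward()
    opt.step()
```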
SAM RANSBOTHAM: I think I need some Dalí pants.

SHERVIN KHODABANDEH: [Laughs.] It's quite fascinating, particularly the design example. It tees up the next question that I have in my mind: the role of the human here, particularly since you mentioned design, and I have to imagine that in the past this has always been a very human-centered process. So how is Levi's bringing humans and AI together to achieve outcomes that neither one could do on its own?

KATIA WALSH: I would venture to say that humans are actually the most important part of artificial intelligence, whether it's human-centric design, which we of course aspire to, or it's humans that are making machines smarter, and of course in turn machines help us become even better. In the case of the AI-powered design that I mentioned, what is even more fascinating is that this work at Levi's was pioneered by one of our young designers, who had no formal training in machine learning or computer science. He's one of the 101 graduates of our industry-first machine learning boot camp that we pioneered in 2021. For that boot camp, we took a number of people across the entire company, everywhere in the world, from 24 locations, from every single function, including retail stores and design. We fully democratized this process of teaching machine learning so that we could get the change agents we needed for digital transformation across the company and also help ourselves in this ongoing war for talent in AI.

So the creator -- going back to the AI design part -- the creator of this was actually a graduate of this boot camp and is absolutely central to the design process.

SHERVIN KHODABANDEH: Can you comment more about this program you were talking about, to really educate and upskill and reskill?

SAM RANSBOTHAM: I think you said 101 graduates so far; is that right?

KATIA WALSH: That's right. We've had two classes -- we call them cohorts -- from our 2021 application process. We had about 450 applications. This is not a program for everyone, just to be clear, because it takes people out of their day job for eight weeks. And I have my colleagues to thank for making their people available for eight weeks. So we'll take people for eight weeks out of their day job. It was incredibly immersive and intensive and hands-on. We called it a boot camp for a reason. They literally had no time to do anything else, and they were exhausted when the time for graduation came.

They worked with real data to solve Levi's problems, and we were actually able to deploy the models they created after the boot camp -- models that were looking at prediction of demand, as I mentioned earlier; that's incorporated into that work. The AI-powered design will also be taken further, but also other things like personalization of our marketing messages; that is something that the boot camp graduates worked on. So we created this for Levi's people, with Levi's data, to solve Levi's problems, and we are now in the process of selecting the next graduates, who will start in April and graduate in May, and then we'll have another cohort -- another class -- in the fall of 2022 as well. So this is an ongoing effort, and I'm very proud of that.

SHERVIN KHODABANDEH: This is really great; it's fascinating. So it seems like in a few years, you'll have several hundred folks that I'm assuming are going to be embedded in different lines of business, right?
These are not your technical folks that are maybe in the technology or AI or data science or engineering groups. What a great way to upskill and immerse folks in the business about the power of AI. Is the ambition to continue this until almost everyone's gone through this, or what's the ambition?

KATIA WALSH: Well, Shervin, first of all, you are right that most of the people who graduate from the boot camp go back to their roles. They're not necessarily changing their job. There are people who want to become very advanced data scientists, and of course we don't want to deprive them of that opportunity, and we do give them that opportunity when the time comes. But the vast majority stay in their existing roles and thus upgrade their own roles.

And in that context, I want to mention two other C's that we are also targeting through this combination of digitization, data, and AI capability we are building. I did mention earlier the connections with consumers, the commerce that we are making even smarter, and, of course, the creation process. We have now two other smarter C's. One is careers. People who go through this boot camp do change their outlook and their ability to have a career, whether it's at Levi's or outside of Levi's, although I'm proud to also say that in this time of the Great Resignation, the vast majority of the graduates have stayed with Levi's, and I do credit the boot camp, at least to a certain extent, for that. And then the other smart C is culture. These people are now helping us change the culture in the whole enterprise globally. They think differently; they speak the language; they connect with data scientists, engineers, and product managers. And so, collectively, through all of that, we are transforming the company for its next 169 years.

SAM RANSBOTHAM: Katia, can you give us another example of someone who's graduated from the boot camp that's gone on to do a good project?
KATIA WALSH: We have lots of examples, actually. We have 101 examples, at this point. But one other example that I think is also particularly compelling is about a graduate who is a retail store manager, someone who had never seen code in her life before. Her job had been as a stylist for 11 years, so she's very close to the consumers. She talks to consumers all the time; she helps them make exciting decisions; she recommends what they should pair with what. But now, she has acquired skills that have enabled us to create a model that bundles items in our vast array of inventory that work very well with each other, that create outfits. And so, through this automated process, this retail store manager in our Denver premium outlet store is able to proactively go ahead and suggest something that's not only her own personal idea but is based on the recommendations of a machine learning model. She's able to recommend bundles of items, entire outfits, to our consumers. And, of course, because it's a model, it's always learning; she's always getting new data, and it's always getting even better.

SHERVIN KHODABANDEH: What a great story. Send that woman to Sam for some fashion.
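As a rough illustration of how an outfit-bundling model can work: one classic approach is market-basket analysis, scoring item pairs by "lift," that is, how much more often two items are bought together than chance would predict. The Python sketch below uses invented transactions and item names; the actual Levi's model is not public, so treat this as a generic stand-in.

```python
# Hedged sketch: rank bundle candidates for an anchor item by
# co-purchase lift. Transactions and item names are invented.
from itertools import combinations
from collections import Counter

transactions = [
    {"501 jeans", "trucker jacket", "white tee"},
    {"501 jeans", "white tee"},
    {"trucker jacket", "beanie"},
    {"501 jeans", "trucker jacket"},
    {"white tee", "beanie"},
]

n = len(transactions)
item_counts = Counter(i for t in transactions for i in t)
pair_counts = Counter(
    frozenset(p) for t in transactions for p in combinations(t, 2)
)

def lift(a, b):
    """How much more often a and b co-occur than if independent."""
    p_ab = pair_counts[frozenset((a, b))] / n
    return p_ab / ((item_counts[a] / n) * (item_counts[b] / n))

def suggest_bundle(anchor, k=2):
    """Top-k companions for an anchor item, ranked by lift."""
    others = (i for i in item_counts if i != anchor)
    return sorted(others, key=lambda i: lift(anchor, i), reverse=True)[:k]

print(suggest_bundle("501 jeans"))  # e.g., items to pair with 501s
```

Because new transactions keep arriving, the counts (and therefore the suggestions) can be refreshed continuously, which matches the "always learning" point above.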
KATIA WALSH: I'll give you one other example of a machine learning boot camp graduate and what she did, a woman who works in our Las Vegas distribution center. She went back to her old job, where she had been facing an ongoing problem for years, and the problem was that every day, the distribution center experiences downtime -- something goes wrong. Equipment breaks, a part wears out, and the distribution center would have to go anywhere from 15 minutes to two hours with no work -- just complete downtime, while we would continue to have to pay, of course, the cost of labor, and we would be missing out on the profits of shipping orders out. So this young woman went back to her job with the skills she had learned, and she said, "Well, I can now tackle this problem." So she created a predictive maintenance model that now predicts with a great deal of accuracy what equipment is going to malfunction in the next 30 days. Moreover, she designed an app that shows those predictions in an easy-to-view way and automatically dispatches technicians to go ahead and preventatively check the equipment, so now there's no downtime in this particular distribution center.
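For readers curious what such a model might look like: predictive maintenance is commonly framed as binary classification ("will this equipment fail within 30 days?") over usage and sensor features. The sketch below uses scikit-learn on synthetic data; every feature name and number is invented, not the actual distribution-center system.

```python
# Hedged sketch: classify whether equipment fails within 30 days
# from synthetic usage/sensor features, then flag machines for a
# preventive technician visit. Illustrative only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 5000, n),  # hours since last service (hypothetical)
    rng.normal(1.0, 0.3, n),  # vibration RMS reading (hypothetical)
    rng.uniform(0, 720, n),   # part age in days (hypothetical)
])
# Synthetic label: failure within 30 days, likelier when worn/vibrating.
risk = 0.0004 * X[:, 0] + 1.5 * np.maximum(X[:, 1] - 1.2, 0) + 0.001 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(2.5 - risk))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("holdout accuracy:", round(model.score(X_te, y_te), 3))

# The "app" layer: flag high-risk machines so a technician checks
# them before they break, instead of reacting to downtime.
flagged = model.predict_proba(X_te)[:, 1] > 0.5
print("machines flagged for preventive check:", int(flagged.sum()))
```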
SAM RANSBOTHAM: What are some other things that you think are involved with responsible use of AI and ethical and trusted use of AI within Levi Strauss? I know this is something you've been thinking about and working on.

KATIA WALSH: [I've been] very much thinking about and working on AI for good. It's a very powerful tool, as we know, but like any tool, it can be used for good and for not-so-good. And one of the reasons I'm so happy to be at a company like Levi's is that it really does bring a lot of values that transcend an industry or geography or even an era. So within the context of data and AI, we have set a code of ethics for everyone who works with data in the company. Over time, that will be everyone in the entire enterprise. Everyone who works with data in the company at this time has to actually sign a code of ethics, not unlike the Hippocratic oath, to make sure that we always protect our consumers, our company, our shareholders, because how can you delight consumers if you're not protecting them?

And then there are other things that we are doing to ensure that we use data and machine learning with utmost care. For example, as you well know, there is still a lot of opportunity, unfortunately, for bias in models and algorithms and the outcomes from that, so while we know that you cannot ever completely eliminate bias in life, we are doing our best to minimize it, and there are three ways in which we do so.

One is through the people. The more diverse the people we have who work with data and bring data and create algorithms, the more likely we'll have these implicit checks and balances to ensure that we minimize bias.

The second is the data. One reason we bring such diverse data is not only because it can enrich our models but because it can also help minimize bias. So that's why the data sets are so different, and sometimes we bring together data sets that have never met in the past -- and it's amazing what you can find out when that happens -- but that also helps minimize bias.

And the third way in which we work to minimize bias is the diversity of tools. We purposefully deploy a great deal of open-source tools. We make sure that, yes, while it's beneficial to work with certain vendors, we also always want to stay on top of what's next. And what's great about open-source tools is that they're worked on literally all over the world by anyone who has the skills. That is one of the reasons we also deploy open-source tools: to ensure diversity.

SHERVIN KHODABANDEH: I want to go back to one of the comments you made a bit earlier about modernizing, digitizing, and infusing with deeper, better data and analytics those functions in the company, particularly in a sector like fashion retail, that maybe have traditionally been quite analog. Like, you talked about merchandising and planning, and I'll add to it maybe forecasting and pricing and things of that nature. Can you share some stories about how you've been able to bring folks who grew up sort of in a different era or are much more used to the old way of doing things, and how do you bring the new and old together in a collaborative way?

KATIA WALSH: I think what you're talking about is, in general, how one can shepherd a change in a company or in an organization in general, and this has been my entire career.
I spent 20-plus years at the intersection of technology and data and analytics and machine learning, and most of that career has actually been helping companies transform themselves to meet their strategic goals. It's particularly challenging with technology, and especially when you look at a particular technology like AI, because it can be seen as so intimidating. And so one of my aspirations and missions, actually, has been to humanize it -- to make it closer to people, to give it a face, to help people understand that not only is it not there to replace jobs, for example, but it is there to help them succeed even more, to make them even smarter. And that's one of the reasons I introduced this machine learning boot camp, not only at Levi Strauss & Co. but at other companies as well. So humanizing this capability is very important, and how exactly we do that may depend on the specific context of a company or an organization, and certainly even within a time that can change. But I really believe in exciting and inspiring people about it and helping them get on board because they want to, not because they feel threatened.

SHERVIN KHODABANDEH: Are there lessons there for others who are playing similar roles as agents of change, in terms of how they might go about implementing changes of this sort?

KATIA WALSH: Yes. One thing that I have found very helpful -- and this is now the third company in which I am transforming the enterprise, not alone, and that's actually one of the lessons: You cannot do it alone. You have to make sure that you garner that support, and that support has to be throughout the organization -- certainly at the top leadership level, but also throughout, grassroots and sideways, as well. So getting that support is really critical. But in addition to that, I've had this motto of "think big, start small, and scale fast." I suppose the "start small" can be seen as a compromise, but I don't see it as a compromise. I see it as an opportunity to deliver immediate value. And that's a big part of getting people on board.
When we are able to show value very quickly, even if it's not the biggest value in the world, it has to be meaningful to excite people. And if people are able to see it very quickly and very concretely in their own business unit, function, geography, that certainly gets people on board, because they see, "Wow, this is helping me solve a problem that I've been looking to tackle all this time," or, "This is helping me meet my commercial goal that otherwise I might have struggled to achieve." That helps a great deal. And what it also does is, if there are people sitting on the sidelines, it makes them want to say, "Well, how come I don't have this? I want to get this as well." So it creates a little bit of healthy competition.

So to summarize, some of what I found helpful has been certainly ensuring support throughout the company, and one way in which it has helped me to do that is to think big, start small, scale fast, show that immediate value, and get people on board.

SHERVIN KHODABANDEH: And educate and humanize, as you talked about earlier.

KATIA WALSH: That also.

SHERVIN KHODABANDEH: It's not a simple answer. It's a real transformation.

KATIA WALSH: Yes. The other thing is that when we build these capabilities, I want to actually address a potential fallacy. A lot of companies [embark] on a digital transformation, as we are calling it now. We've been digital for the past 25 years as a world, but I guess COVID has accelerated this need to modernize businesses in every industry. But a lot of companies embark on that without having a clear vision or view of why they're doing that. And that's why the fusion with strategy is so important: because it gives that "why." But the other thing to keep in mind, too, is, it's not about building technology. Yes, technology matters a great deal. It's a key enabler. But we have to be careful about "build it and they'll come." We have not built the perfect data ocean. We have it; we have gotten it started.
It will never be perfect, because data is never perfect. It's always coming at us like a tsunami. I wouldn't say that we have built the perfect platforms yet, either -- again, because they're developing so fast in such a dynamic field. But what we are doing is showing that value very consistently and, hopefully, growing it over time. And that's what the essence of transformation is.

SAM RANSBOTHAM: Katia, it's great talking with you. Thanks for taking the time to talk with us. We've really enjoyed it.

SHERVIN KHODABANDEH: It's really been great. Thank you, Katia.

KATIA WALSH: My pleasure, Sam and Shervin. Nice to meet you.

SAM RANSBOTHAM: Thanks for listening. Next time, we chat with Kobi Abayomi, senior vice president of data science at Warner Music Group. Don't worry -- neither Shervin nor I will sing.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.