SAM RANSBOTHAM: Skincare products are inherently physical, not virtual. How can companies use AI to make choosing skincare products possible online? Find out on today's episode.

SOWMYA GOTTIPATI: I am Sowmya Gottipati from Estée Lauder, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of analytics at Boston College. I'm also the AI and business strategy guest editor at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build and to deploy and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Shervin and I are excited today to be talking with Sowmya Gottipati, head of global supply chain technology for Estée Lauder. Sowmya, thanks for taking the time to talk with us. Welcome.

SOWMYA GOTTIPATI: Glad to be here. I'm really excited to be here talking about AI, one of my favorite topics.

SHERVIN KHODABANDEH: Great to have you.

SAM RANSBOTHAM: Let's start with your current role at Estée Lauder. What do you do now?

SOWMYA GOTTIPATI: I've been with Estée for about two and a half years. We are a prestige and luxury beauty brand company, and we have around 30 brands under our umbrella, Estée Lauder being the flagship brand, but we also have Clinique, MAC, La Mer, and several others. We are a global company, with our products across various regions, from Asia to [North America] and Latin America, Europe, and everywhere. I'm responsible for global supply chain technology.
I am responsible for the technology that powers the entire supply chain globally. That includes inventory, supply, demand planning, manufacturing, distribution centers, fulfillment, transportation -- the end-to-end supply chain -- all the technology capabilities that support that.

SAM RANSBOTHAM: All right; I didn't hear anything about artificial intelligence in that, though. So how does artificial intelligence connect with that supply chain?

SOWMYA GOTTIPATI: I took the supply chain role just about four months ago. Prior to that, I was responsible for brand technology for the Estée Lauder brand. That was really where I was directly involved in a lot of AI applications: how we use AI at Estée Lauder, starting with the consumer experience; how we are enhancing our consumer experience with AI technology and providing real-life applications, which I can talk [about] a little bit, with the virtual try-on [tool] and so on and so forth. Then personalization is a very important area, and the application of AI there is very significant. AI is also being used to create new products, such as skincare and fragrance. This is where we can use data to inform ourselves on what kinds of ingredients and what types of products people like, so we can inform our product planning using AI. And then there is what I call the agile enterprise: There are a number of areas, such as supply chain and R&D, where AI is applied to run an efficient organization. These are all various areas where we are using AI.

SHERVIN KHODABANDEH: Sounds like it's quite prevalent across the whole value chain. I did read up on something -- I think a talk you'd given -- on how AI's being used to help personalize fragrance. Is that right?

SOWMYA GOTTIPATI: That's right.

SHERVIN KHODABANDEH: Can you comment on that a bit?

SOWMYA GOTTIPATI: Absolutely. That's definitely one of the most exciting projects that I've worked on. It's an industry breakthrough, so I feel very proud about it.
It's a fragrance recommendation engine, but it takes advantage of neuroscience and AI and olfactory science. We are bringing all three sciences together to make that happen. You know, the human brain has approximately 400 olfactory receptors, and we are working with a company that can actually replicate those receptors in a lab environment, so if you take a particular fragrance, we can actually tell which of the olfactory receptors in your brain are activated by that fragrance.

SHERVIN KHODABANDEH: So are these like neuromorphic chips, or are these silicon-based software?

SOWMYA GOTTIPATI: It's not software. These are actually biosensor tests.

SHERVIN KHODABANDEH: That's pretty cool.

SOWMYA GOTTIPATI: Yeah, it's really cool. So we would be able to tell [that] receptors 67 and 92 and 86 are triggered by this particular fragrance. And let's say that fragrance is predominantly lavender-based. By the way, your brain can't really tell the difference between lavender and woody, so I might be able to bring a woody fragrance to you and the same receptors might get triggered, because they're evoking the same emotion in your brain. So, because the same receptors are being triggered, we can tell, "Oh, by the way, just because you love lavender, you might like this other fragrance that may be woody," which smells totally different, but they have the same effect on your brain, or they trigger the same emotional reaction in your brain.

SHERVIN KHODABANDEH: That's very cool. And that's live?

SOWMYA GOTTIPATI: That's live, yes. We are piloting that in China right now, and we're trying to expand it to other areas as well. The way we implemented it is interesting, because we started off with online, because selling fragrances online is very difficult -- how do you smell [it], right? At least ... we don't have that technology yet; maybe 10 years from now.
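[The receptor-matching idea Gottipati describes can be sketched in a few lines of Python. This is a hypothetical toy example, not Estée Lauder's actual engine; the fragrance names, receptor IDs, and the Jaccard overlap measure are all assumptions made for illustration. A real system would presumably learn weighted activation strengths from biosensor data rather than use binary receptor sets.]

    # Toy sketch (not Estée Lauder's system): recommend fragrances whose
    # olfactory-receptor activation patterns overlap with one the customer likes.
    # Receptor IDs and fragrances below are invented for illustration.
    from typing import Dict, Set

    FRAGRANCE_RECEPTORS: Dict[str, Set[int]] = {
        "lavender_blend": {67, 86, 92, 120},
        "woody_accord": {67, 86, 92, 301},   # smells different, similar activation
        "citrus_splash": {12, 45, 200},
    }

    def receptor_overlap(a: Set[int], b: Set[int]) -> float:
        """Jaccard similarity of two receptor-activation sets (1.0 = identical)."""
        return len(a & b) / len(a | b) if (a or b) else 0.0

    def recommend(liked: str, top_k: int = 2) -> list:
        """Rank other fragrances by how closely their activation pattern matches."""
        liked_profile = FRAGRANCE_RECEPTORS[liked]
        scored = [(name, receptor_overlap(liked_profile, profile))
                  for name, profile in FRAGRANCE_RECEPTORS.items() if name != liked]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

    # A customer who loves the lavender blend gets pointed to the woody accord,
    # because both activate receptors 67, 86, and 92.
    print(recommend("lavender_blend"))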
But this is why we came up with this technology: to see [if] maybe we can use facial recognition, and the facial recognition can identify the emotion that you're feeling based on very subtle changes in your face when you smell it. Based on that, we can recognize how you are reacting to each of those fragrances. You get a score, and based on that, we could tell whether you liked it, didn't like it, or liked it moderately -- on a scale of one to 10, how much you liked it -- and we use that data.

SHERVIN KHODABANDEH: So as a customer, you're looking at my facial recognition?

SOWMYA GOTTIPATI: Correct.

SHERVIN KHODABANDEH: And then deciding what's the right fragrance for me.

SOWMYA GOTTIPATI: Correct.

SHERVIN KHODABANDEH: One of the things that Sam and I have been probing into over the past several years is the collaboration between human and AI and how that collaboration is so much more accretive than the pure tech or the pure human, and how they complement each other. And it seems to me that fragrance and makeup and these things are so personal. And I have to imagine that in the AI solutions you talked about, there must be -- or should be -- a fair amount of human intervention or collaboration. Can you comment on any of that? I get the recommender system and how it works and the receptors and all that, but is there a human side to this as well -- that maybe the experts and maybe the customers are interacting with the recommendations of the AI and adapting them?

SOWMYA GOTTIPATI: Absolutely. Historically, when you try lipstick, how many can you try? Maybe three, four, or five. You can't do more than that because after a while, the skin starts drying up and it's uncomfortable. But now, with the virtual try-on capability, you could try 30 shades of lipstick in 30 seconds. Same thing with foundation. We have 56 shades of foundation, which are only slightly different from one another. We take pride in providing high-touch service, and in each of our stores, we have beauty advisers.
Their job is to work with the customer and recommend various foundations, lipsticks, etc. How do you try so many different foundations? You can't, whereas AI can narrow it down for you. These virtual try-on applications can narrow it down to two or three, and from there, a beauty adviser can actually work with the customer. So a beauty adviser is there to help them choose what actually looks better and have that conversation, and also explain why [they would] recommend this for your skin based on the results that you want to achieve, whether it be [addressing] acne or dryness, etc. So we don't see that going away, that human-machine interaction; it will always be there.

SHERVIN KHODABANDEH: Is there a feedback loop whereby the machine gets smarter? For example, the beauty adviser says this, or the customer -- now you've narrowed it down from 60 shades to three -- but based on the final choice they make, I assume the algorithms are getting smarter from that interaction as well.

SOWMYA GOTTIPATI: Yes, absolutely. There are two ways it happens. One is, we have a consumer data platform that has information about what you previously bought, what you liked, what your situation is, etc., so it feeds into that, so that next time when you come into the store, or when you interact with us, we can say, "Hey, by the way, last time you bought this, so I can reserve that for you, or I could recommend something else." And the second thing is, when we rolled out virtual try-on applications, we started off with a million faces for the data modeling. Now it has a hundred million faces. So that algorithm and the engine are constantly improved over a period of time.

SHERVIN KHODABANDEH: These are faces of actual customers, right?

SOWMYA GOTTIPATI: Actual customers. That's right.

SAM RANSBOTHAM: How does that beauty adviser work with the platform to get that feedback back into the system?
I guess you can see what they actually ordered or what they chose or what they preferred. How do they get that input back in?

SHERVIN KHODABANDEH: I can tell, Sam, you're intrigued by the beauty adviser concept.

SOWMYA GOTTIPATI: We have very strict privacy laws, so when people are buying in the store, a lot of times we actually do not gather their personal information, whereas when they're buying from us online or [via] social platforms, where there is a login and that kind of mechanism, then you have that information, so we know exactly what they bought, and that information gets passed on.

SHERVIN KHODABANDEH: One of the things we're seeing -- maybe it's a teaser of our new work to come out -- is that the ability to understand and explain why an algorithm or an AI solution makes a recommendation or pulls out a particular insight or an action, just the ability to sort of understand it rather than it being a black box, helps organizations get a lot more adoption.

SOWMYA GOTTIPATI: I can speak to our supply chain world. In the last year, we rolled out an AI application to do our supply planning and demand planning. Before, it was spreadsheets and those kinds of things. The moment we started using the AI application, we saw 30% increases in our forecasting accuracy.

SHERVIN KHODABANDEH: Exactly. Some of my clients deliberately will settle for a less precise or less accurate recommendation so that they get the adoption going. Maybe they go for less precision to trade for a little bit of explainability or the ability to override, so that way, at least people will begin to trust it more. I don't know whether you do something like that.

SOWMYA GOTTIPATI: I have not come across that, but that's a very interesting point.

SAM RANSBOTHAM: There's an angle there, too, that trades off short term and long term, Shervin. Let's say short term, they take a compromise solution that isn't quite as good.
And then they can come back in three months and say, "Hey, you overrode this and it didn't turn out as good as you thought, sunshine." It's a longer game. It's not just each one-off decision -- that short-term optimal.

SHERVIN KHODABANDEH: I did just that with my son, actually. He was going to a school dance, and he was outside, and all he had on was a T-shirt, and I said, "Wear a jacket; wear something." He says, "No, I'll be fine." And I'm like, "OK. You're going to get sick." And he got sick and ... hopefully he learned, and he is like, "Dad, you were right." [Laughs.] And I'm going to make him listen to this podcast so that he knows now, I've said this to the whole world.

SAM RANSBOTHAM: Well, Shervin, if you can drag your kids into this, I have to tell the anecdote that I ... this will shock you, Shervin, but I track the time that my kids' bus arrives every day. So I've got seven years' worth of data now of what time that bus arrives. And so my next step is, I'm predicting that. I'm trying to say, "OK, what do we think today? Is today going to be an early day? Is today going to be a late day?" And then we can leave the house at the right time, but maybe we miss it someday, so I'm not sure --

SHERVIN KHODABANDEH: How's that working? How's that working out for you?

SAM RANSBOTHAM: I'll have to come back on a later episode and see how that plays out. But at least it's real time, you know -- trying to use the dog food of the things that we talk about on the show.

Speaking of things we talk about on the show -- how's that for a segue? -- Sowmya, we have a segment where we ask our guests a series of rapid-fire questions. And so the idea is you just hear the question and you give the first response that comes to your mind. Shervin, are you doing these today, or am I doing them today?

SHERVIN KHODABANDEH: No, I'm not, because I don't have it in front of me.

SAM RANSBOTHAM: All right.
So, Sowmya, what's been your proudest AI moment?

SOWMYA GOTTIPATI: I think I already spoke about this: The fragrance application we built last year, that is really my proudest moment. But before that, when I was in my previous job, when we cracked the code on computer vision -- combining computer vision with natural language processing to break down video processing -- those were really the beginning days of AI, and we were able to build something like that, so that was really cool. This is the one thing I think is the coolest thing about technology: Technology transcends industries. It almost doesn't matter what industry it is. Technology is so pervasive. So I feel so happy that we are able to apply the same technology to totally different applications, and that's the beauty of it.

SAM RANSBOTHAM: If you'd come up with something that topped the fragrance example, I was going to be super impressed. That was already a pretty proud one. So, what worries you the most about AI?

SOWMYA GOTTIPATI: What worries me the most? I think it's the data privacy and the bias. Those, definitely, and the tracking. [On the] one hand, when I use Google Maps, I like this functionality; I like what it does. But at the same time, I know Google knows exactly where I am at every second of the day. I don't like that. So the privacy and the data-tracking piece, absolutely, [are] a problem.

SAM RANSBOTHAM: What's your favorite activity that involves no technology?

SOWMYA GOTTIPATI: Reading a book.

SAM RANSBOTHAM: Do you have any recommendations for us? You can extend your answer with a book recommendation.

SOWMYA GOTTIPATI: Recently, I read a very short book: The Night Diary. It's about a Pakistani girl during the partition between India and Pakistan -- a little girl who didn't talk but wrote a diary every day. It was a really moving and interesting book. I really liked it.
SAM RANSBOTHAM: So, what was the first career that you wanted when you were a child?

SOWMYA GOTTIPATI: Oh. [Laughs.] I wanted to be a pilot, actually.

SHERVIN KHODABANDEH: You are one, right?

SOWMYA GOTTIPATI: Yes, I am. [Laughs.]

SAM RANSBOTHAM: OK, so you get a check mark next to that one.

SHERVIN KHODABANDEH: You got that one.

SOWMYA GOTTIPATI: Well, I'm more of a recreational pilot, but I actually wanted to be a professional pilot. But that's OK; I'll settle for recreational.

SAM RANSBOTHAM: I don't know. You've gone from AT&T to NBC to Estée Lauder, so there's a next step. So, what's your greatest wish for AI in the future? What are you hoping for?

SOWMYA GOTTIPATI: This is more of an answer from my personal side of things, which is, I just hope we use AI for environmental causes more -- you know, better crops with better yields, and water conservation. And I hope there are a lot more advances on that side of things as opposed to shopping or personalized experiences.

SAM RANSBOTHAM: That's particularly interesting, coming from someone who's so interested in both of those aspects, that you think these other aspects might be even more promising. Sowmya, great talking with you; these are fascinating applications. I think that most people who listen to this are going to remember the smell example. I mean, I think there's something very visceral about that that will connect with lots of people and spur some thinking. So thanks for taking the time to talk with us. We really enjoyed it. Thank you.

SOWMYA GOTTIPATI: Oh, thank you. This is so much fun.

SHERVIN KHODABANDEH: Thank you so much. And we can have some beauty advisers work on Sam while we're talking.
SAM RANSBOTHAM: This is a podcast! Nobody knows that I've got a face for radio.

SHERVIN KHODABANDEH: Tell them to bring all the shades and foundations, and we'll see what they could do.

SOWMYA GOTTIPATI: You should go to Esteelauder.com and try the 30 shades of lipstick in 30 seconds --

SHERVIN KHODABANDEH: I'll go with you.

SOWMYA GOTTIPATI: -- and the foundation. See how it looks on you. [Laughs]

SHERVIN KHODABANDEH: I'll go with you.

SAM RANSBOTHAM: I just Googled something -- "shades of gray" -- and I get something --

SOWMYA GOTTIPATI: Not that. [Laughs]

SAM RANSBOTHAM: We've come to the end of Season 4 of Me, Myself, and AI. We'll be back on Aug. 2 with new episodes. In the meantime, we hope you'll listen to our back episodes and join our LinkedIn community, AI for Leaders, to keep the discussion going. Thanks for listening.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.