SAM RANSBOTHAM: Even digital-first companies approach technology implementations with caution, ensuring they limit their exposure to risk. In today's episode, find out how one e-commerce retailer thinks about implementing -- and scaling -- AI.

FIONA TAN: I'm Fiona Tan from Wayfair, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of analytics at Boston College. I'm also the AI and business strategy guest editor at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today Shervin and I are excited to be joined by Fiona Tan. Fiona's the CTO at Wayfair. Fiona, thanks for joining us. Welcome.

FIONA TAN: Thank you for having me.

SAM RANSBOTHAM: Let's get started. We've got listeners throughout the world who may not be as familiar with Wayfair as Shervin and I are -- we could probably look around our rooms and find Wayfair items -- so can you start by describing Wayfair? What does Wayfair do?

FIONA TAN: For sure. And first of all, thank you for being customers; [I'm] always happy to have customers to talk to. Basically, we are a digital-first retailer in the home goods category. We've also augmented that now with some stores, opening our second store in the Boston, Massachusetts, area, and so [I'm] really excited about that as well, as we move toward being an omnichannel retailer.

SAM RANSBOTHAM: That's the opposite direction from the way most people go.
FIONA TAN: You know, it is, but it's actually kind of neat; it does afford us some interesting ways of approaching it because we are digital first. I think, hopefully, you'll find that there are some really nice ways that we are able to tie in the digital aspects. You go into the store, you see what's there, but you can also see the rest of our catalog in a way that's hopefully really useful and a little bit different than the typical brick-and-mortar shopping experience.

Part of what's really interesting about Wayfair and our approach to AI/ML is that a lot of it's under the covers. You don't realize it, but what is actually powering the entire experience that you have as a customer -- and then also for our suppliers -- is a lot of machine learning and AI. It's not that visible, but it is actually powering everything that we do. For example, there's a lot around trying to understand the customer's intent. And we do that inasmuch as what they can tell us in the search strings, etc., but also based on where they've looked and how much time they've spent looking at something versus another thing. So we try to build up our customer graph, and then we also look at the products, the items that we're listing on our site. Because of the category that we're in, we don't really have as [many] branded items. So it's, how do we use AI and ML to upload as many items as possible -- and we have tens of millions of items on our site -- and be able to get as much product information as possible? Some of that we get from our suppliers, but [for] a lot of the product information, we are using AI and ML to actually glean [it] from the photos they give us, from the text that they give us, to be able to form our product understanding. So we build a customer graph, we build a product graph, all using AI/ML, and then we do that matchmaking. When you're on our site and you're looking for something, we can personalize based on what we already know about you. That's the magic: How do we find you that perfect couch when you can't really describe it to me in a very succinct way?
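To make that "matchmaking" concrete, here is a minimal sketch, assuming the customer graph and product graph have already been distilled into embedding vectors; the function name, vectors, and catalog entries are illustrative, not Wayfair's actual system.

```python
import numpy as np

def personalize_ranking(customer_vec: np.ndarray,
                        product_vecs: dict[str, np.ndarray],
                        top_k: int = 10) -> list[str]:
    """Rank catalog items for one customer by cosine similarity of embeddings."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = {pid: cosine(customer_vec, vec) for pid, vec in product_vecs.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]  # best matches first

# Toy usage: tiny 3-dimensional vectors stand in for learned graph embeddings.
customer = np.array([0.9, 0.1, 0.3])  # e.g., derived from searches and browse time
catalog = {
    "mid-century couch": np.array([0.8, 0.2, 0.4]),
    "farmhouse table":   np.array([0.1, 0.9, 0.2]),
    "velvet armchair":   np.array([0.7, 0.3, 0.5]),
}
print(personalize_ranking(customer, catalog, top_k=2))
```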
SHERVIN KHODABANDEH: Fiona, tell us a bit about your own journey. How did you get into technology, and how'd that evolve?

FIONA TAN: I went to MIT as an undergrad, and I took my first computer science class, 6.001, and fell in love with it. And it's one of those things I look back [on] and I'm like, "I'm so fortunate to find something that I enjoy doing," and I realized, "They're going to pay me money for it." And this was one of those really fortuitous moments, I think, when I realized, hey, I've always loved solving problems, I've always loved optimizing whatever I was doing, and here's a field where I get to do that in practice. I did my master's as well in computer science, and then I've worked in technology my whole career. I started out building enterprise software, so really looking at, how do you build solutions that can be adopted by any and all industries? I spent some time at Oracle, starting out, and then [was] at a company called TIBCO for a long time, essentially building enterprise platforms. And then I moved over to Walmart and now Wayfair. The platform mindset that I got from the first two-thirds of my career is still very prevalent. Even when you're building a very specific use case, you still want to try to use that platform mindset, because it then allows you to build out solutions in a much more scalable and sensible way. It's still relevant, and then you get to focus on a very specific problem set. You get to be much more business-outcome-based. And those are the things that are really different about a very specific retail use case, for example, versus building out an enterprise platform. In one case, you're very, very close to the customer, and you get to focus on [solving] a particular problem set but still building it, I would say, with the right architecture [and] platform mindset that allows you to then scale, whether it's horizontally or vertically. I think that's one of the things I have found has been useful: my background in enterprise platforms.
SHERVIN KHODABANDEH: Great. Can you comment a little bit about the overall philosophy of how you're thinking about use cases?

FIONA TAN: Absolutely. And I think that is another key tenet of how we operate. At Wayfair, we actually started out in marketing. This was an area where we felt like AI and ML could really play a big part. We said, hey, look -- from a marketing standpoint, the bidding, and how much I should bid, and where I should spend the money from a channel perspective ... those are things that we feel like we can control and are lower risk. If we get it wrong, maybe we pay a little bit more for an ad than we needed to, but these were areas that we actually invested in first, because we could learn and use AI and ML for that and control the amount of risk that we were taking on. And then once we figured that out, once we got more into the use of ML, we then looked at other areas we could apply it to. So how do we apply similar technologies in terms of our pricing and demand generation? How do we expand that out to the rest of our supply chain, the catalog, and understanding of products in search, and an understanding of the customers? So if you look at where we apply AI and ML now, it's much more prevalent, but we started out with this very specific use case around marketing and customer acquisition. That was the first place that we started using AI and ML.
SHERVIN KHODABANDEH: You have these two rules -- sort of the rules of thumb [that could serve as] good advice to many, many retailers and in other industries as well -- what are the two rules for which use cases lend themselves more to AI/ML? Tell us more about that.

FIONA TAN: We use a little bit of a risk framework around what is the reputational risk or other risk to the company if we get it wrong. Back to marketing, a lot of it's going to be, if we get it wrong, we pay a little bit more. Other areas where we don't go fully automated, because we're a little bit more concerned from a risk perspective, could be, for example, product information or product quality. I think we try to do that as much as possible. But then, to some degree, this is also something where we would include the humans in the loop to do that extra level of checks. So we don't fully automate, because if we get that wrong, that is problematic. We use that as a way to figure out what we lean into first. If you can automate fully and control the risk, that's where we feel like we can go a little faster. And then, other areas we might go in but then also involve the humans in the loop -- the controls to make sure that we have that extra level of checks. So that's one way that we look at it. The other is around data, and this is obviously something that I think a lot of other technology organizations are also thinking about when they think about ML: Are we ready from a quantity and availability of data [perspective], as well as the usability of the data? I think that's something, frankly, that a lot of companies struggle with: making sure that there's one source of truth versus now there's five people who use the source of truth, have done some adjustments to it, and now I've got five things that are sort of similar to that first source of truth, and the manageability of it becomes a bit of an issue. We look at where we've [got] good, stable sources of truth.
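As one way to picture those two rules side by side, here is a minimal sketch of a use-case triage; the fields, scales, and thresholds are invented for illustration and are not Wayfair's framework.

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    risk_if_wrong: float    # 0 = minor cost (e.g., overpaying for an ad), 1 = reputational harm
    data_readiness: float   # 0 = scattered "sources of truth", 1 = one stable, usable source

def triage(uc: UseCase) -> str:
    """Illustrative reading of the two rules of thumb: risk if wrong, and data readiness."""
    if uc.data_readiness < 0.5:
        return "hold: establish a stable source of truth first"
    if uc.risk_if_wrong < 0.3:
        return "automate fully and iterate fast"
    return "deploy with a human in the loop for the extra level of checks"

for uc in [
    UseCase("ad bidding", risk_if_wrong=0.2, data_readiness=0.9),
    UseCase("product information", risk_if_wrong=0.7, data_readiness=0.8),
    UseCase("new supply-chain signal", risk_if_wrong=0.5, data_readiness=0.3),
]:
    print(f"{uc.name}: {triage(uc)}")
```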
SHERVIN KHODABANDEH: You talked about [keeping the] human in the loop, and this idea -- as fundamental or simple as it sounds -- I think is still a misconception for many, because many still think it's not AI if there's a human involved, or that it must do everything all by itself, otherwise it's really, really not full AI. Sam and I have looked a lot at this across companies, and what we've seen is, there's a whole bunch of use cases that you're either not going to approach at all if you expect full automation, or that would be suboptimal or substandard or, as you said, highly risky. What are some examples of humans in the loop? I can imagine there might be ideas where humans and AI work together, and AI has some ideas, and the human says, "Well, maybe not quite this one. Let's try this other idea." Is that also something prevalent in your organization?

FIONA TAN: Yeah, it is. It is actually quite prevalent, and this is the part where it's really much more business-driven and pragmatic in terms of our application. And so, to some degree, there's probably someone out there who might say, "Hey, look, that's not pure AI or pure ML, because you are involving this human or that human." But in our case, it doesn't really matter. We're trying to achieve a strong outcome from a business perspective. The way we've thought about it is, sometimes we use the automation and AI/ML to narrow down the choices. So we do some of the work initially, and then we narrow it down to, say, maybe five, six, whatever it is -- a smaller number -- so that we can bring the experts, the humans, in to make that final decision. So quality is one of them. The other is style -- that's something that's always a little tricky to be able to get right. If we can narrow it down, it just makes the human part a lot easier as well, but then also very valuable, because some of those [decisions] are sometimes pretty nuanced. I'm actually not a style expert, so I probably couldn't tell you the difference between, like, two, three styles, and there's a lot of places where they cross over, etc., right? And so that's when you have an expert. And we do have design experts on staff who can help us with some of those definitions. And then in our space, things change. Styles change -- what's in, what's not, and all that. So, again, having the ability to bring in humans in the loop is super interesting and helpful to us.
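A minimal sketch of that narrowing pattern, assuming model scores already exist for each candidate; the scores, item names, and the stand-in "expert" function are hypothetical, not Wayfair's pipeline.

```python
def narrow_then_review(candidates: dict[str, float],
                       expert_pick,
                       shortlist_size: int = 5) -> str:
    """Model scores narrow the field; a human expert makes the final call.

    candidates:  item -> model score (e.g., predicted style or quality match)
    expert_pick: stand-in for the design expert reviewing the shortlist
    """
    shortlist = sorted(candidates, key=candidates.get, reverse=True)[:shortlist_size]
    return expert_pick(shortlist)  # the expert chooses from a handful, not millions

# Toy usage: the "expert" here is just a function; in practice it is a review queue or UI.
scores = {"item_a": 0.91, "item_b": 0.88, "item_c": 0.40, "item_d": 0.86, "item_e": 0.12}
print(narrow_then_review(scores, expert_pick=lambda short: short[0], shortlist_size=3))
```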
SAM RANSBOTHAM: And people, as you mentioned at the very beginning, may not even know that you're using AI or using machine learning. Shervin and I are kind of fascinated with this at the moment, because it seems like there's a whole lot of uses where people think, once it's out there and you can actually do it practically, it couldn't possibly be artificial intelligence, because that is a mythical being -- but once you can do it, well, then it seems normal. You mentioned how widespread use of it is throughout your organization, and actually -- how many people in your organization would say they're using AI?

FIONA TAN: Yeah, I think that's the part around how we try to build it into the fabric of the technology organization. So we have a data science team, but they work very closely with the software engineers. We want to, again, even within the technology organization, make sure that the scientists who are building the models are building something that's actually production worthy. You want to make sure that the teams are well integrated ... so even if you have a software engineer, maybe they're not a data scientist, but they work very closely with them and they understand what the needs are. And that's what we found to be successful.

SAM RANSBOTHAM: So enough of this pragmatic stuff, though.

FIONA TAN: [Laughs.]
SAM RANSBOTHAM: I mean, you've got ML and AI throughout the organization; you're using it lots of places. What's next? What are you excited about? What's the fun thing that's coming up next?

FIONA TAN: There's a bunch of things that we're trying to do as well. We're also looking at innovations in terms of incorporating other tactile-type capabilities. We have a small group that plays around with looking out for technology, whether it's advancements from mobile apps and the native capabilities of the devices that will allow us to do more. It's looking forward toward embedding more augmented reality into our shopping experience, for example. One of the things that we looked at also was some technology out there that would allow us to get you to almost "feel" the thing that you are trying to buy. The other thing, too, is, because we are heavily invested in imagery -- imagery is a big part of what sells in the home category, and we have a lot of 3D models, etc., for a lot of the items that we sell -- how do we then potentially create a digital twin of your home, for example, in the cloud, so that you can almost furnish your home virtually to match what you have in real life? And you can use that to influence what you're buying in real life. Or maybe that's your home in the metaverse, and you're going to furnish it a completely different way. There's a lot of really interesting technology and concepts out there that we are trying to keep abreast of while we're continuing to be practical and pragmatic, but yes.

SHERVIN KHODABANDEH: A good portfolio of high risk and high reward, and practical stuff. It's very good.

SAM RANSBOTHAM: The haptic things you mentioned seem particularly interesting. We do focus a lot on visual. We've made so many advances on visual and sound.

FIONA TAN: Yeah, but not so much feel.
SAM RANSBOTHAM: What's the Pantone color set equivalent of haptic or touch? It seems like if we had some of those sorts of things, where I could have an array at home, and I could touch these four things, and this is what this couch feels like, I feel like that's kind of interesting. I'm not sure if I want to go there with smell, because I'm not sure if I want that Pantone array of smells in my home, but it's exciting to see that you're thinking about these, let's say, nontraditional senses beyond the first two primary ones that we tend to focus on.

FIONA TAN: Yeah. Yeah.

SAM RANSBOTHAM: We have a segment where we ask you a series of rapid-fire questions, and you're just supposed to say the first thing that comes to your mind.

FIONA TAN: OK. [Laughs.]

SAM RANSBOTHAM: Are you ready?

FIONA TAN: Yeah. OK. We'll try.

SAM RANSBOTHAM: What's your proudest AI moment?

FIONA TAN: I think one of the ones that I'm most proud of is, as we built out the AI capabilities across different functions, we have one particular capability that we're now building, which is what we call geo-sort. Basically, it allows us to take advantage of the capabilities that we have that are foundational -- the understanding of a product, the understanding of the customer -- and then being able to take that, and then we also factor in where products are located. Basically, we have a sort order based on my understanding of your intent, my best understanding of all the products that we have, and then we look at where the product is located, what it costs to ship [it] for you, and then we do another round of optimization around that. In a way, the reason why I'm proud of it is, because we built the foundational capabilities, we can now deliver second-order solutions on top of that. And that's very specific to us, but I'm sure a lot of companies are at that point, hopefully, too, where they have foundational capabilities, and they now figure out, "Oh, there's a second-order solution I can now devise because I've laid the groundwork."
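Here is a minimal sketch of that kind of second-order re-ranking, assuming a base relevance score already exists per item; the cost weight, scores, and item names are invented for illustration and are not Wayfair's geo-sort implementation.

```python
def geo_sort(relevance: dict[str, float],
             ship_cost: dict[str, float],
             cost_weight: float = 0.05) -> list[str]:
    """Re-rank items: start from relevance (intent + product understanding),
    then penalize items that are expensive to ship to this customer."""
    adjusted = {item: relevance[item] - cost_weight * ship_cost.get(item, 0.0)
                for item in relevance}
    return sorted(adjusted, key=adjusted.get, reverse=True)

# Toy usage: two similarly relevant couches; the one near a warehouse wins after the re-rank.
relevance = {"couch_near_warehouse": 0.80, "couch_far_warehouse": 0.82, "lamp": 0.30}
ship_cost = {"couch_near_warehouse": 1.0, "couch_far_warehouse": 6.0, "lamp": 0.5}
print(geo_sort(relevance, ship_cost))
```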
SAM RANSBOTHAM: Yeah, you spend a lot of time and effort on those foundations, and getting to use that foundation seems fun, because some of the foundation may be in the suffering category of getting your data house in order and getting things ready for those next things.

FIONA TAN: Yeah.

SAM RANSBOTHAM: OK. What worries you about AI?

FIONA TAN: In our application, I would say, it goes back to the whole risk category that we talked about. We feel good about the way that we are using AI. I don't think we are anywhere close to the boundaries of where we start to worry, and part of it is just around how we'll be looking at how people are using it, but it's all anonymized. We're trying to figure out trends. For example, when we do marketing, we look at what channels are successful, but it's not going into the details of who bought where. But [it's] something that I think in general people do need to think about in terms of how you're using the data that you have and making sure that it's at the aggregate [level], and how do you make sure that it continues to be so?

SAM RANSBOTHAM: What's your favorite activity that does not involve technology?
FIONA TAN: I have two current favorite activities. I'm learning how to golf, and I think that's going to be a lifelong endeavor, because it seems like it's very hard. And I enjoy cooking. It's funny, because I approach cooking the same way that I do technology. I'm always optimizing. So I never follow one recipe; I pick out the best parts of, like, six different recipes, and it's funny because people ask me, "Well, which one did you follow?" I'm like, "Ah, it's actually very complicated. Let me explain. You do this and you do that and you trade off ..." It's how I think, so that's how I cook and how I bake as well.

SAM RANSBOTHAM: It's an ensemble model approach, right?

FIONA TAN: Yeah.

SAM RANSBOTHAM: It's just like a random forest. You just reach into the bag -- you're grabbing another selection out and building an ensemble recipe.

FIONA TAN: Exactly.

SAM RANSBOTHAM: What's the first career that you wanted? What did you want to be when you grew up?

FIONA TAN: My first career -- I wanted to be a vet. Isn't that what most children want to be initially?

SAM RANSBOTHAM: Yeah, and then you took your first computer science class and everything changed.

FIONA TAN: Yeah, exactly. And then I was hooked.

SAM RANSBOTHAM: What's your greatest wish for the future of AI? What do you hope we're going to gain from artificial intelligence?

FIONA TAN: I hope that we can continue to use it and make just really good practical applications of it. I think there are so many, and obviously we're using it in a commerce and retail arena, [but there are] a lot of use cases where we can help with understanding health care, etc. There are just so many applications of it. I'd love for it to just be prevalent and for folks to continue to practice it, and, again, it's looking at the data and helping us understand things that we might not have understood just from an analytical perspective. I think that's the part around the AI part of it: it may not be things that we might think of ourselves, but it's looking for solutions in a very novel way.
SAM RANSBOTHAM: Well, Fiona, thank you for taking the time to talk with us. I think a lot of the things you said about pragmatic approaches and [being] careful about risk are things that apply in lots of different places, even if you're not digital first and physical second. I think those things apply to lots of people, and I think people will learn from that. Thank you for taking the time to talk with us today.

FIONA TAN: Yeah, thank you for having me. I enjoyed it.

SAM RANSBOTHAM: That's a wrap on Season 5. Thanks for listening. We'll be back early next year with new episodes. In the meantime, please follow Me, Myself, and AI on LinkedIn to stay up to date and to be the first to hear about bonus episodes and other content.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for listeners like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.