SAM RANSBOTHAM: Launching a new AI initiative is quite different from launching a new technology platform like an ERP. Find out the key differences on today's episode.

NITZAN MEKEL-BOBROV: I'm Nitzan Mekel-Bobrov from eBay, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Ideas program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I also colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for six years now, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build and deploy and scale AI capabilities across the organization and really transform the way organizations operate.

SAM RANSBOTHAM: Shervin and I are excited today to be talking with Nitzan Mekel-Bobrov, the chief AI officer at eBay. Nitzan, thanks for taking the time to talk with us. Welcome.

NITZAN MEKEL-BOBROV: Thank you so much. I'm excited to be here.

SAM RANSBOTHAM: You've got a relatively new position, within the last year or so. Can you tell us what your role is at eBay?

NITZAN MEKEL-BOBROV: Sure. Before me, my predecessors, so to speak, have all been chief AI scientists, and the change to a chief AI officer was actually a strategic one, with the recognition that AI is more than just the machine learning models -- that AI is the engineering that it takes to "productionize" those models at scale and, of course, the business impact and business use cases. We really think of AI as an end-to-end experience.

SAM RANSBOTHAM: All right; so eBay's pretty excited about this. What is eBay hoping to gain from this?
NITZAN MEKEL-BOBROV: I think [it's] opportunities in every facet of our business, as probably most companies our size would say as well. I think for us, probably what we're most excited about, and the reason that I joined eBay late last year -- what made me excited -- is the ability to create AI-led tools, essentially, that we put into the hands of our customers, both buyers and sellers, to create their own experiences that they share with each other. It's not just about us using AI to build things, but rather us building AI tooling to enable our buyers and sellers to build things, and I think that's really exciting.

SAM RANSBOTHAM: Give us an example of one of those.

NITZAN MEKEL-BOBROV: Most recently, we rolled out our new 3D visualization experience, which is using computer vision on the back end to do the processing and some of the rendering. It's not about us using this technology to create 3D visualizations, but rather, we're enabling, through our mobile app, our sellers to create visualizations of their own items at scale, and in a sort of super easy way that doesn't require professional equipment.

SAM RANSBOTHAM: What do you think that eBay's doing uniquely in artificial intelligence?

NITZAN MEKEL-BOBROV: I'll start with the approach, and then I'll get into an example or two. From an approach perspective, I think we're going at it in a unique way because we're a two-sided marketplace. We worry about buyers and sellers, and so much of our attention is on building capabilities for our customers to use versus building experiences ourselves directly. Building tools for our customers to build experiences I do think is unique. It's unique also from a technology perspective, because there's a different level of resilience that's needed, and you have to test a far greater number of ways in which it can fail when you're not actually building the experience yourself; you're putting tools in the hands of others to do [it]. So it's almost like we are software as a service within an e-commerce company, right?
So that's sort of one aspect of it. In terms of specific areas of focus that are maybe unique, we are [doubling down] now on visual experiences. We can say "computer vision," but I think of it a little bit more broadly than computer vision as an AI approach. I'm thinking of it as intelligent visual experiences that are immersive, interactive, adaptive, etc. 3D is sort of the tip of the iceberg, but as we get deeper into the "metaverse" and deeper and deeper into ways in which our digital platform is more than just a place to do the commerce itself, we think that visual experiences in real time, live interaction between people, [and] ways of visualizing products that feel like they're in your hands versus just being on the screen [are] something that will be transformational for eBay.

SAM RANSBOTHAM: You've got a very interesting background: Booking.com, Capital One, Hearst, Boston Scientific. Maybe you can tell us a little bit about how you ended up where you are. And one of the things that I want to harp on, perhaps, is that a lot of our guests tend to focus on the artificial part of artificial intelligence, but your background is actually in the intelligence part, like the actual human intelligence, with your research on brains and how brains evolve, so I think that's a fascinating, different angle. You're coming much more from the intelligence part than from the technology part. How did you get to where you are, and how did you learn these lessons?

NITZAN MEKEL-BOBROV: It's interesting. I actually was planning on being a lab biologist or geneticist. The problem was that I was terrible at the bench. Anything that I needed to use my hands for never worked. None of my experiments worked; everything was a flop. So I quickly learned, "Well, I'm good with coding and computer algorithms; maybe I'll focus on more theoretical aspects of biology." And then I got immersed in the world of neuroscience and computational genomics, and that's, by the way, how I got introduced to neural networks.
It's interesting because [for] me and all my peers, that's how we entered what ended up being machine learning applications in business, and I had no awareness that engineering, essentially, and computer science were playing such a big role on the other side of a somewhat academic fence. To me, it seemed like a very obvious progression. As I was working on modeling how real brains, so to speak, work and how human intelligence works, moving from there into artificial intelligence seemed like a very natural progression, and I wasn't alone in this. But then I got introduced to this whole other set of folks coming at it from the other end.

The progression -- you're seeing me jumping around from one industry to another -- isn't accidental. I spent the first half of my career in health care because it was very natural. I did my graduate work in, essentially, machine learning methods for genomic analysis, because it was the human genome era, etc., and so I stayed in health care. But at some point, I really wanted to see how AI can be used in other industries, and so I fairly purposefully moved from financial services to online travel to e-commerce.

SHERVIN KHODABANDEH: As you have traversed the wide array of sectors and industries, what did you find was one of the biggest hurdles in getting AI at scale in these organizations?

NITZAN MEKEL-BOBROV: Always people; they always get in the way.

SAM RANSBOTHAM: [Laughs.] Spoken like a true engineer.

NITZAN MEKEL-BOBROV: It's hard to get a whole group of people with different incentives to coordinate together in a way that is needed. Doing AI at scale, and in a way that could drive transformational value, does require a much broader set of players playing nicely together. And while typically everyone is on board that it's the right answer, the prioritization of that versus the very immediate-term business objectives is what typically ends up faltering.
SAM RANSBOTHAM: Part of your background that I was interested in is that one of your dissertation findings was that the human brain is still evolving and getting smarter. I think you may be in a unique position to compare, but I assume that artificial intelligence is also getting smarter. Tell us a little about what you see happening in terms of these rates. Are the machines getting smarter faster than humans are getting smarter, or are humans still continuing to outpace? I'm kind of curious what your perspective is on those two things.

NITZAN MEKEL-BOBROV: You really dusted off some old things when you were looking at my background. I appreciate you digging. I think there are two things I would pick up there that I do find interesting on a sort of philosophical level, almost. The first one is that when we look back in history, both recent and longer term, we look at these macro changes, and then we assume that because those happen over long periods of time, they're almost episodic -- like it's not something that you would observe day to day. But as any geneticist would tell you, evolution is just population genetics happening over a longer time scale. It's not as if it's something that is episodic; it's something that's actually continuous. In that sense, I think it's not surprising that humans have continued to evolve and are continuing to evolve. It's just the nature of biology.

What's happened in the most recent history of the human species is that the variation in signals and the speed of that variation being introduced is just accelerating massively, and because of that, we're able to adapt faster and faster and faster and in that sense become "smarter" -- maybe "more adaptive" is a better way [to say this]. And I think it's the same with what's happening with technology now.
There's a lot of discussion about children being exposed to digital technologies, etc., and the rate at which technology is changing, so the signals that people are getting -- the variation in signals -- is just getting more and more and more, and so people's brains are becoming increasingly adaptive. There's somewhat of an analogy with AI there, where, really, the amount of data -- the variation in signals that we're feeding our models -- is continuously growing exponentially, and so of course the models are becoming more and more flexible and adaptive as well.

SHERVIN KHODABANDEH: Nitzan, you've been pretty vocal about AI not being just a fancy technology or a series of fancy algorithms just because they're cool and awesome.

SAM RANSBOTHAM: Which they are.

SHERVIN KHODABANDEH: There has to be a purpose for it; there has to be a real need for it. Comment more about that meta framing of AI strategy and its alignment with business and corporate strategy at eBay or, in general, the purpose of AI, because I know you've been pretty vocal about that.

NITZAN MEKEL-BOBROV: Yeah. I've probably been pretty vocal because I learned it the hard way, getting somewhat burned in [the] earlier years of my career coming into a new company and thinking I knew it all -- that I knew it better than everyone because I understood the technology and what's going on under the hood. And I learned the hard way, as I say, that it's a lot more nuanced than brute-forcing a technology onto a theoretical use case that you might think of. It's really important to understand the business context in which these technologies are deployed and understand it deeply.

So, for example, early on in my career, when I was in financial services, we were using AI for a lot of automation -- workflow automation. There are huge amounts of savings in this -- hundreds of millions of dollars a year.
And to me, it felt obvious that certain applications -- for example, with speech recognition and intent detection, etc., in the call center -- would be an ideal fit. But it's only after I actually shadowed a number of agents and spent probably about a good month deep-diving into their workflows that I understood that there's so much complexity there; that it's really about the interaction between the human intelligence and the machine intelligence, and [I] was making assumptions that just weren't going to bear out in real life. So it's that move from what works in the lab to what works in real life that is really critical.

And then, of course, thinking about what's important to the business is not just a matter of what's important today but really what's in the DNA of the company, because it takes time to not just build but deploy and get these things up and running. I don't know any one of my peers who's ever been able to get anything up and running at scale in a matter of probably less than a year [before] you see real impact, and honestly, it's often quite a bit longer than that. It's a marathon, not a sprint, and so you really have to be conscious of what will be the business strategy two, three, four years down the road, not just what is on the executive's mind today.

SHERVIN KHODABANDEH: I want to pick up on something you said about automation, which is one of the larger themes that AI's being used [for], but I think you also alluded to it. I think it's unfortunate that when most people think about AI, they tend to think of it as the extreme case of "It's going to replace humans," and you were talking about the importance of what I call the middle ground, where human and AI work together, so that human-AI interaction is key. Perhaps that's also one of the reasons it's so hard to scale, because you've got to figure out how humans will work differently with AI. Can you share some stories or insights about how you've done that?
Because it requires changing the mindset of humans and what they normally do.

NITZAN MEKEL-BOBROV: I can give you a couple of examples that I'm seeing at eBay, but honestly, a lot of what I'm seeing at eBay, I've seen before. The transformation stage that we're on now is one that virtually every company is on, and no one is fully there yet. So if you think on the back-office side, specifically customer service, I think that anxiety is typically most acute there, because frankly, they've seen it before with other technologies; this isn't their first rodeo. And in truth, over the course of the maturation of the technology, there are individual roles that are no longer needed. That is true. But it's not that humans aren't needed; it's just that the role that they do changes.

For example, on the customer service side, what we are doing at eBay -- we tried for the past few years to inject AI in different places of the flow. And it was very challenging from a tech-debt perspective, because there's just a lot of tech debt in different places that made it hard to do that. To make it more concrete, I'll give a real example: intent detection. A customer calls, and as they're talking to you, there is a model that's picking up on what [it thinks] the customer's trying to [ask], so it's supposed to help the agent go to the right pages for help, surface the right information. Think about that poor agent, though: He's talking to a customer who, typically, when they call, it's not because they're happy; when a customer calls customer service, it's because they're frustrated. He's trying to help them while at the same time getting these messages on the screen, and that ability to multitask and pay attention to this thing that's flashing at them with intents can be more of a distraction than a help.
What we're doing now is undertaking a more end-to-end approach, where we're really replacing or transforming customer service on our back end so that both the systems that the agents are using and the systems that the models are running on are designed together from the get-go -- so there's a much better interplay. And we also have a lot of our designers -- much more creative than our AI folks, in many ways -- helping think through what that interplay should look like, doing a lot of user research testing with agents on how they would interface with customers and technology at the same time.

SAM RANSBOTHAM: The verb you used, "inject," was interesting. You used that to say that it didn't work to inject; it didn't work to Band-Aid it on or just to paste it on.

SHERVIN KHODABANDEH: Well, another thing I wanted to pick up on, Sam, is, Nitzan, I think you made a comment like, "This is not their first rodeo with technology; they've seen other technologies." And I'm interested in your views on how AI might be or how AI is different compared to all the prior, let's say, technologies that came and transformed functions or processes, and whether you think there are any misconceptions there -- like people anchoring [on], maybe, "Oh well, this is just another ERP technology, or like we did with some other technologies; it's the same." And I wonder whether you agree with that or whether you think there's a misconception to treat it the same, and what the difference might be with anything else people might be referencing or anchoring on based on their prior experiences.

NITZAN MEKEL-BOBROV: Yeah, I think any specific piece of AI technology is comparable to an ERP system or some other version, but as a paradigm, it's a much bigger thing. I think the better analogy is the digital transformation. The change from brick-and-mortar to digital is probably more at the level of [an] analogy. Even where we are now, I would say, is a paradigm shift akin to the digital paradigm shift, much more than just the introduction of some new ERP or CRM, etc.
SAM RANSBOTHAM: Nitzan, we have a new segment where we ask our guests a series of rapid-fire questions. Just answer with the first response that comes to your mind. What's your proudest moment with artificial intelligence?

NITZAN MEKEL-BOBROV: Probably on the health care side. When I was in the Boston Scientific days, we rolled out a feature that was predictive, basically using signals from a pacemaker to predict heart failure. On average, it was 30 days in advance of when physicians would detect it otherwise, which is a huge lifesaving benefit.

SAM RANSBOTHAM: What worries you about artificial intelligence?

NITZAN MEKEL-BOBROV: Misuse. Misuse by bad actors, whether it's in military operations or in other types of nefarious activities. If you think of deepfakes as an example, just as it becomes more and more accessible and easy for everyone to use, I do have concerns.

SAM RANSBOTHAM: What's your favorite activity that involves no technology at all?

NITZAN MEKEL-BOBROV: Oh, crap. I was about to say "photography," and then I realized that's not even a good example.

SAM RANSBOTHAM: No, actually, we can count that. I mean, it's not strictly artificial intelligence, so we'll give you credit for that. What did you want to be when you were a child? What did you want to be when you grew up? AI engineer at eBay?

NITZAN MEKEL-BOBROV: I actually started as a creative writing major, so I guess writer was my initial passion.

SAM RANSBOTHAM: What's your greatest wish for what we're going to do with artificial intelligence in the future?

NITZAN MEKEL-BOBROV: I think, making it easier for almost anyone to make a living pursuing their passion.

SAM RANSBOTHAM: Nitzan, great meeting and talking with you today.
I think one thing that is going to resonate with a lot of our listeners is this idea that AI implementations are different from existing implementations like ERPs; that the siloed approach that you might take toward a monolithic technology is very different when you involve lots of users, and particularly when you involve both sides of your platform, with both your buyers and your sellers. Thanks for taking the time to talk with us. We really appreciate it; thank you.

SHERVIN KHODABANDEH: Thank you for being with us. It's been quite insightful, and we really, really appreciate it.

NITZAN MEKEL-BOBROV: Thank you, Shervin.

SAM RANSBOTHAM: Thank you for joining us today. Next time, we'll talk with Helen Lee, technical fellow and regional director at Boeing. Please join us.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.