SAM RANSBOTHAM: What do Lego have to do with how PayPal thinks about AI? Find out on today's episode.

KHATEREH KHODAVIRDI: I'm Khatereh Khodavirdi from PayPal, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of analytics at Boston College. I'm also the AI and business strategy guest editor at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build and to deploy and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today, Shervin and I are talking with Khatereh Khodavirdi. She's the senior director of data science at PayPal. Khatereh, thanks for joining us. Welcome.

KHATEREH KHODAVIRDI: Thanks for having me. Great to be here.

SAM RANSBOTHAM: You build and oversee giant data science teams for the many different entities within PayPal. A lot of people know PayPal, but probably some people don't realize the extent of all the PayPal activities, so maybe let's start there. Tell us a bit about what PayPal does and what all these different subentities do as well. How are they connected?

KHATEREH KHODAVIRDI: Yes. Obviously, a majority of the people know PayPal through PayPal Checkout, which is a core product that we have, and the company started from that product. But we have a wealth of different products, especially on the consumer and merchant side.
We have whole suites of products to provide people on the merchant side the capability to run their business, from invoicing [to] having the offline capability for financial services and then the online transactions, and on the consumer side, everything from peer-to-peer payments to financial services to different types of credit and "buy now, pay later" capability, and savings accounts. I can go on and on, but on the consumer side, we have a wealth of different products. Over time, we also acquired many other companies to help us accelerate and also be incremental in terms of the value chain that we are creating for the consumer and merchant. For example, on the consumer side, a couple of years ago, we acquired a company -- its name is Honey -- which actually helps you find the best deal on the internet when you're shopping online.

I started at PayPal on the consumer side, so I was helping the small-business group, helping accelerate and solve the problems for our customers through data and data science capability. And then, over time, I supported all the merchant side of our equation -- all the enterprise merchants, channel partners, our relationships with Shopify, WooCommerce, Magento -- programmatically bringing merchants to us. And then, a couple of months ago, I actually switched completely to the consumer side of the house. So I'm new on the consumer side but super excited, because you can leverage a lot of AI and data science capability to solve a lot of interesting problems in this domain.

SHERVIN KHODABANDEH: That's great. So obviously, PayPal is a big, multifaceted company with different businesses and subdivisions, and you talked about those. But since you're in the consumer group, let's talk about that. Share with us a bit how AI is being used in the consumer business to drive the themes you want with your consumers, and what use cases AI is being used for.

KHATEREH KHODAVIRDI: We can actually bring all the different accounts together, so we have a point of view around the user's interactions with us.
For example, if Shervin has a relationship with Honey with us, but at the same time he's using peer-to-peer or the Checkout product or our credit capability, how can we make sure that we can actually look at Shervin's relationship with PayPal through one lens? So that is, I would say, one of the fundamental problems we are trying to solve as a company. But the piece I'm personally super excited about is understanding the customer journey with us and how we can leverage personalization and AI to actually solve customer use cases -- that is, "What are the jobs they are coming to PayPal to get done?" and "How can we show them the personalized and relevant messages to help them?"

So think about it: Internally, we always said, "Hey, there are a bunch of happy paths with PayPal, and there are a bunch of sad paths with PayPal. How can we migrate people from one happy path to a happier path with us, and how can we avoid the sad path with the customer?" And I truly believe that this is basically an AI capability that we need to develop for customers. If you build that, it will unlock a humongous amount of value for our consumers and for us as a company.

SHERVIN KHODABANDEH: Let's talk about some of those paths. You said, "What kind of jobs are people trying to get done?" so let's talk about that. Tell us how personalization could be helpful there.

KHATEREH KHODAVIRDI: Think about it this way: Imagine Shervin is one of the people who is actually using PayPal Checkout to shop with different merchants across the internet. As you can imagine, once we acquired Honey, we also had a whole wealth of coupons that are [available] out there, and over time, if we get a better sense of what type of categories or what type of merchant Shervin is interested in, we can actually show Shervin the right deals at the right time. For example, it's back-to-school time, and we know that historically, at this time, you shop in this type of category.
Currently, you know, Target or Walmart or some of the top merchants for back-to-school are running these deals, and then we remind you and show you relevant and personalized deals to actually drive activity with us. Or you might use Venmo on a weekly or biweekly basis to send some money to somebody who cleans your house. We can remind you that, "Hey, Shervin, it looks like you're doing this," and with just one click, you go and do that. So basically, we become part of Shervin's life and understand what type of activity Shervin is trying to do and make it much easier for him.

The other part that we are all super excited about is tracking. When you shop online at different retailers, one part of it is, you want to track that order and see where your order is and when you are getting it. We also want to make that much easier -- you can actually get the notification and see where your order is, and you can look at your whole commerce activity and financial activity in one place.

I would say, for me, the other part is that, yes, I'm in a data function and we build models, and I look at the outcomes of our models, but the more important part for me is -- I always call it qualitative and quantitative. I would say, "Hey, I want a sample of some of the customers in this group. I want to understand their journey, the activity they had with us," or, you know, actually attend the user research studies that we are doing with customers, so firsthand, I actually hear the challenges and the problems we are trying to solve. Because for me, the most important part is that I just cannot go and build the model in isolation and solve all the problems. It's really doing the qualitative and quantitative aspects and learning from each other and improving it over time.

SHERVIN KHODABANDEH: That's very helpful.
I have to imagine that, being more of a tech company than, let's say, a financial services company, some of these challenges are actually a lot easier for you to deal with than they would be, let's say, for a gigantic global bank that's trying to personalize across products and business lines and all that. Share with us some of the challenges. What's difficult about what you need to do?

KHATEREH KHODAVIRDI: I think one of the biggest challenges we have is that each of the companies we acquired over time is using a different data stack, so the task is basically migrating all of them to one data lake, having one kind of technology to use across the board, being able to [build] that common data-layer platform end to end, and understanding all the touch points and all the data points we have with the customer. So having that common data technology platform is one of the common challenges we have internally.

SAM RANSBOTHAM: That's huge, I think, for everybody. I remember my first experience with this. Back in my past life, I used to work at the United Nations, and I was working with databases, and I looked down and we had literally dozens of databases we were supporting, and I asked, "Well, which is the standard?" And they said, "Well, this is the standard." And I said, "Well, what are all the rest of them?" "Well, those were the standards then." And so you're in that same situation when you're acquiring companies, where you're pulling together lots of technology stacks [but] you don't want to just rip them out and start over. How do you manage that process? How do you get those into a cohesive data science process?

KHATEREH KHODAVIRDI: Yeah, so, I would say, Sam, you brought up a very good question as well -- it's not only the technology aspect of the problem.
I call it the data governance aspect of it as well: what the definitions are, how different people define and look at different things differently, and how you can define that common language across the board internally within the company. That is why we are trying internally to develop best practices so that, for every new company we bring in, there are steps one through the end that we go through to incorporate it as part of the rest of the data assets we have for the company. But you can imagine it's not an easy exercise. It's a humongous task. But that data governance aspect of it is also very important, because when you are bringing different components together, you need to take a step back and look at the definitions from a different lens as well and see if those definitions are still relevant in the new construct or not.

SHERVIN KHODABANDEH: Let's talk about teams a little bit. You're talking about a series of challenges with personalization, with other use cases, and then the capabilities that are required to get that done, from data coming together and identity resolution and many, many other things. Tell us about the team. Clearly, they have to have very strong technical capabilities, but what else do they need to have to work in that kind of an environment?

KHATEREH KHODAVIRDI: I would say this is one of the areas that is so multidisciplinary, and you can imagine that the different types of problems you want to solve need different types of skill sets and different types of talent. So I always say that in data science, AI, or the overall data field, one of the things that is really important is diversity of talent. And by diversity of talent, I don't only mean diversity of gender or background, which is very important, but diversity of thought leadership, diversity of problem-solving, diversity of technical skill sets, because at the different stages of the problem, basically, you need to exercise a different type of muscle in order to get the desired outcome.
So, for example, one of the areas that I really feel [is] underappreciated in the data world is business acumen -- people who can actually tackle the problem through a very structured framework and be able to synthesize the recommendation and the "so what" for the business. Because the worst thing that can happen is that you look at your data team and you feel like they are building a bunch of black boxes for the rest of the organization, and you're not investing in the last mile -- having people actually translate the "what" and the "why" and the "so what" to the business group, to the product group, and to the rest of the organization. Then you would not get the adoption that you are hoping for from the capabilities that you are building.

So the way I'm looking at it is that we are actually also building a product organization that supports personalization and AI, because like any other product development cycle, you basically have a product strategy behind this -- we are not just building an AI model to solve specific use cases. We actually take a step back, understand our consumer personas -- what are the jobs they are coming to us to get done? -- and build a product road map and product vision around this, and tackle this problem in a cross-functional fashion instead of just internally within the data group.

SAM RANSBOTHAM: Is it harder to get that last mile with AI-type projects? Is that something that people have a harder time understanding? Is it something that is harder for people to relate to?

KHATEREH KHODAVIRDI: It might be a little bit harder because of just the scale of the problems that you are trying to solve here, because people cannot relate to it when you're talking about millions of customers. So for me, it's how I can actually break down the problem and solve smaller use cases through AI to create that adoption and championship in the organization [so] that it will help me to solve the biggest problem. It's a humongous task, because I'm not only talking about the product touch points that we have.
I'm talking about every touch point that the customers have with us, whether through customer service, through risk -- through all the different functions within the company. Rallying all of those different functions around solving this problem would be much harder, so my approach is that I will first start with solving it within the product organization: understanding all the touch points with all the products, understanding the product tasks, and personalizing that component. And then you can add an additional layer -- like bringing in the risk component -- with each of the product components, and then you add customer service. I would think about it more as a Legoland -- at the end of the day, we will have an AI Legoland for PayPal. But right now, the way I am attacking this problem is to build each of the individual Lego pieces, with the hope that I can orchestrate them and build a Legoland, and it won't become a bunch of separate Lego pieces that are not orchestrated to solve the common equation.

SHERVIN KHODABANDEH: I love the Lego analogy, because my kids are totally into Lego, and we have probably like 900,000 different pieces of Lego going on at any given time. And if you just look at it in isolation, you think, "OK, this is all we're doing." But of course, you've got to start there, and then the pieces come together. So then my question to you is, as with Lego, when I see my 12-year-old or eight-year-old building stuff, and I'm looking at it in isolation, I might not have a full sense of the vision [of what] the whole thing is going to be. So I might say, "Oh, this is nothing," or "What are you building? It's like a small piece. Didn't you do something like this before?" And then they bring out the box with, like, 8,000 pieces that's going to look like this. And then I go, "Uh-huh." How are you doing the big "uh-huh" here at PayPal so people don't lose sight of the big vision and don't get myopic about the little things that it takes to get there?
KHATEREH KHODAVIRDI: You can imagine there is no shortage of individual use cases -- there are many individual AI, or whatever you name it, capabilities within the organization. But when you take a step back, you do not have that guiding principle to see how they can help you actually build that Legoland. And actually, I want to tackle this problem in reverse order: First, I want to take a step back and say, "Hey, what will the blue sky look like in terms of the AI capability and personalization for us?" So it's building that product strategy and vision around it, then trying to solve backward, breaking it down into smaller component Lego pieces, and being very prescriptive about what key problems each of these Lego pieces will try to solve and why we are building each piece of the Legoland -- what the "so what" to the organization will be. Then you can actually build something, because if you show that whole vision to everybody, it might be a little bit too much for some of the people to absorb, so it might really slow down your progress in the organization. [At the same time,] when you show the bigger vision, you have something to rally the whole organization around. But you can also break it down into more tangible components so you can start making progress while you're actually keeping that energy and enthusiasm around the organization toward the north star that you have.

SHERVIN KHODABANDEH: The breaking down actually is quite critical. To build on the Lego analogy, usually you get these 5,000, 6,000 pieces of Lego, and they come in like 20, 30, 50 boxes or little bags, and so you first do this, and then you do that, but then you have the whole thing. Well, I got one two days ago that had 3,000 pieces in 20 different bags, but all the bags are unlabeled, so you don't know what goes with what.
And so then what we have is, like, 3,000 pieces, and we're trying to build a piece, and the analogy I'm trying to draw here is, as you're building these little capabilities that then come together and get stitched together to support your bigger vision, how do you make sure that these pieces are actually connecting, rather than the organization looking around to find what piece goes where, or how does this connect, or do I have eight of these instead of five of those? How do you avoid that? And that's, by the way, something that goes on in a lot of other organizations, where there are silos of folks building things, and they don't all come together. How do you approach that?

KHATEREH KHODAVIRDI: You brought up a very good question, Shervin. Basically, hey, I cannot jump in one step from that vision and product strategy for AI and personalization to the tactical components of the puzzle that you have. But once you build that mental model around the common themes of the problem you are trying to solve, then I would say the other reality is that I cannot solve all these problems by myself, nor can a handful of people solve these problems in the organization. You need to create that culture, and you need to rally your organization around this.

SAM RANSBOTHAM: That makes sense, because I like the Lego analogy, but the world isn't as simple as Lego. You don't have that perfectly labeled bag that you know will fit together in the end. You've got to bring those people along to pull that in. One of the things that we did early in the pandemic was sort our giant bins of Lego, and when you have those Lego pieces that are not in the little bags, it's practically impossible to find the right piece to pull it back together. Shervin, it reminds me of when we were talking to [Arti Zeighami at H&M](https://link.chtbl.com/lplAIPMe), and he was talking about working with individual pieces --

SHERVIN KHODABANDEH: On the wheel, right?
SAM RANSBOTHAM: His analogy there was tightening each lug nut a little bit as you go around, and there's some logic here, too. What's different about Lego is that you could finish one bag and move to the next bag, but in reality, you've got a lot of people working on lots of different bags who are moving at different paces.

KHATEREH KHODAVIRDI: No, totally, Sam, and I would say that is why, in my view, in order to make progress on such a complex topic, it's very important to take kind of a top-down and bottom-up approach, and you just need to have that check-in on a regular basis. The top-down I would call your blue-sky strategy around where you want to be with AI, and the bottom-up is mainly the tactical pieces of Lego that already exist in the organization, or the different components that different teams are building, and how you can actually bring the two together. And you're absolutely right that the speed of development and making progress in some parts is more difficult than in others. So you also have to coordinate the different components together, so that at the same time as you are making progress, you can rally the organization around that, but also be realistic that some other parts are more complex and will take more time.

SHERVIN KHODABANDEH: And the one thing that I think companies have that Lego pieces don't is they still have P&Ls and targets and numbers. And so I think that's probably why it's such a nuanced approach, as you were saying, KK -- you've got to figure out where the most practical path is for your organization, given all the players and all the stakeholders and all the pieces. And maybe for Arti, it was a bit "one lug nut at a time" to get the whole thing going, and maybe here it is like, "No, we've got to get personalization perfect before we move on to risk or pricing." And I think that's the nuance of different organizations a little bit.
KHATEREH KHODAVIRDI: Yeah, and I would say, Shervin, you brought up a really good point as well: At the end of the day, all organizations are very value-focused, both for the customers and also for the company and the shareholders. And for me, one of the biggest mistakes people can make is just looking at the output and the outcome of the P&L, [whereas] for such big projects like this, like personalization, the reality is, it will definitely take some time for you to actually get the true benefit of this from the output standpoint of your P&L. But what are the leading indicators and the KPIs you can have to actually keep the team and the organization accountable for making progress, so you make sure that you are moving in the right direction, while it will take you more time to see the whole benefit as an output in your P&L? Because the reality is that the worst thing that can happen is that all of us know that this is absolutely the right thing to do for the company, but there is no magic -- it is all this good work, done consistently over a long period of time. There is no magic where you see the output overnight. So how can you keep the working team accountable for making progress against the leading indicators that you know will eventually get you to the outcome that you are looking for?

SHERVIN KHODABANDEH: For sure, right? As you're saying, the worst possible thing would be to set a big outcome goal for the vision, but on the wrong timeline. KK, this has been quite insightful. I think, Sam, we should move to the rapid-fire question segment. So this is a segment we do, KK, where we ask you a bunch of questions in rapid-fire style, and please tell us what comes to your mind first.

KHATEREH KHODAVIRDI: Sounds good.

SHERVIN KHODABANDEH: What has been your proudest AI moment?
KHATEREH KHODAVIRDI: My proudest AI moment goes back to grad school, when I was at Carnegie Mellon, before this field was this much in demand. As part of my graduate research, I was building capabilities and platforms for energy management -- smart energy management for residential buildings. That was the proudest AI moment of my life.

SHERVIN KHODABANDEH: Cool. What's your favorite activity that involves no technology?

KHATEREH KHODAVIRDI: Playing tennis, because it really helps me to focus on the moment.

SHERVIN KHODABANDEH: What was the first career you wanted? What'd you want to be when you grew up?

KHATEREH KHODAVIRDI: Probably, for my first career, I wanted to become a professor or a pilot. I don't exactly remember which one came first, because in our family, education is a big piece, and my mom actually had a career in education, but probably either a professor or a pilot.

SHERVIN KHODABANDEH: What worries you about AI?

KHATEREH KHODAVIRDI: I would say there have been a lot of conversations about responsible AI and bias, and I chatted about this earlier -- that it's qualitative and quantitative as well. I think it would be a huge mistake to assume that AI can solve basically all the problems without having the right checks and balances in place.

SHERVIN KHODABANDEH: What is your greatest wish for AI in the future?

KHATEREH KHODAVIRDI: I would say there are a lot of challenges that humanity is facing, from climate change to a bunch of other things, so I really hope that more and more people actually play a role in using AI to solve those problems. A lot of my colleagues have actually started investing more of their time and energy there, and I really hope that eventually in my career, I can play a role as well.

SHERVIN KHODABANDEH: Great. Thank you very much.
SAM RANSBOTHAM: So, KK, I think that, obviously, the Lego analogy is going to be interesting for people, but I also like what you were saying about things like governance, which I think are fundamentally important here, and the idea that you would come on and mention the importance of things like governance in getting you to the scale that you need -- I think that's something that maybe is more widespread or universal. Thanks for taking the time to talk with us today. We really appreciate it. Thanks for joining us.

KHATEREH KHODAVIRDI: Thank you guys so much for having me.

SAM RANSBOTHAM: Thanks for listening. Join us next time, when we talk with Fiona Tan, chief technology officer at Wayfair. Please join us.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for listeners like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.