SAM RANSBOTHAM: Most of us have used LinkedIn to search for a job or to make new professional connections, but how can AI help facilitate all the many ways users engage with LinkedIn? Find out today, when we talk with Ya Xu, LinkedIn's head of data.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the Artificial Intelligence and Business Strategy Big Idea program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I also colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for six years now, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build and deploy and scale AI capabilities across the organization and really transform the way organizations operate.

SAM RANSBOTHAM: Shervin and I are excited today to be talking with Ya Xu, head of data at LinkedIn. Ya, thanks for taking the time to join us. Welcome.

YA XU: Thank you for having me.

SAM RANSBOTHAM: Let's start with your current role. What do you do for LinkedIn?

YA XU: I am part of the engineering organization. I lead a team that's called "data." I know it's very confusing to people outside LinkedIn to realize what "data" really means, but, essentially, if you think about all the data science, AI, and privacy engineering, it's all happening in my organization. So we [work] all the way from research to production, so it's a pretty big organization that is really helping the company realize our ambition in data and AI.

SHERVIN KHODABANDEH: And what is that ambition?

YA XU: It is really using AI, using data, to help create economic opportunity for the global workforce.

SHERVIN KHODABANDEH: Comment a little bit more about that. I like, by the way, how you said, "Look, it's all about creating opportunity and economic value," and you didn't say, "It's about bringing the most advanced technologies and building sophisticated algorithms," which I'm sure is all part and parcel of what you do, but you started with a very high-level, meta framing and positioning. It's really encouraging to hear that. But, just generally, comment on what that is for LinkedIn and what role AI plays in that.

YA XU: First of all, I mean, I think that creating advanced technology -- to me, that is "how." The end goal is not just creating technology, but really, like I mentioned, creating opportunities. Maybe I would break it down a little bit. I don't know how well the audience understands LinkedIn in general. We like to think of ourselves as having three key marketplaces. We have our knowledge marketplace, where people [are] creating content on LinkedIn and people [are] consuming content and getting informed so that they can advance in their career and make the right connections to the right people. That's what we call the knowledge marketplace. We also have the talent marketplace. This is when job seekers come to LinkedIn -- and also employers and companies post jobs and recruiters find the right talent for companies. So that's our talent marketplace, which also includes our learning -- thinking about individuals or professionals who are reskilling themselves so that they can continue to advance in their career or find new opportunities. The third marketplace is really the product and services marketplace. This is where marketers come to the LinkedIn platform and then really try to identify the buyers for their products and services -- sales individuals come to LinkedIn and really try to identify the buyers as well for them.
So if you're thinking about the role that AI plays, hopefully with that context, it's very clear that obviously in the knowledge marketplace, we are really trying to match the content creators with the right content consumers. In the talent marketplace, we really just try to match the job seekers with the right companies and with the right opportunities. And then, for the product and services marketplace, we're just trying to match buyers and sellers, etc. When you put it in that context, I hope it is very straightforward to see that obviously data and AI play such an essential role, because how do you match them? It's really through the advanced technologies that we have in data and AI.

SHERVIN KHODABANDEH: So "How do you match them?" then takes us to a variety of use cases and experiences, and I'm just curious, how do you come up with those? What is the process where you've got the business side and you've got the technology and engineering and data side working together?

SAM RANSBOTHAM: Take us through the recipe.

SHERVIN KHODABANDEH: Yeah.

YA XU: [Laughs.] I would actually maybe start by getting to the tactical aspects of things. I think LinkedIn is a very unique place, because the way that I describe LinkedIn as three different marketplaces and the way that data and AI play a role, that's how our CEO describes LinkedIn. What I've described is not so much like, "Oh, this is our data or AI view of LinkedIn"; this is LinkedIn as a company's overarching view of LinkedIn. So hopefully that sets the stage for how integrated data and AI are for the mission and the vision of the company, for how the company operates, for the collaboration, which we can talk more about, for how different product development processes work and collaboration happens across different team boundaries. That's where I wanted to start, because it's just one word: It's very integrated.

SHERVIN KHODABANDEH: Yeah.

YA XU: Tactically, maybe [I'll] just talk a little bit about how we are organized.
LinkedIn is very functionally organized. So if you're looking at who sits at our CEO's table, you have, essentially, all the functional heads: head of engineering, head of product, head of sales, head of marketing, head of legal, you know, HR, head of, obviously, finance. We have a very strong muscle because of that to work cross-functionally. And so even though all the data science and AI functions are in my team, we have the muscle and the structure to enable really strong collaboration between all the other functions that we work with, in order to bring the AI solutions to production, to life, and to deliver that value to our members and customers.

So concretely, my team -- I have leaders who are focused on different areas. And as an example, I have a leader who is focusing on our consumer experience in particular, so the way that he would operate is that he would work very closely, in a very embedded fashion, with other cross-functional leaders who focus on consumer experience. So, what does that mean? They have many, many touch points, all the way from quarterly planning -- "How do you set OKRs? How do you do reviews?" When it comes to a particular initiative, then you've got functional heads all coming together and then strategizing what needs to happen, who [should] work on them -- planning our road maps. All [of this is] happening very seamlessly as a cross-functional team. And, as I said, because we as a company have always, ever since the existence of LinkedIn, operated in this way, everyone has that muscle.

SHERVIN KHODABANDEH: It's brilliant, because Sam and I have done a lot of research here and talked with companies across sectors, and it's really interesting: You said the three key [things that] have really been challenges for most who aren't getting any value from AI. You said "strategy and mission" -- that's where you started.
You said "integrated cross-functional teams," and then you said "collaboration." And, in fact, when you look at the data of the 90% of companies [that] aren't really getting as much value for their investments in AI [compared with] those 10% who are getting it, it comes down to these [same] three points. And it's interesting that LinkedIn, since its inception, has been that kind of a place, where it's been "data first" in an integrated way. [This] completely resonates, and [it is] also very refreshing that in your role as the head of engineering and data, that's also where you first go, before you go and you talk about everything else, which is all important, but it's the "how"; it's not the "what."

YA XU: Yeah, absolutely, and I always look to the leaders on my team as well. They are not just like, "Hey, I'm just a leader of AI; I'm just a leader of data." They really need to understand where the end goal is, right? The end goal is never just creating maybe the largest model or the best state of the art. It's really about delivering the value. When you understand that, and then when you've got the cross-functional team all having the same shared goal and purpose, then that also brings that collaboration to life really easily.

SHERVIN KHODABANDEH: How hard was it to put that in place and align the incentives and the organization and keep -- you said "different muscles"? How hard was it to get to that place? Was it always there? Were there some very hard decisions that had to be made?

YA XU: I have been at LinkedIn for close to nine years now. I want to say it's pretty much always [been] there, because this is how we've always been organized. The culture is so ingrained that I think the new employees will just very quickly assimilate into "This is how we operate."

SAM RANSBOTHAM: You mentioned getting better in collaboration, and I'm kind of thinking back on what you were saying about matching. You used the word "matching" in all three of your scenarios.
What risk is there that you get down this path of a bunch of engineers ever-improving the existing matching algorithms and perhaps missing out on a fourth area that you need to be focusing on? Is there a tension that you're feeling between ever-improving that matching process through better and better algorithms and data, and figuring out where to apply it in some of your newer products, like your -- I don't know -- newsletter, LinkedIn Lives, or some of those kinds of new things? Where's that tension?

YA XU: There are nonstop new areas that pop up regarding how we can innovate and should be innovating, and even just starting with matching, it's very simplistic to say, "Hey, we can just try to do better matching." But what does "better matching" even mean, right? Let's take our [talent] marketplace as an example. Obviously, we want to match the best qualified candidate with the best company. But how do you even define that?

SAM RANSBOTHAM: What is "best"?

YA XU: What is "best," not only for the job seeker, but also, what's the best for the other side of the marketplace, which is the companies? If I were trying to hire somebody, my ultimate ideal state is, I really only talk to one candidate, and that is the dream candidate I wanted to hire. And then, same thing, let's say, in the knowledge marketplace, as we try to connect the content creators to people who are interested in reading their articles: What does "the right matching" even mean there as well? Is it that we want to maximize the engagement that people have on a particular post? How do we think about the distribution of those engagements across [creators]? How do we think about -- maybe I'm a new creator, first time writing a post on LinkedIn, and I didn't get any response; I will be so discouraged and never post again. How do we think about that shorter-term trade-off and the long-term trade-off?
There's so much more, even just in this very simplistic framework of "We're just doing matching." So now, coming out of the matching aspect of it, there's again way more, like thinking about, how do we help people do content discovery? And when advertisers come to us, how do we actually help them pace their budget? How do we help them utilize their budget better?

SHERVIN KHODABANDEH: Given the progressive culture and the highly collaborative, integrated functional culture of LinkedIn, what makes a good candidate for your team? What are you looking for, in addition to the hard skills of technology and data and AI and data science? What do you think is the secret sauce there?

YA XU: I would say, number one, someone who -- I think Satya [Nadella] was the one who said that quote, "Would you want to hire a learn-it-all or a know-it-all?" I am a strong believer in a learn-it-all, and what made a candidate successful in the past doesn't necessarily mean that they will be successful in the future. But that attitude of "I'm going to learn; I'm going to adapt" -- I think that's so important. So I would say learning -- somebody who is really a big learner. And the second one is someone who is just curious. Because when you are curious, you have that drive, you have that "I'm going to get to the bottom of it." And so much of the amazing progress we've made is because someone is like, "You know what? I'm not here for all the fluffy things. I'm just going to be here to really focus on this problem that I saw that I'm just trying to figure out how to solve."

SHERVIN KHODABANDEH: You know, it reminds me -- I had a mentor who used to tell me, "There are no boring projects, only boring people." And so every time I was like, "Adam, I'm not sure I'm crazy about this project," he said, "You could make it interesting. You have the ability to learn; you have the ability to complain constructively."
So now that you're saying that, it brought back to my memory: There are only boring people, no boring projects.

YA XU: Absolutely. I love that. I might start quoting that as well.

SAM RANSBOTHAM: What's interesting, and I think that ties into -- you're a bit of a hero in the academic community. I think maybe you're attracted to those people -- academics complain a lot; maybe you're attracted to those academics that complain a lot. But I wanted to tie that to another way that you're creating economic value through LinkedIn. And I think one thing that's really fascinating here is that you've got a platform that has insight into what's going on in the economics of the invisible hand, making the invisible hand visible. You've got unparalleled insight. I was hoping you could talk a little bit about the things that you -- I mean, I'm aware of some of the things you've done with code, and with your LinkedIn graphs projects with the academic world. What are you doing in that aspect to get insights into those things that we've just never had insight into before?

YA XU: Really, really good point. Because of the volume that we have on the platform and how much economic opportunity activity happens on the platform, we particularly have that insight into the future of work: what skills are in demand, how different companies are hiring, which industry is hiring more and hiring less, and even just thinking about the equity aspect of it as well -- do women [or] men have a different rate of job changing, etc., [or] advancements in their careers? All that. So tons of insights that we have on our platform.
So, what we have done, and, Sam, you kind of alluded to [it], is -- by the way, we call all these activities, this vibrant activity on our graph, the economic graph -- we started a particular effort on the economic graph probably seven, at least six, years ago, where we essentially stood up a team that includes a bunch of folks on my team with our policy teams, with our comms teams, with our editorial teams, that really tries to share and bring some of the data and insights to the external communities. And we have been very successful, as a matter of fact. For example, last year we sent a report to most of the [U.S.] congressmen/women on what the labor marketplace looks like for their region. We have a collaboration partnership, for example, with Singapore's education department to help them figure out what skills [are] in demand and [are] lacking so that they can change their educational curriculum to help.

SAM RANSBOTHAM: That's just huge.

YA XU: And then we have worked with obviously a lot of other institutions, either much more directly with a particular local government or with some, like, the World Economic Forum, the G-20; we share a lot of our reports with them to help influence some of the policies they have. Another simple example is, we are really helping the broader community know what the green skills [are] -- people who are either hiring for green skills, people who are like, "Where is that talent going?" -- so that as we invest more in green energy, both the governments and those industries can be more guided in that from a talent perspective.

SAM RANSBOTHAM: That's huge.

SHERVIN KHODABANDEH: You've been featured in Fortune's 40 Under 40, you've written a book, you've given numerous speeches, you're a very successful practitioner in a very successful company. I'm curious, what would your advice be to your peers in other organizations that are leveraging AI as the "how" to achieve whatever the "what" of their company is?
What would be the two things that you think might not be obvious to others?

YA XU: The first thing that came to my mind is maybe not so provocative. It's really just: Have the best talent. It's so important. And I am such a strong believer that when you bring the best people, all you need to do is get out of the way and then help them to be successful, and then wonders just happen. Many of [the] folks on my team can do their job way better than I can do their job, especially in a field that is constantly innovating. I always joke about how the pace that this field, this domain, is changing is like 300, 400, 500 miles per hour; it's crazy. I mean, I got my Ph.D. in this domain, and I was in this workshop that was talking about graph neural networks and just graph learning in general, and what the practitioners are doing today versus when I was doing my thesis 10 years ago -- it's entirely different. And what was state-of-the-art 10 years ago is nowhere to be found in today's practice. So I think that's, again, just really emphasizing how important it is to bring the best people and the talent, especially in a very innovative space.

And the second thing I want to say -- maybe this is a little less obvious -- is that to make data and AI work in our company, the way to do it is not to build a wall between folks who know data and AI and people who don't know data and AI. And, by the way, this is a general pitfall, either in the mindset of people or [in] how companies are organized. Let's say you've got an expert team who is world-class in data and AI, and then you just expect, "Hey, you know what? The rest of the company knows nothing about data and AI."

SHERVIN KHODABANDEH: I think what you said about talent is probably not obvious to many, and I also think it really corroborates your earlier point about curiosity and learning. I mean, if those are the ingredients, then of course talent matters a lot. So thank you for that.

SAM RANSBOTHAM: We're just seeing that over and over again, or maybe it's just the kinds of people that we're, perhaps, attracted to on the show. But it does seem to be showing up a lot.

YA XU: Absolutely.

SAM RANSBOTHAM: OK. So --

SHERVIN KHODABANDEH: Is [it] time for the five questions? Should we do that, or --

SAM RANSBOTHAM: Sure. Yeah. Do that.

SHERVIN KHODABANDEH: Do you know about this?

YA XU: I do not know about this, but fire away.

SHERVIN KHODABANDEH: Oh, we didn't tell you? All right.

SAM RANSBOTHAM: Surprise!

YA XU: I like surprises.

SHERVIN KHODABANDEH: We have this thing where we have five questions. You could just riff, give an answer. So, what's your proudest AI moment?

YA XU: I would probably go back all the way to when I was in grad school, and I was taking this class, and it was probably the very first time I really saw AI in application in a way that you can feel and you can touch. I was taking this class with Andrew Ng where we were supposedly building an algorithm that was able to, given a stream of video, identify objects in the video. I worked really hard with a classmate of mine and a close friend of mine. At the end of the day, they had this competition of accuracy and precision, recall, etc. In a pretty large class, we won second place -- so not the first place, so still room for improvement -- but [I'm] very proud of that, especially [since it was the] second year in my Ph.D., and before that a lot of my experience was [with] a little more contrived examples.

SHERVIN KHODABANDEH: Very good. What worries you about AI?

YA XU: On one hand, you know, obviously I'm super excited for the potential, but what worries me about AI would be in the responsible AI space in particular.
Obviously, I'm really glad [about] the attention that responsible AI is able to get in the public and in the research community, and in industry as well. But at the same time, it's not just a buzzword. We've got to really put it into practice and make sure that we are [continuing to] research how we are able to identify biases that AI systems can bring, and it's a super challenging space. I've been working in this space for, I want to say, like, you know, extensively, a couple years now, and [I] just know how challenging this is, so my call to action to your audience is to definitely lean in[to] the space and research. Continue to push the boundaries on what's possible.

SHERVIN KHODABANDEH: Very well said. What is your favorite activity that does not require technology?

YA XU: That's easy. It's skiing.

SHERVIN KHODABANDEH: That requires technology.

SAM RANSBOTHAM: [Laughs.]

YA XU: It does not require technology!

SHERVIN KHODABANDEH: The ski[s].

SAM RANSBOTHAM: Oh, oh, come on.

YA XU: By that argument --

SAM RANSBOTHAM: You can expand that to anything; come on!

YA XU: About four years ago, I really got into skiing. And the reason I love skiing is it's a forced meditation. It's like, you go down the hill, and I'm not nearly a good skier, but all I think about is, how can I get down safely?

SHERVIN KHODABANDEH: Yes.

YA XU: Nothing else comes to mind. So that --

SAM RANSBOTHAM: I was sure you were going to say "gradient descent."

YA XU: [Laughs.] That's funny. But I also like that every time when you go out, it's different, because the snow conditions, the weather, the slope all add variability to how you actually do when you're skiing that day, and I love that, just because, you know, it's never status quo.

SHERVIN KHODABANDEH: What was the first career you wanted, like in your childhood? What did you want to be when you grew up?

YA XU: Well, if you say way back, I joke about it still today. Like I told my mom, I wanted to be president then.

SAM RANSBOTHAM: Oh, I will vote for you.

SHERVIN KHODABANDEH: What's your greatest wish for AI in the future?

YA XU: It's just really [to] help. AI should serve the people and help everything that we are doing to be more efficient and better. And I would say that in general about technology. So many things that we were not able to do, now we can do because of AI and technology, so I continue to be very bullish about that, and I'm very excited that I can be part of it.

SHERVIN KHODABANDEH: Thank you.

SAM RANSBOTHAM: Ya, it is absolutely wonderful talking to you. You know, I was thinking back to when you were talking about muscles, and when you first said "muscles," I have to say, I was thinking biceps. You know, I'm thinking about big muscles. But as we've talked, I'm now thinking more like eye muscles -- like focus, like getting the granularity -- because what you're talking about is the data that's going to let us see things that are happening in the world that we just have not been able to see before, and it's fascinating, and we've really enjoyed talking to you today. Thank you so much.

YA XU: Thank you. Thank you for having me.

SHERVIN KHODABANDEH: It's been truly, truly insightful. And thank you for making time.

YA XU: Of course.

SAM RANSBOTHAM: Thanks for joining us today. On our next episode, Shervin and I talk with Nitzan Mekel-Bobrov, eBay's chief AI officer. Hope you can join us then.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast.
That's why we've created a group on LinkedIn, specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.