SAM RANSBOTHAM: Engineers are not stereotypically known for their dancing prowess, but increasingly, organizations need to choreograph how humans and machines work together. In this episode, we chat with Sidney Madison Prescott, global head of intelligent automation at Spotify, about the dance between humans and machines that improves business processes.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Ideas program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today we're talking with Sidney Madison Prescott. She's the global head of intelligent automation at Spotify. Sidney, great to have you. Thanks for joining us.

SIDNEY MADISON PRESCOTT: Great to be here.

SAM RANSBOTHAM: Well, the title of our podcast is Me, Myself, and AI, and we tend to focus on you as an individual, so maybe let's start there. Can you tell us a little bit about your current role at Spotify?

SIDNEY MADISON PRESCOTT: I currently head up a global team. We are primarily within the U.S. and the U.K./European areas, and we are working towards modernizing and really improving the efficiency of the workforce within Spotify.
So this is looking specifically at all of the processes across many cross-functional teams to ensure that, whether we're working with the UI of the application, whether we are talking about the finance function, we are being as efficient as possible in the ways that we go about our daily routines as Spotifiers. And really, we're taking that on by looking very closely at ways to leverage intelligent automation. This is, in essence, a tool stack where we combine robotic process automation, artificial intelligence, and machine learning in order to really facilitate that digital transformation. And our team is working very closely primarily right now within the finance function. However, we have branched out to our ads business operations as well, really looking at, again, that digital transformation and how we can really unify the front-office and back-office processes that our Spotifiers work on today.

SAM RANSBOTHAM: So your degree is ... you have a major in philosophy with a double minor in political science and ethics from Georgia State University -- go Panthers! That's certainly not a typical connection to artificial intelligence. Can you make a connection between your background and your path to get from there to your current role?

SIDNEY MADISON PRESCOTT: I agree; it's not the traditional trajectory of someone who would work specifically within these really kind of emerging and innovative technologies. I would say my degree really prepared me to think critically about various tasks, various responsibilities, and to identify a business problem [and] potentially pitch a solution to that business problem that was technical in nature. And I really do rest very firmly on the critical thinking and the logic skills that I received out of my philosophy degree. Those really helped facilitate my growth and really amplified my strengths in terms of process reengineering and really being able to think outside of the box when it comes to how we are translating business problems into business efficiencies through technology.
My career path really began with an internship, and I was specifically working with configuration and asset management. And so I was actually [in] a pre-law concentration, and I wanted to become a lawyer. I began working on software contracts, and that really was my first foray into technology. I began to understand business law and business contracts; however, I really started to see the amazing contrast between how we think about what technologists do and the reality of the work that is done around the world.

Once I started diving into the software contracts and the true-up aspect of the business -- and that's really where I moved very firmly into becoming a technologist -- from there I was really diving very heavily into the connections between the disparate systems and the challenges that we have with those configurations, and the outputs or the quality of the data that comes from those various configurations and integrations with different systems. And so through that, I really started to home in on a passion for data quality and governance, and really looking at data as an asset. And this is where I began to firmly understand the correlation between process efficiency, process automation, and [how] moving towards a more automated business provides you with greater data insights and better ... executive decisions that you can make based off of those data attributes.

I started that journey within Fiserv, and then from there, I had an opportunity to dive into a proof of concept on robotic process automation. And this was relatively early on, so I would call myself basically an early evangelist of that technology. This was back in maybe 2015. And so through that proof of concept, I began to understand: What is robotic process automation? Why is it something that can be used to facilitate a digital transformation at an enterprise level?
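For listeners who have not seen robotic process automation up close, a minimal sketch may help make the idea concrete: RPA targets deterministic, rules-based work that a person would otherwise click through by hand. The invoice-matching scenario, field names, and tolerance below are hypothetical, and a standard-library script is only an illustration; production deployments run on dedicated RPA platforms rather than hand-rolled code.

```python
# A toy illustration of the kind of task RPA targets: deterministic,
# rule-based checks a clerk would otherwise perform by hand.
# Hypothetical invoice-matching example; standard library only.
import csv
import io

# Stand-in for an export pulled from a finance system.
INVOICES = """invoice_id,po_number,amount
INV-001,PO-100,250.00
INV-002,PO-101,975.50
INV-003,,125.00
"""

# Stand-in for purchase-order records in another system.
PURCHASE_ORDERS = {"PO-100": 250.00, "PO-101": 900.00}

def process_invoice(row):
    """Apply the same checks a person would: does the PO exist, do amounts match?"""
    po = row["po_number"]
    if not po or po not in PURCHASE_ORDERS:
        return "route-to-human"  # missing or unknown PO: escalate, don't guess
    if abs(float(row["amount"]) - PURCHASE_ORDERS[po]) > 0.01:
        return "route-to-human"  # amount mismatch needs human judgment
    return "auto-approve"

for row in csv.DictReader(io.StringIO(INVOICES)):
    print(row["invoice_id"], process_invoice(row))
```

The shape of the work is the point: high volume, rule-bound, and anything ambiguous escalates to a person rather than being guessed at.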
I also started diving at that point into intelligent automation, which is that combination of robotic process automation, artificial intelligence and machine learning, and sometimes optical character recognition as well. So it was very much a transition from software contract legalese, to configuration and asset management, to data quality and governance, and then that segued into more of the emerging technologies, such as artificial intelligence and robotic process automation.

SAM RANSBOTHAM: That's particularly interesting, because most people, I think, come to this awareness of the importance of data governance and quality after that foray into machine learning and artificial intelligence, when they find how much depends on data. And then they investigate why everything's falling apart, because of ugly data, and then they get interested in governance. And you've swapped that around to come from the other direction, then. I don't want to take this too much in a governance angle, but I find it difficult to talk about. So one of the things I talk about in class is the importance of data quality and governance. I just feel so boring as I say it. How do you get over that sort of, away from --

SHERVIN KHODABANDEH: How do you make it interesting?

SAM RANSBOTHAM: Yeah, how do I make that interesting? How do you get people fired up about those aspects?

SIDNEY MADISON PRESCOTT: Data quality and governance is often seen as an afterthought, meaning it's just not something to focus on, right? You're more focused on accelerating business growth or solving different business challenges. But from my standpoint, and how I speak to my stakeholders about this: the acceleration of your processes and the automation of your processes is only one piece of the bigger puzzle. And so if you are automating processes, but the outputs or the outcomes of those processes are data elements that are not reliable, in essence you are defeating the purpose of that automation.
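That argument implies a concrete guardrail: validate what the bots produce against explicit data-quality rules before anything downstream, whether reports, controls, or dashboards, consumes it. A minimal sketch of that idea, with hypothetical field names and rules:

```python
# A minimal data-quality gate for automation outputs. The rules and field
# names are hypothetical; the pattern is what matters: rows that fail any
# check are quarantined for review instead of flowing into decisions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityRule:
    name: str
    check: Callable[[dict], bool]  # returns True when the row passes

RULES = [
    QualityRule("amount_is_positive", lambda r: r["amount"] > 0),
    QualityRule("currency_is_known", lambda r: r["currency"] in {"USD", "EUR", "SEK"}),
    QualityRule("entity_present", lambda r: bool(r.get("entity"))),
]

def validate(rows):
    """Split automation output into trusted rows and quarantined rows."""
    trusted, quarantined = [], []
    for row in rows:
        failures = [rule.name for rule in RULES if not rule.check(row)]
        (quarantined if failures else trusted).append((row, failures))
    return trusted, quarantined

bot_output = [
    {"amount": 1200.0, "currency": "USD", "entity": "Spotify AB"},
    {"amount": -50.0, "currency": "XXX", "entity": ""},
]
trusted, quarantined = validate(bot_output)
print(len(trusted), "trusted;", len(quarantined), "quarantined")
for row, failures in quarantined:
    print("needs review:", failures)
```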
I mean, automation is only as good as the outputs that it provides back to us, right? Whether it's the engineering team, whether it's business stakeholders in finance, etc. And particularly when you look at the decisions that we have to make based off of data outputs, that is where it really becomes critical. Because when we look at risk and control, when we look at SOX [Sarbanes-Oxley Act] processes specifically -- so, those that impact the financial health of a company -- this is where data is absolutely essential. It's essential that we get it right, that we can rely on that data. And I think really what I bring attention to is a deep desire to mitigate the risk that is created by bad data in your environment and the decisions that you make based off of that incorrect data.

SAM RANSBOTHAM: As part of your role, your team goes in and increases automation in areas. How do people react to that? Are they thinking, "Oh my gosh, there goes my job; I used to do these tasks, and now they're going away"? Or are people happier? Do they feel threatened? What's the reaction to the sorts of processes that your team brings in?

SIDNEY MADISON PRESCOTT: It has been very interesting. Initially, there is a thought that, "Oh, it's going to be scary." But the great thing about Spotify is it's a very dynamic and agile environment, and a very creative environment. And so, because of that, it was very conducive to more excitement and curiosity about the technology. It's been primarily real excitement. People are happier. They enjoy the bots. They nickname their bots. So it's been primarily, I would say, very positive, for the most part.
And one thing I'll say as well -- I think that is a result of both the culture within Spotify and also the fact that we are very much focused on making the Spotifiers' engagement within the company more efficient, and hopefully facilitating less attrition as a result of that increase in employee satisfaction with their role.

SHERVIN KHODABANDEH: Sidney, I'd like to pivot on something you said. You've been talking about automation and various ways of automation, intelligent automation. What are your thoughts on other ways that AI can create value? Because automation is sort of at the far end of the spectrum, right, where AI decides and executes, and completely replaces [a] human? And we're seeing a lot of interest and attention recently in where AI and humans work together to do something that neither one of them could do alone as effectively. What are some examples of that, and what are your thoughts on more, I would say, integrated human-AI ways of working? Does that make sense?

SIDNEY MADISON PRESCOTT: It does. And this is actually a subject I'm very passionate about, because I think this is the future of what we are going to see, not only as technologists, but really as a society in terms of incorporating machines that are capable of, I'll say, a version of cognitive functionality, with that intuitive cognitive nature that humans have. The merging of those two? I think that is an extremely compelling value proposition, specifically in relation to the ways that we can continue to evolve technology. And I would agree: I think what we are seeing now is less of a desire to say that humans step away from a particular process and the machine takes over, and more of a, I'll say, almost a dance -- a collaborative dance, between the machine and the human.
And I think this is wonderful, because what it enables us to do is look at the ways that humans flourish in relation to their cognitive capabilities, and then the ways that machines can basically help enable humans in the areas where we are not as strong: with large populations of data, being able to quickly assess deltas in that data, being able to see patterns in millions of rows of data. These are the areas where AI can really flourish for us, AI/machine learning. The more cognitive functions, where maybe a decision needs to be made off of that data, that's where the humans can come back in and pick up that work. I call it, really, augmentation of humans, at the end of the day. And the goal would be maximizing our potential as humans to really --

SHERVIN KHODABANDEH: Digital dancing.

SIDNEY MADISON PRESCOTT: Yes. Exactly, exactly. To really facilitate better outputs, and to also allow us to really ... it's almost as if we amplify our own abilities with these machines, to the point that we can really say that we are operating at an optimal level, whether we're [in] business, whether we are engineers. And then of course those outputs would naturally benefit the enterprise at the end of the day.

SHERVIN KHODABANDEH: So this notion of the digital tango -- and, Sam, I have to give you credit for that, because this is something you've been talking about -- it's actually quite interesting, and it feels quite underappreciated: [the assumption is] that it's either human solo or AI solo, and the middle ground, where [a] human has to do things differently because of AI, but AI sort of morphs around what humans [are] good at and adapts, and [the] human does a bit of the same thing, feels completely underexplored, partly because it feels a little scary, right? You have to rethink processes and existing ways of working, and existing norms of job descriptions and things like that, and as you said, this is sort of the future.
What are your thoughts or ideas on how this could become more accepted -- that evolution, that opening of people's eyes to the idea that it's not all or nothing, but that it's actually in the middle ground that there's so much opportunity for us?

SAM RANSBOTHAM: Any analogy for dancing is going to terrify me right off the bat.

SHERVIN KHODABANDEH: Well, you came up with it.

SIDNEY MADISON PRESCOTT: It is a very interesting question. And I think that the hesitancy to really embrace this kind of human-in-the-loop machine and human partnership comes from the human resistance to change. And we all know that this is something that we really struggle with -- change -- as humans. We believe we are quick to adapt, but in reality, we don't adapt quite as quickly as we'd like. And I think the fact that we have not yet embraced the digital-human workforce, and really combining those two together in almost a seamless workforce, is because we are so reliant on our prior understanding of what work -- and I'll put that in air quotes -- what "work" is, right?

The fundamental element that is going to change this is redefining what we mean as humans when we say work. What does work mean? Whether it is an accountant, whether it is an engineer, whether it is an engineering manager, an executive in the C-suite, what do we mean by work? And we've started to see a bit of this transition within the pandemic, right? We've gone into the virtual world. We've heard pros and cons about this. And we started to see pushback on some of the companies that are attempting to embrace that more virtual workforce concept. And this, I believe, is the playing ground for the future, which is, can we embrace a virtual workforce and a reality in which workers around the world are able to work in a virtual manner to facilitate the growth of the business? And the next piece of that is, in that virtual work, can we move away from thinking of these silos of human versus machine-relegated tasks?
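The division of labor Prescott describes earlier, where the machine scans large volumes of data for deltas and a person makes the judgment call on whatever gets flagged, can be pictured as a simple human-in-the-loop pattern. A minimal sketch, with illustrative data and an assumed z-score threshold:

```python
# Human-in-the-loop sketch: the machine flags unusual deltas in a series,
# and only the flagged cases reach a person. Data and threshold are
# illustrative; real pipelines would score millions of rows.
import statistics

def flag_deltas(values, z_threshold=2.5):
    """Return indices whose deviation from the mean is unusually large."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > z_threshold]

daily_volumes = [100, 102, 99, 101, 98, 100, 350, 101, 99, 100]

for i in flag_deltas(daily_volumes):
    # In a real workflow this would open a review task, not print.
    print(f"Day {i}: volume {daily_volumes[i]} routed to human review")
```

Only the one anomalous day reaches a person; the routine rows never do, which is the point of the pairing.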
And I believe if we can merge those together -- the virtual workforce combined with a redefining of what it means to work as a human, and almost think of ourselves as "What skills do the future accountants need? What skills do the future backend engineers need? And how are those skills going to play seamlessly with these emerging technologies?" -- that, I believe, is where we're going to hit the sweet spot where we no longer have this almost tension between, "OK, well only humans can do this." I hear and see this a lot in my work, which is, "Oh, a human has to do this." I'm like, "Well, humans are prone to error, so maybe we don't always need to rely solely on ourselves," and thinking less of the machine as an adversary and more as a partner and enabler in that business process.

SAM RANSBOTHAM: I know you have this citizen data scientist program, and it sounds like that's exactly what that program is all about: teaching those basic dance steps.

SIDNEY MADISON PRESCOTT: Yes, very much so. So within Spotify, we're very focused on our citizen developer community, which is what we call it. And this is where we are enabling our business stakeholders, through upskilling, various boot camps and trainings internally, to really enhance their understanding of these emerging technologies, and we are really encouraging our business stakeholders, specifically, to embrace the technology in a way that really resonates with their job responsibilities, currently and in the future as we continue to grow the business.

SAM RANSBOTHAM: Sidney, we're recording this episode right around the holiday season, when Wrapped has just come out. So in this Wrapped product, Spotify members get a deep dive into their most memorable listening moments of the year, including podcasts, and we hope this podcast, I guess. Can you perhaps use that to illustrate how you're doing some of these things?

SIDNEY MADISON PRESCOTT: Absolutely. So there is a heavy amount of data mining/machine learning that's being used in order to facilitate kind of the presentation of, "Here's your top artist. Here's how many times you listened to them. Here are the different genres that you've visited." It's a great example of, again, leveraging data mining in such a way to create almost this visualization, if you will, of what your listening landscape has been as a customer of Spotify.

And we even see this in a lot of our back-end processes as well, where we are really looking at ways to leverage [it]: we have massive amounts of data, and we're really looking at, "OK, what are the insights? How do we actually use this to do some predictive analytics?" -- whether it's the ways our systems are interacting, whether it's actually understanding process inefficiencies. So we're looking very closely at monitoring deviations in our various process flows in order to better identify where we have areas of inefficiency that we can zero in on and tackle to make the overall workflows of our employees faster and more efficient. So it really is about mining that data, but we're mining it for critical information that can help us to be more proactive about inefficiencies, more proactive about making better decisions, predictive analytics on maybe challenges that we might foresee coming down the road for a particular business process, even predicting increases in volume and how that will translate throughout the environment. So it's a really great use of, I think, data that otherwise would ... if you don't use that data, it's almost a waste, because it's like you can just glean so many insights from it.

SHERVIN KHODABANDEH: Sidney, there's a war for talent in this space, and many companies, digital natives and otherwise, are struggling to attract, retain, and cultivate the top talent here. What are your thoughts on that, and is there something you could share about how Spotify goes about doing that?

SIDNEY MADISON PRESCOTT: Yes.
So this is a really interesting topic, because it really comes down to the pipeline -- the pipeline all the way from elementary school, through technical college, four-year college, etc., and into the professional workforce. And the question becomes kind of almost back to what we initially spoke about: Are we facilitating an understanding of what the future roles within the enterprise will look like and the skill sets that will be needed? And are we enabling individuals throughout the course of their academic or technical lifetimes to better engage with these different technologies at an early age?

And I think what we're seeing is, there's a lack of understanding of what it means to be a technologist. I think there are a lot of misconceptions floating out there, specifically within, I would say, the younger generation. And I think we need to do a better job of really teaching younger students -- and even preteens -- what a career in technology looks like and, more importantly, what it looks like 10 years from now, 15 years from now, when they'll be entering the workforce. And then you back in from the enterprise perspective, and you start reaching out to those college students, you start reaching out to those high schoolers, and you help them to see a day in the life. We do a lot of "a day in the life of an engineer," "a day in the life of an engineering leader," and kind of showing it's not someone in a hoodie that's hacking on a computer, which is the movie element, right? It's a lot more than that.

And even myself, when I moved into technology, I found it quite fascinating to see the day to day, because I myself had no idea. I had no foundation to understand what it meant to be an engineer. And I think that piece is where we can really tap into a future workforce, helping them to understand the nuances of today's engineer but also tomorrow's engineer, and what it takes to actually succeed in a career in technology.
SHERVIN KHODABANDEH: The distinction between today and tomorrow that you rightly point out is actually quite important, because I could imagine, if my son asked me -- but for the fact that I've spent time in this area professionally -- I would be quite biased about what engineering was like in my era. And it feels like the rate of change is so fast that prior experience and advice becomes very quickly irrelevant. And so this notion of ... if you went back 20, 30 years ago and looked up to somebody in a STEM program, trying to understand what it would be like to step in their shoes, it feels like those steps are going to be completely different in the future. And so this willingness to really be open-minded, and also the exemplars of, what [does] tomorrow's engineer look like? Because it feels like the old approaches are not going to work anymore. I mean, folks have to be much more exposed to what's happening now versus what happened 10 years ago.

SIDNEY MADISON PRESCOTT: Agreed, and this also leads back to that need to really ensure that as engineers not only move into the workforce, but as they mature out their career footprint, that they are consistently remaining abreast of the changes and the ever-evolving ways that technology is manifesting in their field. Because even the engineers today will tell you, getting into engineering is very different, even from sometimes what they are taught in school. And so that really is the disconnect: Are we teaching the right things? And then also, are we continuing the upskilling, right? As technology evolves, are we still moving midcareer engineers into a position where they feel empowered to continue learning about new technologies and being open to new opportunities to actually leverage those innovative tools?

SHERVIN KHODABANDEH: Very well said.

SAM RANSBOTHAM: What are you excited about next? What's coming next for Spotify? What's fun for you? What's big on the horizon that we should be looking forward to?

SIDNEY MADISON PRESCOTT: Oh, I'm really excited about, again, that human augmentation: looking for ways, whether it is through artificial intelligence, whether it's through chatbots, whether it's through optical character recognition, of really creating a seamless experience -- I'll say a seamless work experience -- between the human and the machine. I fundamentally believe that that next frontier is a frontier where humans almost seamlessly interact with the different tools in their environment in a way to better facilitate faster outcomes. And specifically at Spotify, within my team, we're looking at ways to leverage both machine learning and artificial intelligence in order to really enable the business to move faster and to focus on more value-added tasks.

And so we're looking at the amplification of our existing machine learning footprint -- specifically, looking at ways that we can take data that is currently not machine readable, translate that into machine-readable data, pass that off to a bot, and then potentially pass that off to a human. So again, building that tool stack, and really a very almost intuitive workflow between the machine, between the human, and the outputs that are received out of that. And we're also looking at ways that we can continue to mature out the tool stack, whether we're looking at workflow automation. We're starting to look at the dynamic between having a front end, which might be some sort of user interface with a chatbot, and then having a back end where it's actually a robotic process automation workflow built in, to then trigger some different workflows. So creating more, I'll say, complex and nuanced ways to leverage these tools rather than siloing them off and saying, "Oh, this process is only robotic process automation," or, "This process is only a chatbot."
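A rough sketch of that front-end/back-end pairing: a chatbot-style entry point classifies a request, then hands it to a back-end automation workflow. The intent names, keyword matching, and workflows below are hypothetical stand-ins; a real stack would put proper natural language understanding behind the chat interface and an RPA orchestrator behind the dispatcher.

```python
# Chatbot front end feeding an automation back end: one entry point,
# many tools. Intents and workflows are hypothetical stand-ins.

def classify_intent(message: str) -> str:
    """Keyword stand-in for the NLU a real chatbot front end would use."""
    text = message.lower()
    if "invoice" in text:
        return "invoice_status"
    if "access" in text:
        return "grant_access"
    return "unknown"

def invoice_status_workflow(message: str) -> str:
    return "Triggered invoice-status lookup workflow."

def grant_access_workflow(message: str) -> str:
    return "Opened an access request for approval."

# The dispatcher is where the silos break down: any tool can sit behind it.
WORKFLOWS = {
    "invoice_status": invoice_status_workflow,
    "grant_access": grant_access_workflow,
}

def handle(message: str) -> str:
    workflow = WORKFLOWS.get(classify_intent(message))
    if workflow is None:
        return "Routed to a human agent."  # fall back rather than guess
    return workflow(message)

print(handle("What's the status of invoice INV-42?"))
print(handle("I need access to the ads dashboard"))
print(handle("Something else entirely"))
```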
So, breaking down those barriers, and then, of course, continuing to break down the two barriers that I feel truly exist within most enterprise environments and also lead to the most blockers, which are the silo between the business and the technology side, and then also the silos that exist between the front-office and the back-office functions.

SAM RANSBOTHAM: Sidney, this has all been quite fascinating. I think I particularly enjoyed discussing breaking these important barriers, like the barriers between business and technology, and between the front office and back office. And I think that analogy to digital dance has a lot of potential. Thanks for taking the time to talk with us today. I've really enjoyed it.

SIDNEY MADISON PRESCOTT: Absolutely. This has been a great conversation. And again, this is ever evolving, this entire industry, and it's very exciting to be a part of, so I would definitely encourage listeners to dive in and start really learning about all of this, because it's the future, but it's also now, so it's really important for all of us to get engaged and get excited about what this means for each and every single one of us in society.

SHERVIN KHODABANDEH: Thank you, Sidney. You've been very inspiring.

SAM RANSBOTHAM: Thanks for joining us for Season 3 of Me, Myself, and AI. We'll be back in two weeks to kick off Season 4 with a lively conversation about innovation -- and Star Wars -- with Mark Maybury, chief technology officer of Stanley Black & Decker. In the meantime, please remember to subscribe and rate and review our show on Apple Podcasts and Spotify.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn, specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.