SAM RANSBOTHAM: What's in a name? Today we talk with Gerri Martin-Flickinger, former chief technology officer at Starbucks, about how the names we use can make a big difference in innovation and motivation.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Ideas program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today we're talking with Gerri Martin-Flickinger, former executive vice president and chief technology officer at Starbucks. Gerri, thanks for taking the time to talk with us.
Welcome.

SHERVIN KHODABANDEH: It's really great to have you here, Gerri.

GERRI MARTIN-FLICKINGER: It's great to be here. Thanks for having me.

SHERVIN KHODABANDEH: So, Gerri, tell us a little bit about yourself -- your background, your journey to this point, and what it's been like.

GERRI MARTIN-FLICKINGER: This topic today is going to be about AI, so I would really love to go way back in time to the 1980s. I know you probably didn't expect me to start there when you asked me that question, but I have to start there, because I went to Washington State University -- go Cougs! -- and I had an emphasis in artificial intelligence way back in the day and, in fact, did my senior project with a neural net that was built in Lisp, which nobody even remembers anymore. But the reason why I start there is because I have had a love for AI and the potential of AI techniques for all of this time.
And it's been really exciting in the last five or six years to see that we're finally at a place where technology -- both compute and storage -- and data have gotten to the point where we can actually start to achieve some of those visions we had way back then.

But let me start in '85. After going to school and getting a degree in computer science, I went to Chevron, the oil/energy company. [I] spent my early years there and did some really cool AI work while I was there, which was really more in a research mode, and did some technology that actually was sold to other companies outside of Chevron.

After that, I became the first CIO for McAfee in the late '90s, when McAfee was still quite small but grew quite large. And that was really fun, because if you think about antivirus in those early days, it was one of the first SaaS companies, because you would buy your antivirus software, but then you would get these payloads every month that would be the new virus signatures to keep protecting your machine. So, in effect, it was like a subscription business.
Being part of that in the early days of, frankly, the internet, and right during the whole dot-com period, was super insightful and really taught me a lot about scale and consumer digital before we called it consumer digital.

I then took a little bit of time out of my career to have two beautiful twin daughters, and then I came back as the CIO for Verisign, another security company, and then became the CIO for Adobe and was at Adobe for about 10 years and part of the team who migrated their product offerings into the cloud, into a true subscription SaaS business.

And then [I] joined Starbucks a little over six years ago as their first chief technology officer and really helped them migrate and move into a more modern architecture stack; evolve their entire digital platform, including mobile order and pay; [and] innovate IoT into all of the stores -- so really, just a whole lot of really fun at-scale technology. And, in the process, [I] got back to my roots and did some AI at the same time.

SHERVIN KHODABANDEH: This is a phenomenal series of things you've been part of.
And I'm curious: As someone whose interest in, exposure to, and practice of AI go back to the '80s, as you said, and who has seen the various eras of technological and organizational innovation as the playing field has continuously expanded -- what do you think are some of the biggest misconceptions that still exist in the minds of executives, particularly when it comes to this topic?

GERRI MARTIN-FLICKINGER: That is a great question. One of them is that if you build a model, it can solve anything. There's no such thing as a generalized AI model that will solve anything or everything. That just isn't possible. If you think about where people are having the most success right now with AI at scale, a lot of it is tightly coupled to statistical analysis, frankly, and is a lot about taking very large learning sets and building very sophisticated models that can be predictive in nature. And that's awesome, and what that takes is a lot of data to make it accurate.
And so one thing that I've certainly talked to a lot of executives about in the last few years is that in order to do AI in a really meaningful way, you've got to get your data in order. It isn't as simple as just saying, "Hey, we have loads of data! We should be able to have amazing AI models." You've got to have a little bit of structure to that data. You have to have a little bit of thought about where that data sits, even something as simple as, what does your data lake look like? Where are you putting that data? What is the currency of that data? Do you need the models to retrain in real time? Do you need to build them once and then retrain them once a quarter? And all of that starts to get really wrapped up into your data architecture. So the one thing that I always ask people when they want to talk about AI is, "Tell me about your data. Do you have data? Where is your data? Do you own it? Can you use it? Do you have the right to use it?"
This is the other thing about consumer data that's so very important: You have to make sure you're not doing things that you shouldn't be doing with that data. So I really think that one of the misconceptions out there is this idea that there's this thing you buy called AI, and you plug it in and it all works. That is almost the end of a long series of things you need to do and architect.

SAM RANSBOTHAM: OK, so, once an organization has its data in place, what happens next? What does having that data enable?

GERRI MARTIN-FLICKINGER: I do think the evolution of AI and ML specifically has led all parts of businesses that are sophisticated to ask very different questions than they did even five years ago. Suddenly, there's an expectation of, "No, we should be able to predict the supply chain forecast based on X, Y, and Z." And that isn't going to take somebody building a giant spreadsheet to do it. There's a better way to do that now. And I think you're seeing that evolution. What I would compare this to, though, is -- do we all remember the days before we all had Excel?
It's hard to remember those days, but --

SAM RANSBOTHAM: VisiCalc?

GERRI MARTIN-FLICKINGER: Yeah, VisiCalc! I remember VisiCalc. But there was a time --

SHERVIN KHODABANDEH: Lotus 1-2-3.

GERRI MARTIN-FLICKINGER: There was a time when the idea that all of us in business would be able to build a spreadsheet and then share it and collaborate on it -- that was like, "What? Why would you ..."

SAM RANSBOTHAM: Crazy talk.

GERRI MARTIN-FLICKINGER: Crazy talk, right? Same thing when I first started working; there was a typing pool. We didn't have email; you sent memos. My point being that as tools become more available to more people, more people are able to explore more ideas. How many times do all of us open spreadsheets all week long to do everything from helping our kids with homework to tracking our personal finances, or just keeping a list? The best list tool I have is a spreadsheet. And suddenly, that has changed how we all work.
So I just go back to these fundamental shifts we've seen in the past, like the spreadsheet, and I don't think this is so very different. I think as we continue to watch this evolve, we're going to find better tools and more effective ways that we can all explore these techniques.

I'll use a couple of examples from Starbucks: doing models for labor scheduling. That's not a leap to think about; certainly, [it's] something that you can do with the kind of data that you have today. Now, do you think a store manager knows or cares that when they build a labor schedule for their store, there's actually a reasonably sophisticated ML model behind the scenes doing that? No.

So I think about the amount of embedded AI and ML that we all ... We all probably have some in our cars right now; a lot, probably. Your power companies, your phone companies ... It's embedded everywhere. Your credit card companies have had it probably longer than you even realize, for fraud detection, so it's already pretty pervasive. And I think the question is, how much more accessible could it be as people become more sophisticated?
I don't know. I have kids in high school. Kids in high school talk about data science now; there are classes in high school. So there's every reason to think that in five, six, seven years, when they come out of college, they're going to be pretty fluent in some of these techniques, even if it seems completely inconceivable to all of us.

SHERVIN KHODABANDEH: No, I think it actually reminds me of a Wall Street [Journal] article I was reading today about chess. Chess grandmasters have had to become experts, somewhat, in AI, because everybody's using it. And they have these AI teams to be able to handicap different lines of thinking of the algorithm to throw off their opponents, because everybody's using AI to plan and win their games. And so you need to really understand how the engine works if you're going to beat somebody who's using that engine to beat you. And it builds on the point you were making that it is quite pervasive.

SAM RANSBOTHAM: In your analogy, I guess, that would be "Out-schedule the competition." You mentioned Starbucks.
Is there something particular that you're excited about that you want to showcase?

GERRI MARTIN-FLICKINGER: I can certainly highlight a couple of examples. We have a moniker, Deep Brew, which stands for a broad section of AI projects underway across the company. The ones that folks are most familiar with are personalization models. Whether it's personalization on the mobile app or personalization when you pull into a drive-through where they have a digital display, those experiences are being driven by models that are based on lots of different inputs, some of which are very personal, like maybe your own buying patterns. Some of them are regional, like, "What's going on with buying patterns in this region?" They could be environmental factors, like, "What's the weather today?" It could have to do with supply chain loads, like, "What do we actually have in stock that we need to sell?" Those are actually much harder to do than they sound. It sounds very simple. But let me give you an example to illustrate why some of this is hard and why I always start with data.
Doesn't it sound easy to figure out if there are the ingredients for a latte so that you can promote a latte on the phone?

SAM RANSBOTHAM: Naively, I'll say yes.

GERRI MARTIN-FLICKINGER: It sounds like it's a thing, right? Like, "Yeah, we have lattes." Well, actually, lattes are manufactured in the moment at a store, and they're made of component parts. They're made of some espresso, which could be different kinds of espresso, made with some type of a milk product, which could be cow milk, [or] it could be an alt [nondairy] milk. And then it could be heated to different temperatures based on what the customer has asked for. And that's a simple drink. That is the simplest espresso drink, probably, that you can get in a store. So here's why that's complicated. If you're in the store, the customer just knows it as a latte.
But if you think about the entire supply chain of all of the component parts that have to be available at that moment in the front of the house, behind the counter, to make that latte, that's a whole different problem. And now you've got to get all the way back to your data master -- the data master that is the component parts -- and understand whether they were delivered that night in the back of the store. Now you've got a data problem that requires you to decompose and restructure the data all the way back to the origin, if you haven't already done that. And I'm only illustrating this because so often you think, "Well, it's an easy ML problem to say, 'We want to promote lattes.'" But the second you do that, you actually have to know the deepest level of data possible to ensure you actually have the product to sell.

SAM RANSBOTHAM: That's tricky because, actually, when you say "latte," I know exactly what you mean, because you mean exactly the one that I would drink. You don't mean the one that Shervin would drink.

GERRI MARTIN-FLICKINGER: Right.
SAM RANSBOTHAM: And to answer the question, you've got to answer it for every single person.

GERRI MARTIN-FLICKINGER: Right. And you're not going to enumerate all those. There are infinite possibilities for customization. Infinite. So you can't do that. You have to actually work at it as a data problem. And then you can do the AI model on top of it, because you've actually figured out what you have to work with.

SHERVIN KHODABANDEH: And then you also talked about the supply chain issue and the inventory management. And the point to me is, these use cases are not in silos anymore.

GERRI MARTIN-FLICKINGER: Exactly.

SHERVIN KHODABANDEH: The whole foundational data, of course, is critical to power them. But how we market impacts what happens in the store, and supply chain issues impact what we should or shouldn't be able to market. And so --

GERRI MARTIN-FLICKINGER: Totally.
SHERVIN KHODABANDEH: My follow-up is, for the business leaders who are listening to this, I think there are many analogs of what you just described that would resonate in any line of business, because you've got these groups that are different lines of business or different functional components that, in today's world with today's data, are much more interactive, and there's a network effect of all of these things, which requires teams to come together that normally wouldn't work together. What advice do you have for the CEO or the president of a business unit to break these silos? Because you've got different teams with different tools, different incentives, right? That must be a daunting organizational problem. It's not just a technology problem. And you've seen that work well. I'm just curious -- what advice would you have?

GERRI MARTIN-FLICKINGER: Well, I don't think there's any magic here. I think, as in most things in business, you have to start by being really clear on, what is the objective? What are you solving for?
That sounds so simple, but sometimes that's really hard to figure out. Is what you're solving for increased revenue? Is it increased customer retention? Is it improved margin? Is it something else? Failing to get really clear on that is part of what puts all the constituencies at loggerheads -- somebody carries the hat of revenue, and somebody else carries the hat of margin, and somebody else carries customer experience. You've got to get clear on what you're solving for. And you can't solve for all of it at the same time. Now, you can benefit all of it, but you have to get really clear on "What are we going after?" So that's my first advice: [to] be really clear on what problem you're trying to solve.

One thing I believe in a lot is bringing people together who have different expertise. I actually think it's a good thing. It's a good thing to bring people together who have five different specialties, because they're going to bring the very best thinking for that domain. But then you also have to have them feel like they're in it together, and that's good, old-fashioned teamwork.
And I hate to say "good, old-fashioned teamwork," but for as long as I've been in the business world, it all comes down to the same things: Are you getting people together with a common vision? Are you giving them room to fail so that they can get onto a path to success? Are you giving them a goal that's really clear, with a timeline that's achievable but also really clear? And then are you supporting them with the resources and the budget that they need to be successful? It's all that same stuff. There's nothing new there. I do think where a lot of people fail is, they don't start off with a clear problem they're trying to solve. And that tends to get people really entrenched in their silos, and then they go off and try to solve their own problem.

SHERVIN KHODABANDEH: That's very well said.

SAM RANSBOTHAM: I'm struck as I'm listening by how much depth you obviously have in making a latte, but that wasn't your background; we didn't hear that stop [in your career journey].
How do you get people to know so much about the domain area to then be able to solve it with the technology that you're using to solve it? It seems like a very difficult thing to pull together.

GERRI MARTIN-FLICKINGER: I don't know which question to answer: the one about how I ended up learning to make a latte at Starbucks or, in general, how do you do that in business?

SHERVIN KHODABANDEH: Let's start there.

GERRI MARTIN-FLICKINGER: OK, we can start with my journey. So, yeah, I came out of technology. I had been in enterprise software for many, many years in Silicon Valley. And my decision to come to Starbucks was kind of interesting. I was, first of all, just intrigued. I was intrigued by the scale. And the scale is interesting when you think about Starbucks, because today there are over 30,000 stores around the world. There are over 300,000 baristas around the world. And why it's an interesting scale problem is not just the number of customers that visit Starbucks. But if you think about those 30,000-some stores, each one is like a little business unto itself.
444 00:18:40.360 --> 00:18:41.890 When you're in enterprise software, 445 00:18:41.890 --> 00:18:44.648 you might have a hundred offices around the world. 446 00:18:44.648 --> 00:18:46.440 You might have a ton of people, but they're 447 00:18:46.440 --> 00:18:49.330 in these big offices with big pipes 448 00:18:49.330 --> 00:18:53.020 and lots of infrastructure, and you have a support team there. 449 00:18:53.020 --> 00:18:56.560 When you've got a store in the middle of Oklahoma 450 00:18:56.560 --> 00:18:59.980 on a dial-up line, that's a whole different thing 451 00:18:59.980 --> 00:19:00.980 to manage. 452 00:19:00.980 --> 00:19:04.180 And to have the same expectation of quality for a customer who's 453 00:19:04.180 --> 00:19:07.010 got their mobile order-and-pay application, that's 454 00:19:07.010 --> 00:19:08.660 just a whole different game. 455 00:19:08.660 --> 00:19:11.740 And I was really intrigued by, of course, IoT 456 00:19:11.740 --> 00:19:15.050 and how much more could be done in brick-and-mortar [retail] 457 00:19:15.050 --> 00:19:16.720 with IoT devices. 458 00:19:16.720 --> 00:19:22.090 I was intrigued by how quickly I saw consumer digital growing. 459 00:19:22.090 --> 00:19:26.230 And so all those things are what made me come to Starbucks 460 00:19:26.230 --> 00:19:29.390 and be part of that transformation journey. 461 00:19:29.390 --> 00:19:31.120 How do you learn when you know nothing 462 00:19:31.120 --> 00:19:33.050 about food and beverage? 463 00:19:33.050 --> 00:19:35.670 First thing you do is you spend time in the stores, you know? 464 00:19:35.670 --> 00:19:39.440 I spent my first few weeks in a store learning 465 00:19:39.440 --> 00:19:41.870 about how people make lattes. 
466 00:19:41.870 --> 00:19:46.210 Now, I cannot claim to be a barista by any stretch at all, 467 00:19:46.210 --> 00:19:49.720 so when I was in the store, I was mostly helping clean or I 468 00:19:49.720 --> 00:19:52.860 was greeting customers, or I was trying to do things I could 469 00:19:52.860 --> 00:19:53.540 actually do. 470 00:19:53.540 --> 00:19:56.690 But in the process, you learn a lot about what goes 471 00:19:56.690 --> 00:19:59.580 on in a store -- and not just the really cool stuff that you 472 00:19:59.580 --> 00:20:03.500 see, like making the lattes or greeting the customers, 473 00:20:03.500 --> 00:20:06.240 but what goes on in the back of house. 474 00:20:06.240 --> 00:20:07.880 How do they receive inventory? 475 00:20:07.880 --> 00:20:08.540 Oh my gosh. 476 00:20:08.540 --> 00:20:09.940 How do they do payroll? 477 00:20:09.940 --> 00:20:12.390 How do they have to do labor scheduling? 478 00:20:12.390 --> 00:20:14.550 What does that look like? 479 00:20:14.550 --> 00:20:16.970 And that's an eye-opener. 480 00:20:16.970 --> 00:20:19.850 And I would say, forget what business you're in. 481 00:20:19.850 --> 00:20:22.680 Whatever business you're in, if you're a technologist, if you 482 00:20:22.680 --> 00:20:26.530 are not sitting shoulder [to] shoulder with whoever 483 00:20:26.530 --> 00:20:28.580 is the tip of the spear of the business, 484 00:20:28.580 --> 00:20:29.850 you're missing an opportunity. 485 00:20:29.850 --> 00:20:30.950 You've got to do that. 486 00:20:30.950 --> 00:20:33.280 So when I was in enterprise software, 487 00:20:33.280 --> 00:20:35.930 I spent a lot of time going out with salespeople 488 00:20:35.930 --> 00:20:37.160 to visit customers. 489 00:20:37.160 --> 00:20:39.820 I just wanted to see, what are the customers thinking? 490 00:20:39.820 --> 00:20:40.500 Do they love us? 491 00:20:40.500 --> 00:20:41.430 Do they hate us? 492 00:20:41.430 --> 00:20:43.100 What problems are they having? 
493 00:20:43.100 --> 00:20:45.590 Spending time with the customer support 494 00:20:45.590 --> 00:20:47.760 center to just sit down and listen, occasionally, 495 00:20:47.760 --> 00:20:50.340 to the calls they were getting: What is the world 496 00:20:50.340 --> 00:20:52.910 thinking about us, and how are people who are depending 497 00:20:52.910 --> 00:20:54.453 on our software feeling? 498 00:20:54.453 --> 00:20:55.110 499 00:20:55.110 --> 00:20:57.360 SHERVIN KHODABANDEH: Did you spend time on the oil rig 500 00:20:57.360 --> 00:20:58.797 when you were at Chevron? 501 00:20:58.797 --> 00:21:01.130 GERRI MARTIN-FLICKINGER: I did spend time at refineries. 502 00:21:01.130 --> 00:21:02.630 Yeah, a little bit of time at refineries. 503 00:21:02.630 --> 00:21:03.520 SHERVIN KHODABANDEH: That's great. 504 00:21:03.520 --> 00:21:03.662 505 00:21:03.662 --> 00:21:06.120 SAM RANSBOTHAM: Actually, it reminds me of Prakhar Mehrotra 506 00:21:06.120 --> 00:21:07.140 and Walmart. 507 00:21:07.140 --> 00:21:09.550 He went out there and -- this was one of our earlier 508 00:21:09.550 --> 00:21:12.320 interviews -- said a very similar thing about how you 509 00:21:12.320 --> 00:21:15.360 understand how to automate or how to put technology 510 00:21:15.360 --> 00:21:16.380 into these situations. 511 00:21:16.380 --> 00:21:19.020 And it was very much echoing the kinds of things you're saying 512 00:21:19.020 --> 00:21:21.180 -- [it] can't be done in isolation. 513 00:21:21.180 --> 00:21:23.080 GERRI MARTIN-FLICKINGER: That's right. 514 00:21:23.080 --> 00:21:27.690 And I do think there's a couple of things that, in my playbook, 515 00:21:27.690 --> 00:21:30.700 have continued to pay off over and over again. 516 00:21:30.700 --> 00:21:32.730 And they're just simple, simple things. 517 00:21:32.730 --> 00:21:35.260 The first is, words matter. 518 00:21:35.260 --> 00:21:36.270 Words matter. 519 00:21:36.270 --> 00:21:37.630 They really do. 
520 00:21:37.630 --> 00:21:40.570 When I say to you -- and I'm going to ask both of you 521 00:21:40.570 --> 00:21:43.900 to answer back at me; I'm going to ask you a question now: 522 00:21:43.900 --> 00:21:46.150 When I say the word "IT," what do you think of? 523 00:21:46.150 --> 00:21:46.650 524 00:21:46.650 --> 00:21:50.510 SHERVIN KHODABANDEH: I think of email servers and software 525 00:21:50.510 --> 00:21:52.670 patches and things like that. 526 00:21:52.670 --> 00:21:54.722 GERRI MARTIN-FLICKINGER: Sam, what about you? 527 00:21:54.722 --> 00:21:56.180 SAM RANSBOTHAM: I think I'm biased. 528 00:21:56.180 --> 00:21:58.310 I thought more of a strategy-oriented, 529 00:21:58.310 --> 00:22:02.722 how you're enabling connectivity within the organization -- 530 00:22:02.722 --> 00:22:04.930 GERRI MARTIN-FLICKINGER: We're not using your answer. 531 00:22:04.930 --> 00:22:05.450 No, no, no. 532 00:22:05.450 --> 00:22:06.720 SAM RANSBOTHAM: OK. 533 00:22:06.720 --> 00:22:09.880 I'm an IT prof, so that's maybe my bias there. 534 00:22:09.880 --> 00:22:12.010 Do you think most people go, "Operations"? 535 00:22:12.010 --> 00:22:13.635 GERRI MARTIN-FLICKINGER: When I say IT, 536 00:22:13.635 --> 00:22:15.445 most people will talk about the help desk; 537 00:22:15.445 --> 00:22:16.570 they'll talk about outages. 538 00:22:16.570 --> 00:22:17.730 SAM RANSBOTHAM: Trouble tickets. 539 00:22:17.730 --> 00:22:19.938 GERRI MARTIN-FLICKINGER: They'll talk about services, 540 00:22:19.938 --> 00:22:21.340 trouble tickets, data centers. 541 00:22:21.340 --> 00:22:21.890 Right? 542 00:22:21.890 --> 00:22:22.400 OK. 543 00:22:22.400 --> 00:22:26.220 When I say technology, people say, "The future. 544 00:22:26.220 --> 00:22:27.190 Innovation. 545 00:22:27.190 --> 00:22:27.770 New things." 546 00:22:27.770 --> 00:22:28.270 Right? 
547 00:22:28.270 --> 00:22:30.580 So if you're in a business, and someone 548 00:22:30.580 --> 00:22:33.630 introduces someone who's in the IT department, 549 00:22:33.630 --> 00:22:35.180 they have one reaction. 550 00:22:35.180 --> 00:22:37.675 If I introduce someone to you and I say, "This is" -- 551 00:22:37.675 --> 00:22:38.800 in the case of Starbucks -- 552 00:22:38.800 --> 00:22:43.380 "Starbucks Technology," which one sounds and feels 553 00:22:43.380 --> 00:22:44.470 more future-leaning? 554 00:22:44.470 --> 00:22:46.290 SAM RANSBOTHAM: Actually, that's a huge difference. 555 00:22:46.290 --> 00:22:47.580 GERRI MARTIN-FLICKINGER: It's a huge difference. 556 00:22:47.580 --> 00:22:49.470 SAM RANSBOTHAM: I definitely see the difference there. 557 00:22:49.470 --> 00:22:50.762 GERRI MARTIN-FLICKINGER: Right. 558 00:22:50.762 --> 00:22:52.530 Which is why I said words really matter. 559 00:22:52.530 --> 00:22:54.750 And so we were talking about transformation 560 00:22:54.750 --> 00:22:57.100 and how do you transform a technology 561 00:22:57.100 --> 00:22:58.262 organization to the future. 562 00:22:58.262 --> 00:23:00.720 And so, one thing that's one of these tried-and-true things 563 00:23:00.720 --> 00:23:03.300 is, did you name your organization 564 00:23:03.300 --> 00:23:06.797 in a way that makes the organization proud, so 565 00:23:06.797 --> 00:23:08.630 that every single person in the organization 566 00:23:08.630 --> 00:23:11.150 sits up a little straighter and maybe works a little harder? 567 00:23:11.150 --> 00:23:14.230 Have you named the organization in a way that really represents 568 00:23:14.230 --> 00:23:16.040 what you want it to become? 569 00:23:16.040 --> 00:23:18.820 And have you named it in a way that everyone else 570 00:23:18.820 --> 00:23:21.040 in the business looks at it and goes, "Oh, that's 571 00:23:21.040 --> 00:23:22.640 something a little different"? 572 00:23:22.640 --> 00:23:23.140 OK. 
573 00:23:23.140 --> 00:23:26.707 I know, [that's] a long answer. One of the things 574 00:23:26.707 --> 00:23:28.790 that I think is really important in transformation 575 00:23:28.790 --> 00:23:30.660 is to signal that you're doing it. 576 00:23:30.660 --> 00:23:34.120 And so, for example, with Starbucks, 577 00:23:34.120 --> 00:23:35.950 when I joined, 90 days after I joined, 578 00:23:35.950 --> 00:23:38.900 [I] changed the name of IT to Starbucks Technology. 579 00:23:38.900 --> 00:23:40.600 Never used the word IT again. 580 00:23:40.600 --> 00:23:42.850 And if I was ever in a meeting where somebody said IT, 581 00:23:42.850 --> 00:23:45.140 I'd stop the meeting and I'd say, "We don't have IT. 582 00:23:45.140 --> 00:23:47.390 We have Starbucks Technology," and it's kind of funny, 583 00:23:47.390 --> 00:23:50.830 because that one change made a big difference. 584 00:23:50.830 --> 00:23:53.290 The next thing that I think can make a difference 585 00:23:53.290 --> 00:23:55.247 is, you need a tagline. 586 00:23:55.247 --> 00:23:57.080 I hate to say it, but everybody in business, 587 00:23:57.080 --> 00:23:58.860 everyone who's a CEO, knows it. 588 00:23:58.860 --> 00:24:02.300 You've got to tell your story, and you don't get five hours 589 00:24:02.300 --> 00:24:03.280 to tell your story. 590 00:24:03.280 --> 00:24:06.560 You get six to 10 words, and you'd better 591 00:24:06.560 --> 00:24:09.200 get people curious to ask more. 592 00:24:09.200 --> 00:24:12.090 And so, put a tagline in place. 593 00:24:12.090 --> 00:24:18.440 Super simple: "Talented technologists delivering today, 594 00:24:18.440 --> 00:24:20.900 leading into the future. 595 00:24:20.900 --> 00:24:22.440 Starbucks Technology." 596 00:24:22.440 --> 00:24:23.760 That's it. 
597 00:24:23.760 --> 00:24:28.030 And that simple phrase, which is used today still, 598 00:24:28.030 --> 00:24:32.970 after six years, just continues to reinforce the value 599 00:24:32.970 --> 00:24:35.370 of the organization, the value of the people, 600 00:24:35.370 --> 00:24:37.620 the importance of getting the work done -- 601 00:24:37.620 --> 00:24:40.440 as well as continuing to build for the future. 602 00:24:40.440 --> 00:24:43.260 And so, for me, transformation comes down to people. 603 00:24:43.260 --> 00:24:45.003 And [doing] any transformation with tech 604 00:24:45.003 --> 00:24:46.420 has nothing to do with the tech as 605 00:24:46.420 --> 00:24:48.360 much as it has to do with the people who 606 00:24:48.360 --> 00:24:49.520 are making it happen. 607 00:24:49.520 --> 00:24:52.010 They have to feel inspired, they have 608 00:24:52.010 --> 00:24:53.780 to feel what they're doing is important, 609 00:24:53.780 --> 00:24:56.180 and they have to feel like they have room to be part 610 00:24:56.180 --> 00:24:58.330 of the invention of the future. 611 00:24:58.330 --> 00:25:00.760 And I think that's all we have to do as leaders, 612 00:25:00.760 --> 00:25:01.740 is make room for that. 613 00:25:01.740 --> 00:25:04.160 SAM RANSBOTHAM: Gerri, it was great talking with you. 614 00:25:04.160 --> 00:25:07.180 [You have] such a vast experience and a great ability 615 00:25:07.180 --> 00:25:08.950 to connect those experiences together 616 00:25:08.950 --> 00:25:11.443 to give us a holistic view of what's happening 617 00:25:11.443 --> 00:25:12.860 and what may happen in the future. 618 00:25:12.860 --> 00:25:13.770 Thank you for taking the time. 619 00:25:13.770 --> 00:25:15.340 SHERVIN KHODABANDEH: It's been really wonderful. 620 00:25:15.340 --> 00:25:16.010 Thank you. 621 00:25:16.010 --> 00:25:16.980 GERRI MARTIN-FLICKINGER: It's been fun. 622 00:25:16.980 --> 00:25:17.770 Thanks so much. 
623 00:25:17.770 --> 00:25:18.123 624 00:25:18.123 --> 00:25:19.790 SAM RANSBOTHAM: Next time, Shervin and I 625 00:25:19.790 --> 00:25:21.260 talk with Barbara Martin Coppola, 626 00:25:21.260 --> 00:25:23.300 the chief digital officer for IKEA Retail. 627 00:25:23.300 --> 00:25:24.920 Join us as we hear what Barbara thinks 628 00:25:24.920 --> 00:25:26.337 about the meaning behind the words 629 00:25:26.337 --> 00:25:28.830 we use when we talk about artificial intelligence. 630 00:25:28.830 --> 00:25:30.360 ALLISON RYDER: Thanks for listening 631 00:25:30.360 --> 00:25:31.860 to Me, Myself, and AI. 632 00:25:31.860 --> 00:25:34.320 We believe, like you, that the conversation 633 00:25:34.320 --> 00:25:36.540 about AI implementation doesn't start and stop 634 00:25:36.540 --> 00:25:37.660 with this podcast. 635 00:25:37.660 --> 00:25:40.210 That's why we've created a group on LinkedIn, specifically 636 00:25:40.210 --> 00:25:41.330 for leaders like you. 637 00:25:41.330 --> 00:25:44.070 It's called AI for Leaders, and if you join us, 638 00:25:44.070 --> 00:25:46.100 you can chat with show creators and hosts, 639 00:25:46.100 --> 00:25:49.710 ask your own questions, share insights, and gain access 640 00:25:49.710 --> 00:25:52.210 to valuable resources about AI implementation 641 00:25:52.210 --> 00:25:54.300 from MIT SMR and BCG. 642 00:25:54.300 --> 00:25:59.380 You can access it by visiting mitsmr.com/AIforLeaders. 643 00:25:59.380 --> 00:26:02.140 We'll put that link in the show notes, 644 00:26:02.140 --> 00:26:04.580 and we hope to see you there. 645 00:26:04.580 --> 00:26:10.000