WEBVTT
1
00:00:02.730 --> 00:00:04.530
When you're putting a new tire on your car,
2
00:00:04.530 --> 00:00:06.350
you don't want to tighten one bolt all the way
3
00:00:06.350 --> 00:00:07.560
and then tighten the rest.
4
00:00:07.560 --> 00:00:09.080
You want to tighten them all a little bit
5
00:00:09.080 --> 00:00:10.684
and continuously tighten.
6
00:00:10.684 --> 00:00:11.994
What does that have to do with
7
00:00:11.994 --> 00:00:13.684
artificial intelligence and fashion?
8
00:00:13.684 --> 00:00:16.879
Find out today, when we talk with Arti Zeighami from H&M.
9
00:00:16.879 --> 00:00:19.913
(upbeat music)
10
00:00:19.913 --> 00:00:21.870
Welcome to Me, Myself, and AI,
11
00:00:21.870 --> 00:00:24.860
a podcast on artificial intelligence and business.
12
00:00:24.860 --> 00:00:28.050
Each week we introduce you to someone innovating with AI.
13
00:00:28.050 --> 00:00:29.250
I'm Sam Ransbotham,
14
00:00:29.250 --> 00:00:32.150
professor of information systems at Boston College,
15
00:00:32.150 --> 00:00:33.480
and I'm also the guest editor
16
00:00:33.480 --> 00:00:36.203
for the AI and Business Strategy Big Ideas Program
17
00:00:36.203 --> 00:00:38.272
at MIT Sloan Management Review.
18
00:00:38.272 --> 00:00:40.120
And I'm Shervin Khodabandeh,
19
00:00:40.120 --> 00:00:42.180
senior partner with BCG,
20
00:00:42.180 --> 00:00:46.000
and I co-lead BCG's AI practice in North America.
21
00:00:46.000 --> 00:00:49.110
And together, BCG and MIT SMR
22
00:00:49.110 --> 00:00:52.420
have been researching AI for four years,
23
00:00:52.420 --> 00:00:54.680
interviewing hundreds of practitioners
24
00:00:54.680 --> 00:00:56.800
and surveying thousands of companies
25
00:00:56.800 --> 00:01:00.120
on what it takes to build and deploy
26
00:01:00.120 --> 00:01:02.680
and scale AI capabilities,
27
00:01:02.680 --> 00:01:05.488
and really transform the way organizations operate.
28
00:01:05.488 --> 00:01:07.866
(ethereal music)
29
00:01:07.866 --> 00:01:10.350
Today we're talking with Arti Zeighami.
30
00:01:10.350 --> 00:01:13.250
He leads artificial intelligence at H&M.
31
00:01:13.250 --> 00:01:16.260
Arti is joining us from Stockholm. Welcome, Arti.
32
00:01:16.260 --> 00:01:19.270
Thank you, thank you very much for having me here.
33
00:01:19.270 --> 00:01:20.790
Really we'd like to hear
34
00:01:20.790 --> 00:01:21.870
a little bit about your background.
35
00:01:21.870 --> 00:01:22.780
Why don't we start there?
36
00:01:22.780 --> 00:01:24.100
How did you, how'd you get interested
37
00:01:24.100 --> 00:01:26.066
in artificial intelligence? And
38
00:01:26.066 --> 00:01:28.570
for our podcast today, I'm actually wearing a nice shirt
39
00:01:28.570 --> 00:01:31.370
because we're talking with a fashion person,
40
00:01:31.370 --> 00:01:32.330
even though this is audio,
41
00:01:32.330 --> 00:01:35.138
I can assure everyone that I look fabulous.
42
00:01:35.138 --> 00:01:36.230
How did you get interested
43
00:01:36.230 --> 00:01:38.067
in artificial intelligence though?
44
00:01:38.067 --> 00:01:40.010
I think my interest in the area
45
00:01:40.010 --> 00:01:43.680
started many years ago as a, you know, a young teenager,
46
00:01:43.680 --> 00:01:46.850
when the, like a lot of the people I was, you know,
47
00:01:46.850 --> 00:01:50.260
studying math and physics and loved all that stuff,
48
00:01:50.260 --> 00:01:53.390
and I read these books by Isaac Asimov, you know,
49
00:01:53.390 --> 00:01:57.070
the science fiction author. And I was reading about this,
50
00:01:57.070 --> 00:01:59.732
this theory that he had about psychohistory, as he called it,
51
00:01:59.732 --> 00:02:04.732
which was about how you can somehow start to predict the future
52
00:02:04.880 --> 00:02:06.400
by looking at the past,
53
00:02:06.400 --> 00:02:09.077
and applying mathematical models on top of that.
54
00:02:09.077 --> 00:02:12.620
And I was so intrigued by that, when I was 15, 16, or 17,
55
00:02:12.620 --> 00:02:14.350
I was like, this is something amazing.
56
00:02:14.350 --> 00:02:15.850
This guy made it up by himself,
57
00:02:15.850 --> 00:02:18.450
and you know, what if this could be true?
58
00:02:18.450 --> 00:02:21.630
What if I could work with something like this in the future?
59
00:02:21.630 --> 00:02:23.650
And, you know, it took me a long time to get there
60
00:02:23.650 --> 00:02:25.599
because in between, school came,
61
00:02:25.599 --> 00:02:27.210
and then I started working as a consultant,
62
00:02:27.210 --> 00:02:30.000
and I was doing programming, I was in architecture,
63
00:02:30.000 --> 00:02:32.860
I was a business developer, I was a strategist.
64
00:02:32.860 --> 00:02:34.980
I did, you know, different startups and all that,
65
00:02:34.980 --> 00:02:38.548
and then I ended up in a fashion retailer.
66
00:02:38.548 --> 00:02:41.930
Then I got this opportunity to start looking at,
67
00:02:41.930 --> 00:02:45.110
advanced analytics and AI as a capability.
68
00:02:45.110 --> 00:02:48.180
I don't have a formal data science background,
69
00:02:48.180 --> 00:02:50.640
I do have an engineering background from engineering school,
70
00:02:50.640 --> 00:02:53.170
I even started business school parallel to that, you know,
71
00:02:53.170 --> 00:02:55.100
and then life brought me here.
72
00:02:55.100 --> 00:02:56.950
Somehow it's like the universe brought me
73
00:02:56.950 --> 00:03:00.107
to artificial intelligence, and somebody made fun of this:
74
00:03:00.107 --> 00:03:02.060
"Yeah, your name is Arti, that's already artificial.
75
00:03:02.060 --> 00:03:03.852
That's probably why," so, yeah.
76
00:03:03.852 --> 00:03:05.344
Oh yeah--
77
00:03:05.344 --> 00:03:07.317
That's pretty funny.
78
00:03:07.317 --> 00:03:09.495
Yeah, I always joke about that.
79
00:03:09.495 --> 00:03:11.570
Joking aside, I would say,
80
00:03:11.570 --> 00:03:15.420
like, what you described is a diverse and colorful background
81
00:03:15.420 --> 00:03:20.340
and experiences, you know, architect, engineer, strategist,
82
00:03:20.340 --> 00:03:22.650
how does it help you now?
83
00:03:22.650 --> 00:03:25.136
That diversity versus, if you'd been all focused,
84
00:03:25.136 --> 00:03:26.960
do you have a point of view on that?
85
00:03:26.960 --> 00:03:28.240
Yeah, absolutely.
86
00:03:28.240 --> 00:03:30.910
I think it has helped me tremendously.
87
00:03:30.910 --> 00:03:34.440
I think one of the major, you know,
88
00:03:34.440 --> 00:03:37.382
parts of introducing a new capability,
89
00:03:37.382 --> 00:03:42.382
such as AI, into an, you know, old industry like retail,
90
00:03:43.003 --> 00:03:44.960
which has done certain things in a certain way
91
00:03:44.960 --> 00:03:48.380
for so many years, is about shift of mindset.
92
00:03:48.380 --> 00:03:51.000
It's about transforming people's way of thinking;
93
00:03:51.000 --> 00:03:54.750
you know, it's very little AI, it's very little tech.
94
00:03:54.750 --> 00:03:58.640
I usually refer to it as 10% AI, 20% tech,
95
00:03:58.640 --> 00:04:01.210
but it's 70% people and processes.
96
00:04:01.210 --> 00:04:04.170
So you try to shift people into thinking differently,
97
00:04:04.170 --> 00:04:06.780
to ask different questions, to, you know,
98
00:04:06.780 --> 00:04:09.880
look at the world differently. And working as a consultant,
99
00:04:09.880 --> 00:04:12.380
I think that was one of the biggest helps I ever got,
100
00:04:12.380 --> 00:04:14.890
because as a consultant, you always make sure that,
101
00:04:14.890 --> 00:04:17.570
it's not about you shining, it's about your clients shining,
102
00:04:17.570 --> 00:04:20.070
and it's about making them the hero of the day,
103
00:04:20.070 --> 00:04:21.600
and it's always been about that.
104
00:04:21.600 --> 00:04:23.840
And even internally I would,
105
00:04:23.840 --> 00:04:25.767
you know, always tell my colleagues and my peers,
106
00:04:25.767 --> 00:04:27.610
"Listen, let's make sure that
107
00:04:27.610 --> 00:04:29.670
we are almost like internal consultants
108
00:04:29.670 --> 00:04:32.080
because it's about helping our colleagues
109
00:04:32.080 --> 00:04:33.650
to achieve their goals."
110
00:04:33.650 --> 00:04:34.600
Ultimately, if you're talking about
111
00:04:34.600 --> 00:04:37.099
transforming people's mindset, it's about the rhetoric;
112
00:04:37.099 --> 00:04:40.390
it's how you make them understand what you're trying to do
113
00:04:40.390 --> 00:04:42.243
and how you make them understand
114
00:04:42.243 --> 00:04:44.160
what you're trying to help them with.
115
00:04:44.160 --> 00:04:46.460
So, you know, it goes back to Greek rhetoric:
116
00:04:46.460 --> 00:04:47.821
ethos, pathos, logos.
117
00:04:47.821 --> 00:04:51.290
That's a long way from the ancient Greeks,
118
00:04:51.290 --> 00:04:53.610
from Greece to Sweden here, I guess.
119
00:04:53.610 --> 00:04:54.810
Is there something specific
120
00:04:54.810 --> 00:04:58.100
that AI has been able to help with at H&M?
121
00:04:58.100 --> 00:04:59.494
Absolutely, you know, we as a company
122
00:04:59.494 --> 00:05:01.340
have always been analytical.
123
00:05:01.340 --> 00:05:03.873
If you go back to how the company was brought up, even,
124
00:05:03.873 --> 00:05:06.010
you know, the company is a family company
125
00:05:06.010 --> 00:05:08.340
that started with the Persson family;
126
00:05:08.340 --> 00:05:11.650
the third generation is now chairman of the company.
127
00:05:11.650 --> 00:05:13.740
Erling, who started this back in 1947,
128
00:05:13.740 --> 00:05:15.200
was very analytical in
129
00:05:15.200 --> 00:05:16.398
this way already back in those days,
130
00:05:16.398 --> 00:05:18.620
you know, there's a story about
131
00:05:18.620 --> 00:05:20.770
something they called "Following the bags,"
132
00:05:20.770 --> 00:05:23.200
you know, when he was trying to enter a new city,
133
00:05:23.200 --> 00:05:24.892
he sent out people with pens and paper,
134
00:05:24.892 --> 00:05:26.320
and they would walk in the street,
135
00:05:26.320 --> 00:05:29.060
looking at people's bags and where they were going,
136
00:05:29.060 --> 00:05:31.450
and if they were crossing an intersection,
137
00:05:31.450 --> 00:05:33.557
on this side or that side, and then they'd understand,
138
00:05:33.557 --> 00:05:34.390
"Okay, on which side of the intersection
139
00:05:34.390 --> 00:05:35.960
should the store be?"
140
00:05:35.960 --> 00:05:38.285
We started to look at artificial intelligence
141
00:05:38.285 --> 00:05:43.011
back in 2016 and tried to understand what it entails for us
142
00:05:43.011 --> 00:05:47.310
as a retailer, and we entered a very small area
143
00:05:47.310 --> 00:05:49.300
and we did a proof of concept in that area,
144
00:05:49.300 --> 00:05:52.510
within the personalization to understand, you know,
145
00:05:52.510 --> 00:05:56.210
how we can enhance the communication, the personalization,
146
00:05:56.210 --> 00:05:57.670
the offering to our customer
147
00:05:57.670 --> 00:06:00.470
in a way, utilizing AI and analytics.
148
00:06:00.470 --> 00:06:03.431
And we saw that based upon the amount of data that we have,
149
00:06:03.431 --> 00:06:06.040
you know, the vast information that we have about our products,
150
00:06:06.040 --> 00:06:08.010
our sales, our customers,
151
00:06:08.010 --> 00:06:09.960
we can be really precise on that.
152
00:06:09.960 --> 00:06:13.740
That by itself was not pivotal, it was more of understanding
153
00:06:13.740 --> 00:06:16.490
how you change the mindset so that, you know,
154
00:06:16.490 --> 00:06:18.570
you want people to come in Monday morning
155
00:06:18.570 --> 00:06:20.380
and ask different questions.
156
00:06:20.380 --> 00:06:22.900
And to do that, you need to get them more analytical,
157
00:06:22.900 --> 00:06:26.650
and in order to penetrate that into the entire organization,
158
00:06:26.650 --> 00:06:29.260
we took an approach that was a little bit different,
159
00:06:29.260 --> 00:06:30.854
because a lot of people asked me back in those days,
160
00:06:30.854 --> 00:06:32.922
you know, "How did you pick your AI use cases?"
161
00:06:32.922 --> 00:06:35.440
And I said, "I don't have AI use cases.
162
00:06:35.440 --> 00:06:38.700
I have like business challenges that my colleagues have.
163
00:06:38.700 --> 00:06:41.390
I look at the portfolio of our projects and see:
164
00:06:41.390 --> 00:06:43.862
we have a lot of problems, we have a lot of challenges,
165
00:06:43.862 --> 00:06:46.526
and then those challenges have turned into projects
166
00:06:46.526 --> 00:06:48.770
that are going to change and fix those things,
167
00:06:48.770 --> 00:06:53.550
and here I can come in and help in a very small part of it."
168
00:06:53.550 --> 00:06:56.160
And I think that was also a very important part,
169
00:06:56.160 --> 00:06:58.300
how you infuse something new in an organization,
170
00:06:58.300 --> 00:07:01.103
because a lot of people, again, take one part,
171
00:07:01.103 --> 00:07:02.180
a very small part,
172
00:07:02.180 --> 00:07:04.570
and then they do deep dive on that small part,
173
00:07:04.570 --> 00:07:06.300
and they create, you know, a sexy app
174
00:07:06.300 --> 00:07:09.740
or cool customer-facing stuff, and that's fine.
175
00:07:09.740 --> 00:07:12.400
But then you make one part of your organization
176
00:07:12.400 --> 00:07:14.092
become very, very good at that,
177
00:07:14.092 --> 00:07:17.410
and then the rest are still, you know, lagging behind.
178
00:07:17.410 --> 00:07:20.540
I believe you need to elevate everybody a little bit.
179
00:07:20.540 --> 00:07:23.640
So instead of, it's almost like putting a tire on a car,
180
00:07:23.640 --> 00:07:25.757
right, you don't screw one bolt really hard
181
00:07:25.757 --> 00:07:29.130
and then do the next one. You just do every one by a little bit,
182
00:07:29.130 --> 00:07:30.740
and then tighten everything up.
183
00:07:30.740 --> 00:07:32.650
And I think that has been a really good approach
184
00:07:32.650 --> 00:07:34.050
for us to do that to everybody,
185
00:07:34.050 --> 00:07:36.610
and we're enhancing stuff at the beginning of the value chain
186
00:07:36.610 --> 00:07:39.700
with fashion forecasting, with quantification,
187
00:07:39.700 --> 00:07:42.310
how you quantify how many pieces you buy, to
188
00:07:42.310 --> 00:07:43.863
how you allocate the garments throughout
189
00:07:43.863 --> 00:07:47.235
the whole value chain, to how you put prices on them,
190
00:07:47.235 --> 00:07:49.750
and maybe also working with personalization
191
00:07:49.750 --> 00:07:52.860
and all that fancy customer-facing stuff as well.
192
00:07:52.860 --> 00:07:57.210
And for us, AI has not meant artificial intelligence,
193
00:07:57.210 --> 00:08:00.310
we have always talked about amplified intelligence,
194
00:08:00.310 --> 00:08:02.930
because we're amplifying an existing knowledge
195
00:08:02.930 --> 00:08:04.840
and competence of our colleagues.
196
00:08:04.840 --> 00:08:07.730
So, it doesn't have to be the AI that makes the decision,
197
00:08:07.730 --> 00:08:08.563
it could be a combination,
198
00:08:08.563 --> 00:08:12.530
and we see that when we do the combination of AI and machine
199
00:08:12.530 --> 00:08:16.250
and human, the gut feeling and the data, the art and science,
200
00:08:16.250 --> 00:08:18.300
that's when we get the most out of it.
201
00:08:18.300 --> 00:08:21.850
I see that a lot of things we do today are that mixture.
202
00:08:21.850 --> 00:08:23.320
I want to pick up on,
203
00:08:23.320 --> 00:08:26.500
what you referred to as amplified intelligence.
204
00:08:26.500 --> 00:08:28.340
I think it's very elegant,
205
00:08:28.340 --> 00:08:32.240
and it sort of also underlies the theme you started
206
00:08:32.240 --> 00:08:36.930
this conversation with around organization,
207
00:08:36.930 --> 00:08:41.792
people, you know, 70% is the people and organizations.
208
00:08:41.792 --> 00:08:42.625
Yes.
209
00:08:42.625 --> 00:08:44.070
And it also ties very well
210
00:08:44.070 --> 00:08:46.820
with the research we just did, which again
211
00:08:46.820 --> 00:08:48.860
talks about the role of human
212
00:08:48.860 --> 00:08:50.580
and sometimes the misunderstood
213
00:08:50.580 --> 00:08:52.760
or understated role of human.
214
00:08:52.760 --> 00:08:55.253
Comment more on that, and particularly,
215
00:08:56.110 --> 00:08:59.450
different ways that human and AI can interact.
216
00:08:59.450 --> 00:09:02.230
You know, across these different business problems.
217
00:09:02.230 --> 00:09:05.520
Right, right. When we did our first pilot,
218
00:09:05.520 --> 00:09:09.150
a test of utilizing AI and advanced analytics,
219
00:09:09.150 --> 00:09:12.160
on end-of-season sales, and that was my first attempt
220
00:09:12.160 --> 00:09:14.680
to try to use it on an actual use case
221
00:09:14.680 --> 00:09:18.380
and an actual business challenge. We saw very early
222
00:09:18.380 --> 00:09:20.860
that the AI could actually do
223
00:09:20.860 --> 00:09:22.911
much better than the human at putting the prices.
224
00:09:22.911 --> 00:09:25.840
I mean, the end-of-season sales result.
225
00:09:25.840 --> 00:09:28.050
An important part of that journey was to make sure
226
00:09:28.050 --> 00:09:30.370
that the teams, they were actually applying it,
227
00:09:30.370 --> 00:09:32.300
not my team, not the AI team,
228
00:09:32.300 --> 00:09:34.820
but actually the people that are working with the merchandise
229
00:09:34.820 --> 00:09:38.020
online in a very small selection of markets,
230
00:09:38.020 --> 00:09:41.560
I let them actually calculate what the outcome was.
231
00:09:41.560 --> 00:09:44.340
So they both were responsible for the test,
232
00:09:44.340 --> 00:09:45.380
for setting up the test,
233
00:09:45.380 --> 00:09:47.250
giving me all the constraints they wanted,
234
00:09:47.250 --> 00:09:50.000
and making sure that my algo was not getting anything else
235
00:09:50.000 --> 00:09:52.410
than what the merchandisers were getting on a daily basis,
236
00:09:52.410 --> 00:09:54.430
and then they'd try to understand,
237
00:09:54.430 --> 00:09:57.350
how much better it is by actually calculating it themselves.
238
00:09:57.350 --> 00:10:01.040
This is perfect, you know, it helps us to enhance our job.
239
00:10:01.040 --> 00:10:03.040
Let's do a next test, for the next season,
240
00:10:03.040 --> 00:10:05.140
mid-season sales, so a couple of months later,
241
00:10:05.140 --> 00:10:06.703
we ran the test again and made it a little bit larger.
242
00:10:06.703 --> 00:10:10.440
We added another country, another market, two warehouses,
243
00:10:10.440 --> 00:10:13.660
a selection of the products, and we tested that.
244
00:10:13.660 --> 00:10:15.710
We wanted to add a little more complexity to this.
245
00:10:15.710 --> 00:10:19.060
But one thing that we wanted to do was also add a third bucket.
246
00:10:19.060 --> 00:10:21.410
So we're still going to have a few products to look at,
247
00:10:21.410 --> 00:10:23.170
but we want to divide it, not in two buckets,
248
00:10:23.170 --> 00:10:25.058
but three buckets, where one bucket is the
249
00:10:25.058 --> 00:10:26.630
algo putting the price on,
250
00:10:26.630 --> 00:10:28.410
one bucket is the merchandiser,
251
00:10:28.410 --> 00:10:31.390
and one bucket is the algo putting the price on
252
00:10:31.390 --> 00:10:32.960
and the merchandiser coming in
253
00:10:32.960 --> 00:10:34.760
to tweak those prices.
254
00:10:34.760 --> 00:10:38.130
Because we saw there are some things that the algo isn't good at.
255
00:10:38.130 --> 00:10:40.077
And we found that very interesting, and said,
256
00:10:40.077 --> 00:10:41.870
"Absolutely, let's do this."
257
00:10:41.870 --> 00:10:44.308
And then, we did the mid-season test,
258
00:10:44.308 --> 00:10:47.840
and the outcome of that was even more interesting.
259
00:10:47.840 --> 00:10:49.107
Because the algo was again,
260
00:10:49.107 --> 00:10:51.540
you know, a few percentage points better than the human,
261
00:10:51.540 --> 00:10:54.990
but the algo in combination with human,
262
00:10:54.990 --> 00:10:58.010
was twice as good as the algo itself.
263
00:10:58.010 --> 00:10:59.490
And actually, that was when we
264
00:10:59.490 --> 00:11:01.530
started talking about amplified intelligence
265
00:11:01.530 --> 00:11:04.040
because we realized the machine by itself won't help us.
266
00:11:04.040 --> 00:11:05.770
It's a combination of the human and the machine,
267
00:11:05.770 --> 00:11:07.197
the gut feeling and the data.
268
00:11:07.197 --> 00:11:08.450
May have to change your name
269
00:11:08.450 --> 00:11:09.694
from Arti to Amfi.
270
00:11:09.694 --> 00:11:12.194
(Arti laughs)
271
00:11:13.100 --> 00:11:15.748
Yeah, oh my mom won't be happy.
272
00:11:15.748 --> 00:11:18.550
Yeah, your mom might not be happy there.
273
00:11:18.550 --> 00:11:21.260
You described a process where you went further in depth
274
00:11:21.260 --> 00:11:23.490
into a pricing process,
275
00:11:23.490 --> 00:11:25.947
but earlier you were talking about a process of saying,
276
00:11:25.947 --> 00:11:28.049
"Well, we need to do lots of things in different areas."
277
00:11:28.049 --> 00:11:28.882
Yes.
278
00:11:28.882 --> 00:11:30.790
How do you balance that,
279
00:11:30.790 --> 00:11:32.610
tightening all the bolts
280
00:11:32.610 --> 00:11:34.648
versus tightening this one bolt harder?
281
00:11:34.648 --> 00:11:36.350
Because it sounded like in that example,
282
00:11:36.350 --> 00:11:38.681
you were doing some more tightening of one bolt.
283
00:11:38.681 --> 00:11:40.370
How do you balance those out?
284
00:11:40.370 --> 00:11:42.480
Well, you didn't hear the whole story.
285
00:11:42.480 --> 00:11:43.410
Oh, there's more, okay.
286
00:11:43.410 --> 00:11:44.537
Already from the first tightening
287
00:11:44.537 --> 00:11:46.565
of the first bolt, so the first test that I did,
288
00:11:46.565 --> 00:11:49.340
which was a good result, but it was not enough,
289
00:11:49.340 --> 00:11:51.500
already then I was happy about those results,
290
00:11:51.500 --> 00:11:53.698
and I took that result and went to another part of the business
291
00:11:53.698 --> 00:11:56.410
and said, "Listen, we did this with these guys.
292
00:11:56.410 --> 00:11:58.100
They were really happy about the result.
293
00:11:58.100 --> 00:12:01.630
We saw that we were X percent better on net sales
294
00:12:01.630 --> 00:12:05.470
by using an algo which we created in four and a half weeks,
295
00:12:05.470 --> 00:12:08.350
and it has a huge impact on the business.
296
00:12:08.350 --> 00:12:10.390
Do you want us to help you look at this area?
297
00:12:10.390 --> 00:12:12.470
Because I know you have a problem here."
298
00:12:12.470 --> 00:12:14.300
And then we brought in the data scientists,
299
00:12:14.300 --> 00:12:16.020
and we put the experiments around that.
300
00:12:16.020 --> 00:12:17.760
And then when we started that discussion
301
00:12:17.760 --> 00:12:20.390
and that conversation for that specific project,
302
00:12:20.390 --> 00:12:22.450
it was connected to another part of their business,
303
00:12:22.450 --> 00:12:24.624
and they said, "Hey, if you're going to make a change there,
304
00:12:24.624 --> 00:12:27.040
maybe we should make a change here as well.
305
00:12:27.040 --> 00:12:28.667
Arti, do you want to help us in this case?"
306
00:12:28.667 --> 00:12:30.400
"Yes, please let me help."
307
00:12:30.400 --> 00:12:32.384
And then we started doing that, and then that follows,
308
00:12:32.384 --> 00:12:35.970
and by the end of the year, suddenly we had those eight,
309
00:12:35.970 --> 00:12:38.580
seven, whatever use cases that we had,
310
00:12:38.580 --> 00:12:41.570
and then we saw that we're actually applying this
311
00:12:41.570 --> 00:12:43.380
throughout the whole value chain.
312
00:12:43.380 --> 00:12:46.950
So then, meanwhile, you start tightening each of the bolts
313
00:12:46.950 --> 00:12:47.930
a little bit more and more.
314
00:12:47.930 --> 00:12:48.870
It's like a pit crew.
315
00:12:48.870 --> 00:12:50.300
Yeah (chuckles) exactly.
316
00:12:50.300 --> 00:12:52.300
And it's the whole idea of being agile, right?
317
00:12:52.300 --> 00:12:54.810
You start small, you dream big,
318
00:12:54.810 --> 00:12:57.110
you start small and you scale fast.
319
00:12:57.110 --> 00:12:58.464
So, you start small with something here,
320
00:12:58.464 --> 00:13:00.396
and then you start the next wave and the next wave,
321
00:13:00.396 --> 00:13:02.340
the next wave, and all these waves
322
00:13:02.340 --> 00:13:04.610
have a cycle of starting small,
323
00:13:04.610 --> 00:13:07.150
testing a little bit larger, failing, pivoting,
324
00:13:07.150 --> 00:13:08.916
and testing more, failing, pivoting, learning,
325
00:13:08.916 --> 00:13:11.140
and then goes on and goes on, and you know,
326
00:13:11.140 --> 00:13:14.180
organizations such as ours, we are huge, like 5,000 stores,
327
00:13:14.180 --> 00:13:17.530
70 plus countries, 180,000 people, you know.
328
00:13:17.530 --> 00:13:19.980
So there's a huge amount of things when you start
329
00:13:19.980 --> 00:13:21.890
to industrialize that.
330
00:13:21.890 --> 00:13:25.780
And AI doesn't mean anything if you don't industrialize it.
331
00:13:25.780 --> 00:13:27.710
And Arti, as you were talking about
332
00:13:27.710 --> 00:13:31.970
industrializing AI and getting real scale out of it,
333
00:13:31.970 --> 00:13:34.390
in the context of amplified intelligence,
334
00:13:34.390 --> 00:13:35.223
you know.
Yes.
335
00:13:35.223 --> 00:13:38.780
Machines, it's easy to get them to play with humans
336
00:13:38.780 --> 00:13:40.520
because they don't have a choice.
337
00:13:40.520 --> 00:13:45.130
How do you get the human to play nice with the machine
338
00:13:45.130 --> 00:13:48.500
and what's the kind of pushback
339
00:13:48.500 --> 00:13:49.976
that you would expect or you get,
340
00:13:49.976 --> 00:13:52.199
and how do you deal with that?
341
00:13:52.199 --> 00:13:55.850
So, it's different for, you know,
342
00:13:55.850 --> 00:13:58.918
different levels of people in the organization.
343
00:13:58.918 --> 00:13:59.940
Mm-hmm.
344
00:13:59.940 --> 00:14:01.380
One thing I found out is that,
345
00:14:01.380 --> 00:14:03.663
maybe it's also about the culture in the organization as
346
00:14:03.663 --> 00:14:06.580
well, right? So we come from a company with a culture of,
347
00:14:06.580 --> 00:14:09.264
always being entrepreneurial, always seeking to,
348
00:14:09.264 --> 00:14:11.440
you know, improve ourselves.
349
00:14:11.440 --> 00:14:13.470
And that has been part of the organization,
350
00:14:13.470 --> 00:14:16.534
it's in our DNA somehow to always try to be better.
351
00:14:16.534 --> 00:14:18.820
And that's why also we talked about,
352
00:14:18.820 --> 00:14:20.870
you know, the combination that we're just taking
353
00:14:20.870 --> 00:14:22.230
a small part of the work,
354
00:14:22.230 --> 00:14:24.750
and we're actually internal consultants.
355
00:14:24.750 --> 00:14:26.700
So, when they own that,
356
00:14:26.700 --> 00:14:29.130
it also makes them feel pride in what they're doing.
357
00:14:29.130 --> 00:14:32.080
We really went for this DevOps model
358
00:14:32.080 --> 00:14:34.480
where we put people into teams
359
00:14:34.480 --> 00:14:35.548
where you have a use case lead,
360
00:14:35.548 --> 00:14:37.101
and then you have business experts,
361
00:14:37.101 --> 00:14:39.950
and you have machine learning engineers, data engineers,
362
00:14:39.950 --> 00:14:42.120
software engineers, UX designers,
363
00:14:42.120 --> 00:14:46.230
all working together on a daily basis in a very agile way
364
00:14:46.230 --> 00:14:48.930
by sitting together in the same office,
365
00:14:48.930 --> 00:14:50.460
having morning stand-ups.
366
00:14:50.460 --> 00:14:52.310
So they were part of the whole development team,
367
00:14:52.310 --> 00:14:53.856
and my job was to make sure that
368
00:14:53.856 --> 00:14:56.570
my other colleagues also understand the impact,
369
00:14:56.570 --> 00:14:59.570
and then utilize this to, you know, realize their value
370
00:14:59.570 --> 00:15:00.850
in their part of the company,
371
00:15:00.850 --> 00:15:01.870
their part of our organization,
372
00:15:01.870 --> 00:15:03.640
making our company continue to thrive
373
00:15:03.640 --> 00:15:05.310
and become even better.
374
00:15:05.310 --> 00:15:07.150
I think there's an important lesson in there
375
00:15:07.150 --> 00:15:09.764
for everyone as well, including for CEOs
376
00:15:09.764 --> 00:15:11.317
and heads of businesses.
377
00:15:11.317 --> 00:15:15.920
We are underscoring the importance of focus on value versus
378
00:15:15.920 --> 00:15:19.700
a sort of single-minded, tech-centric
379
00:15:19.700 --> 00:15:21.265
focus on building capabilities,
380
00:15:21.265 --> 00:15:25.520
and building more and more capabilities, and the need for
381
00:15:25.520 --> 00:15:28.280
the business and the users and the people
382
00:15:28.280 --> 00:15:31.530
to come in from the beginning to design that.
383
00:15:31.530 --> 00:15:33.443
Arti, thank you so much for making time.
384
00:15:33.443 --> 00:15:35.480
Yes, thanks for taking the time today.
385
00:15:35.480 --> 00:15:36.769
Thank you guys.
386
00:15:36.769 --> 00:15:39.519
(ethereal music)
387
00:15:40.880 --> 00:15:42.680
Sam, I thought that was such a stimulating
388
00:15:42.680 --> 00:15:45.282
conversation with Arti, let's quickly recap.
389
00:15:45.282 --> 00:15:48.230
Sounds good. Shervin, it's interesting, you know,
390
00:15:48.230 --> 00:15:49.270
we talked with Porsche,
391
00:15:49.270 --> 00:15:52.501
we never even talked about tires and hubcaps and tightening,
392
00:15:52.501 --> 00:15:55.730
(chuckles) tightening bolts, but when we turn to fashion,
393
00:15:55.730 --> 00:15:58.340
he talked about the importance of doing that,
394
00:15:58.340 --> 00:16:00.750
and his analogy was very auto-related.
395
00:16:00.750 --> 00:16:03.130
That's right, and when we talked with Porsche,
396
00:16:03.130 --> 00:16:04.140
if you're talking about coffee,
397
00:16:04.140 --> 00:16:05.330
(chuckles) Exactly.
398
00:16:05.330 --> 00:16:08.580
Anyway, you know, joking aside,
399
00:16:08.580 --> 00:16:10.550
I think the common thing we're hearing,
400
00:16:10.550 --> 00:16:14.260
is the importance of archetypal problems
401
00:16:14.260 --> 00:16:16.760
and translating or transferring these learnings
402
00:16:16.760 --> 00:16:20.490
across problems, which, interestingly enough, is a
403
00:16:20.490 --> 00:16:22.833
topic in AI itself like transfer learning.
404
00:16:22.833 --> 00:16:26.350
That's not exactly what we're talking about here, though.
405
00:16:26.350 --> 00:16:29.120
I also thought it was quite interesting
406
00:16:29.120 --> 00:16:31.527
how from the beginning Arti said,
407
00:16:31.527 --> 00:16:35.758
"Look, it's about changing the mindset of the people,
408
00:16:35.758 --> 00:16:38.897
and it's about the organization, and it's about."
409
00:16:38.897 --> 00:16:42.030
You know, he talked about it as amplified intelligence,
410
00:16:42.030 --> 00:16:45.260
you know, bringing human and AI together
411
00:16:45.260 --> 00:16:47.866
rather than all one or the other.
412
00:16:47.866 --> 00:16:49.540
I think if we pulled out the words,
413
00:16:49.540 --> 00:16:51.321
he said "learning" more than anything else.
414
00:16:51.321 --> 00:16:53.020
That's right, that's right.
415
00:16:53.020 --> 00:16:55.000
The other important point he made,
416
00:16:55.000 --> 00:16:57.610
which I think might be lost on many,
417
00:16:57.610 --> 00:16:59.810
is that you can't just start with technology
418
00:16:59.810 --> 00:17:02.780
and capabilities and, you know,
419
00:17:02.780 --> 00:17:03.898
reminds me of "Field of Dreams,"
420
00:17:03.898 --> 00:17:06.250
you know, "If you build it, they will come."
421
00:17:06.250 --> 00:17:07.610
I think this is--
He said the opposite.
422
00:17:07.610 --> 00:17:09.130
Exactly the opposite.
423
00:17:09.130 --> 00:17:12.090
You can't build it and wait for them to come,
424
00:17:12.090 --> 00:17:14.110
you actually have to build it together.
425
00:17:14.110 --> 00:17:16.945
You need to get them first, and you've got to build it
426
00:17:16.945 --> 00:17:20.084
together. I think it's an important lesson here.
427
00:17:20.084 --> 00:17:21.910
Yeah, he emphasized the value first
428
00:17:21.910 --> 00:17:23.593
and then structure, and that was important.
429
00:17:23.593 --> 00:17:25.610
There was an element, too,
430
00:17:25.610 --> 00:17:28.570
of weakest-link thinking that came through there;
431
00:17:28.570 --> 00:17:30.990
he talked about lots of different places
432
00:17:30.990 --> 00:17:32.770
that they were using artificial intelligence,
433
00:17:32.770 --> 00:17:34.620
and he didn't actually use this phrase,
434
00:17:34.620 --> 00:17:37.070
but part of the idea was that
435
00:17:37.070 --> 00:17:39.258
it doesn't do any good to strengthen one area extensively
436
00:17:39.258 --> 00:17:40.980
but not another one.
437
00:17:40.980 --> 00:17:43.660
And so, you know, that kind of speaks to
438
00:17:43.660 --> 00:17:46.870
the tension between value and structure;
439
00:17:46.870 --> 00:17:49.420
he wanted everything progressing at the same time.
440
00:17:49.420 --> 00:17:51.005
He wanted different parts of the business
441
00:17:51.005 --> 00:17:53.130
to be progressing at the same time as well,
442
00:17:53.130 --> 00:17:55.540
not too deep in one area,
443
00:17:55.540 --> 00:17:57.600
not just completely shallow everywhere either
444
00:17:57.600 --> 00:18:00.057
and so, almost everything seemed to be about a balance.
445
00:18:00.057 --> 00:18:01.787
You don't start by saying,
446
00:18:01.787 --> 00:18:03.240
"I've got to move everybody
447
00:18:03.240 --> 00:18:05.660
all into one center of excellence.
448
00:18:05.660 --> 00:18:08.280
Then I'm going to go build this baseball field,
449
00:18:08.280 --> 00:18:11.738
and then everybody will come and play," you know.
450
00:18:11.738 --> 00:18:13.950
Right, the other part of that was that he said,
451
00:18:13.950 --> 00:18:15.860
they didn't have AI use cases.
452
00:18:15.860 --> 00:18:17.440
And that sort of structure first
453
00:18:17.440 --> 00:18:21.510
would lead you toward thinking of AI use cases first;
454
00:18:21.510 --> 00:18:22.950
he said, "Not AI use cases."
455
00:18:22.950 --> 00:18:26.410
He said, "Business challenges that we sometimes
456
00:18:26.410 --> 00:18:27.514
and often solve with AI."
457
00:18:27.514 --> 00:18:28.988
That's right, that's right.
458
00:18:28.988 --> 00:18:30.930
Well, that's all the time we have for today.
459
00:18:30.930 --> 00:18:34.007
Join us next time with our last episode for this season.
460
00:18:34.007 --> 00:18:36.182
We'll be talking with Kay Firth-Butterfield,
461
00:18:36.182 --> 00:18:38.130
from the World Economic Forum.
462
00:18:38.130 --> 00:18:39.252
See you next time, Shervin.
463
00:18:39.252 --> 00:18:40.567
You too.
464
00:18:40.567 --> 00:18:42.985
(ethereal music)
465
00:18:42.985 --> 00:18:46.120
Thanks for listening to Me, Myself, and AI.
466
00:18:46.120 --> 00:18:47.730
If you're enjoying the show,
467
00:18:47.730 --> 00:18:49.870
take a minute to write us a review.
468
00:18:49.870 --> 00:18:51.470
If you send us a screenshot,
469
00:18:51.470 --> 00:18:53.980
we'll send you a collection of MIT SMR's
470
00:18:53.980 --> 00:18:56.420
best articles on artificial intelligence,
471
00:18:56.420 --> 00:18:58.310
free for a limited time.
472
00:18:58.310 --> 00:19:02.823
Send your review screenshot to smrfeedback@mit.edu.
473
00:19:02.823 --> 00:19:05.573
(ethereal music)