WEBVTT
1
00:00:01.270 --> 00:00:03.580
When you work in an established, successful company,
2
00:00:03.580 --> 00:00:06.100
current managers already know a ton.
3
00:00:06.100 --> 00:00:08.530
Still, AI solutions can offer insights
4
00:00:08.530 --> 00:00:10.690
to even experienced managers
5
00:00:10.690 --> 00:00:13.483
if you can get the humans and the AI to work together.
6
00:00:14.420 --> 00:00:15.760
In this episode,
7
00:00:15.760 --> 00:00:18.130
Prakhar Mehrotra describes some moments
8
00:00:18.130 --> 00:00:21.790
where human and AI efforts came together for Walmart.
9
00:00:21.790 --> 00:00:22.980
And even more fun,
10
00:00:22.980 --> 00:00:24.660
he describes the hard work that it took
11
00:00:24.660 --> 00:00:26.110
to make those moments happen.
12
00:00:27.600 --> 00:00:29.680
Welcome to Me, Myself and AI,
13
00:00:29.680 --> 00:00:32.603
a podcast on artificial intelligence in business.
14
00:00:33.970 --> 00:00:34.803
Each episode,
15
00:00:34.803 --> 00:00:37.083
we introduce you to someone innovating with AI.
16
00:00:38.130 --> 00:00:39.480
I'm Sam Ransbotham,
17
00:00:39.480 --> 00:00:42.760
Professor of Information Systems at Boston College.
18
00:00:42.760 --> 00:00:45.710
I'm also the guest editor for the AI and Business Strategy
19
00:00:45.710 --> 00:00:49.610
Big Idea Program at MIT Sloan Management Review.
20
00:00:49.610 --> 00:00:53.190
And I'm Shervin Khodabandeh, Senior Partner with BCG,
21
00:00:53.190 --> 00:00:56.990
and I co-lead BCG's AI practice in North America.
22
00:00:56.990 --> 00:01:01.990
And together BCG and MIT SMR have been researching AI
23
00:01:02.000 --> 00:01:03.370
for four years,
24
00:01:03.370 --> 00:01:05.660
interviewing hundreds of practitioners
25
00:01:05.660 --> 00:01:07.770
and surveying thousands of companies
26
00:01:07.770 --> 00:01:11.070
on what it takes to build, deploy,
27
00:01:11.070 --> 00:01:13.650
and scale AI capabilities
28
00:01:13.650 --> 00:01:16.523
and really transform the way organizations operate.
29
00:01:17.440 --> 00:01:19.470
Shervin, I'm looking forward to kicking off our series
30
00:01:19.470 --> 00:01:21.030
with today's episode.
31
00:01:21.030 --> 00:01:22.890
Thanks, Sam. Me too.
32
00:01:22.890 --> 00:01:25.450
Our guest today is Prakhar Mehrotra,
33
00:01:25.450 --> 00:01:28.920
Vice President of Machine Learning at Walmart.
34
00:01:28.920 --> 00:01:32.380
He's joining us from Sunnyvale, California.
35
00:01:32.380 --> 00:01:35.710
Prakhar, thank you so much for speaking with us today.
36
00:01:35.710 --> 00:01:37.130
Could you introduce yourself
37
00:01:37.130 --> 00:01:39.950
and share a bit about what you do?
38
00:01:39.950 --> 00:01:41.303
Hi, I'm Prakhar Mehrotra.
39
00:01:41.303 --> 00:01:45.630
I'm the Vice President of Machine Learning at Walmart US.
40
00:01:45.630 --> 00:01:49.560
My responsibilities include building algorithms
41
00:01:49.560 --> 00:01:51.020
that will power the decision making
42
00:01:51.020 --> 00:01:54.280
of our merchants in core areas like assortment,
43
00:01:54.280 --> 00:01:57.180
pricing, inventory management, financial planning,
44
00:01:57.180 --> 00:01:59.690
all aspects of merchandising.
45
00:01:59.690 --> 00:02:03.230
I lead a team of 80 people.
46
00:02:03.230 --> 00:02:04.850
It's a full-stack team of data scientists,
47
00:02:04.850 --> 00:02:06.350
data analysts, and data engineers.
48
00:02:07.530 --> 00:02:08.983
That's my role at Walmart.
49
00:02:09.970 --> 00:02:11.410
We're particularly interested in you
50
00:02:11.410 --> 00:02:14.470
because you're a top expert in artificial intelligence.
51
00:02:14.470 --> 00:02:17.070
Can you tell us how Walmart is using artificial intelligence
52
00:02:17.070 --> 00:02:18.750
to improve its business?
53
00:02:18.750 --> 00:02:22.470
Walmart wants to use AI to serve our consumers better,
54
00:02:22.470 --> 00:02:26.090
right? And so my role is to make that happen.
55
00:02:26.090 --> 00:02:29.770
And the expertise that I gained from Uber and Twitter
56
00:02:29.770 --> 00:02:33.960
and my graduate studies have helped me achieve that dream.
57
00:02:33.960 --> 00:02:37.070
And the secret sauce that I realized was that
58
00:02:37.070 --> 00:02:38.770
AI will be successful in companies
59
00:02:38.770 --> 00:02:40.860
if we partner with business closely
60
00:02:40.860 --> 00:02:43.830
and take business stakeholders along the journey.
61
00:02:43.830 --> 00:02:46.420
It's not just about algorithms, it's about business.
62
00:02:46.420 --> 00:02:50.270
Because the eventual goal of AI is to improve the business.
63
00:02:50.270 --> 00:02:53.190
I'm responsible for all the machine learning algorithmic developments
64
00:02:53.190 --> 00:02:55.890
for core areas of merchandising,
65
00:02:55.890 --> 00:02:58.610
which include how do you price something?
66
00:02:58.610 --> 00:03:00.880
How do you select the right assortment?
67
00:03:00.880 --> 00:03:03.440
Replenishment strategies, forecasting and planning.
68
00:03:03.440 --> 00:03:06.470
So all the core aspects of merchandising
69
00:03:06.470 --> 00:03:08.490
are what we are trying to use machine learning
70
00:03:08.490 --> 00:03:09.393
and AI towards.
71
00:03:10.320 --> 00:03:13.490
So Prakhar, how did you get started on your path in AI?
72
00:03:13.490 --> 00:03:15.230
What are some of the more challenging aspects
73
00:03:15.230 --> 00:03:17.770
of implementing AI in your work now?
74
00:03:17.770 --> 00:03:19.887
So I started my career at Twitter,
75
00:03:19.887 --> 00:03:21.730
where I was a data scientist.
76
00:03:21.730 --> 00:03:24.170
I picked up all the fundamentals of scaling
77
00:03:24.170 --> 00:03:25.670
and engineering there.
78
00:03:25.670 --> 00:03:29.380
And Uber gave me a massive break;
79
00:03:29.380 --> 00:03:30.570
it was like a juggernaut, right?
80
00:03:30.570 --> 00:03:32.310
Like it was rolling.
81
00:03:32.310 --> 00:03:34.720
Uber taught me what disruption is.
82
00:03:34.720 --> 00:03:36.940
And then when I joined Walmart,
83
00:03:36.940 --> 00:03:39.430
I had learned something about AI.
84
00:03:39.430 --> 00:03:40.430
Like I knew how to use it;
85
00:03:40.430 --> 00:03:42.400
I had gotten some experience with AI management algorithms.
86
00:03:42.400 --> 00:03:45.160
I had picked up on the fundamentals of AI.
87
00:03:45.160 --> 00:03:48.040
And so the most challenging part of
88
00:03:48.910 --> 00:03:51.930
the work at Walmart, at least on the store side, is that
89
00:03:51.930 --> 00:03:53.670
there are no labels in the data.
90
00:03:53.670 --> 00:03:54.900
There are no tags.
91
00:03:54.900 --> 00:03:57.080
When a customer shops in our store,
92
00:03:57.080 --> 00:03:59.553
all we record, all we have information about,
93
00:03:59.553 --> 00:04:01.910
is that a transaction was made.
94
00:04:01.910 --> 00:04:04.880
Unlike social media, or an app,
95
00:04:04.880 --> 00:04:06.720
or a Netflix type of environment,
96
00:04:06.720 --> 00:04:08.990
a recommendation type of environment where you know,
97
00:04:08.990 --> 00:04:11.840
or can track, the history of a consumer
98
00:04:11.840 --> 00:04:13.300
and can learn from it,
99
00:04:13.300 --> 00:04:15.630
that environment is not present on the store side.
100
00:04:15.630 --> 00:04:18.270
We don't know what items customers are picking up
101
00:04:18.270 --> 00:04:20.360
and when they're making these choices.
102
00:04:20.360 --> 00:04:23.130
So the job of algorithms is actually a lot harder
103
00:04:23.130 --> 00:04:25.120
as they have to infer all this,
104
00:04:25.120 --> 00:04:27.840
as opposed to learning directly from the data, right?
105
00:04:27.840 --> 00:04:31.400
And so inference became a big part of the work at Walmart,
106
00:04:31.400 --> 00:04:34.800
and then translating that inference into actionable insight
107
00:04:34.800 --> 00:04:37.220
that we can make a forward-looking decision on.
108
00:04:37.220 --> 00:04:38.430
And so that was
109
00:04:38.430 --> 00:04:40.860
a key challenge at Walmart.
110
00:04:40.860 --> 00:04:42.550
How does that feel when you're
111
00:04:42.550 --> 00:04:45.080
going from a world where everything's highly quantified
112
00:04:45.080 --> 00:04:48.460
to one where everything is abstract,
113
00:04:48.460 --> 00:04:50.580
but you're still asked to make a decision?
114
00:04:50.580 --> 00:04:53.525
The honest answer is you actually feel AI is a bubble.
115
00:04:53.525 --> 00:04:54.760
(laughs)
116
00:04:54.760 --> 00:04:57.900
Right? The power of the promise of AI that we have,
117
00:04:57.900 --> 00:04:59.670
that it can solve anything.
118
00:04:59.670 --> 00:05:02.020
And that I can write and deploy an algorithm
119
00:05:02.020 --> 00:05:04.360
and get a very quick answer tomorrow.
120
00:05:04.360 --> 00:05:06.500
That starts to get challenged, right?
121
00:05:06.500 --> 00:05:08.730
Because when you're doing something like inference,
122
00:05:08.730 --> 00:05:10.100
or when you're trying to identify
123
00:05:10.100 --> 00:05:12.890
these hidden patterns
124
00:05:12.890 --> 00:05:14.610
that are not obvious.
125
00:05:14.610 --> 00:05:15.750
There are multiple challenges.
126
00:05:15.750 --> 00:05:18.060
There are challenges for a scientist to learn
127
00:05:18.060 --> 00:05:19.870
them from the available data that we have.
128
00:05:19.870 --> 00:05:20.703
And then also,
129
00:05:20.703 --> 00:05:22.620
how will you be able to explain to the end user
130
00:05:22.620 --> 00:05:24.090
why the algorithm is inferring
131
00:05:24.090 --> 00:05:25.480
what it is trying to infer?
132
00:05:25.480 --> 00:05:28.620
I think my education, my graduate studies at Caltech,
133
00:05:28.620 --> 00:05:31.480
really helped me think things through;
134
00:05:31.480 --> 00:05:34.170
it taught us how to think about a problem.
135
00:05:34.170 --> 00:05:36.320
I still remember my candidacy exam,
136
00:05:36.320 --> 00:05:39.310
where my advisor literally asked me a question
137
00:05:39.310 --> 00:05:41.100
that had nothing to do with my PhD
138
00:05:41.100 --> 00:05:42.730
and wanted me to defend it.
139
00:05:42.730 --> 00:05:45.700
And this was a pretty routine thing in aeronautics at Caltech.
140
00:05:45.700 --> 00:05:50.210
And so that type of training helped me move between areas.
141
00:05:50.210 --> 00:05:52.670
So when I decided
142
00:05:52.670 --> 00:05:55.650
to change fields and venture out into Silicon Valley,
143
00:05:55.650 --> 00:05:56.740
social media, Twitter,
144
00:05:56.740 --> 00:05:59.260
that was a completely different ball game, right?
145
00:05:59.260 --> 00:06:01.890
Like, along the way, I met people who were
146
00:06:01.890 --> 00:06:03.730
not judging me,
147
00:06:03.730 --> 00:06:04.900
but they were more like,
148
00:06:04.900 --> 00:06:07.420
you know what, we'll invest in you and we'll teach you.
149
00:06:07.420 --> 00:06:08.960
And I was able to connect the dots
150
00:06:08.960 --> 00:06:12.290
between science and business.
151
00:06:12.290 --> 00:06:16.070
Prakhar, you talked about the scientist's job
152
00:06:16.070 --> 00:06:21.070
being more challenging in an environment where, you know,
153
00:06:21.440 --> 00:06:25.050
data's not tagged and inferences have to be made.
154
00:06:25.050 --> 00:06:30.050
What are your sort of observations around the attributes
155
00:06:31.420 --> 00:06:33.650
of a good scientist in business
156
00:06:33.650 --> 00:06:38.083
versus a good scientist in academia or in a research lab?
157
00:06:39.080 --> 00:06:41.070
Yeah, that's a good question.
158
00:06:41.070 --> 00:06:42.240
At the end of the day,
159
00:06:42.240 --> 00:06:45.330
if a scientist has decided to spend time in industry
160
00:06:45.330 --> 00:06:47.670
and partner with
161
00:06:47.670 --> 00:06:49.320
and work in companies like Walmart,
162
00:06:49.320 --> 00:06:51.630
or oil companies, all these,
163
00:06:51.630 --> 00:06:54.450
I would say non-traditional software companies
164
00:06:54.450 --> 00:06:56.030
where the core business is different,
165
00:06:56.030 --> 00:06:58.080
the key element becomes that you should be able
166
00:06:58.080 --> 00:07:00.160
to explain what you are doing.
167
00:07:00.160 --> 00:07:02.630
You should take the business user on the journey.
168
00:07:02.630 --> 00:07:03.463
We usually say
169
00:07:03.463 --> 00:07:04.640
that a data scientist
170
00:07:04.640 --> 00:07:06.760
should be able to tell a good story,
171
00:07:06.760 --> 00:07:08.810
but the story has many parts, right?
172
00:07:08.810 --> 00:07:11.400
When I first started working at Walmart,
173
00:07:11.400 --> 00:07:13.810
I actually spent my first three months in stores,
174
00:07:13.810 --> 00:07:16.120
just trying to understand the terminology.
175
00:07:16.120 --> 00:07:18.160
What does VPI mean?
176
00:07:18.160 --> 00:07:20.410
What are the core metrics like?
177
00:07:20.410 --> 00:07:21.530
How do we do things?
178
00:07:21.530 --> 00:07:25.660
And that basically got me credibility with the leadership.
179
00:07:25.660 --> 00:07:26.530
That all sounds very easy.
180
00:07:26.530 --> 00:07:28.188
I think we can wrap from here.
181
00:07:28.188 --> 00:07:29.460
(laughs)
182
00:07:29.460 --> 00:07:31.210
One thing I also came to realize
183
00:07:31.210 --> 00:07:34.060
is that when you're taking big bets or moonshots
184
00:07:34.060 --> 00:07:35.700
in an enterprise setting,
185
00:07:35.700 --> 00:07:38.520
usually there's a lot of optimism on day one,
186
00:07:38.520 --> 00:07:40.760
but you have to deliver something quickly
187
00:07:40.760 --> 00:07:42.310
to retain their trust.
188
00:07:42.310 --> 00:07:43.560
And so there's a delicate balance.
189
00:07:43.560 --> 00:07:45.730
So that transition was very challenging for me,
190
00:07:45.730 --> 00:07:48.330
where in my previous roles,
191
00:07:48.330 --> 00:07:49.860
everybody believed in data science
192
00:07:49.860 --> 00:07:50.693
and it was, here we are, we have to go do this.
193
00:07:50.693 --> 00:07:52.680
And it was more about execution
194
00:07:52.680 --> 00:07:54.670
and writing code and processing data faster,
195
00:07:54.670 --> 00:07:57.450
and moving fast and breaking things, right?
196
00:07:57.450 --> 00:08:00.280
That's the mantra in Silicon Valley, like,
197
00:08:00.280 --> 00:08:01.860
it was kind of slightly different
198
00:08:01.860 --> 00:08:02.930
in my current role at Walmart,
199
00:08:02.930 --> 00:08:05.230
where I have to act as a thought partner
200
00:08:05.230 --> 00:08:06.750
and show them what's not possible.
201
00:08:06.750 --> 00:08:08.360
And show them what is possible,
202
00:08:08.360 --> 00:08:09.850
and also tell them the risks.
203
00:08:09.850 --> 00:08:12.370
So balancing between overselling
204
00:08:12.370 --> 00:08:14.020
and showing what is possible.
205
00:08:14.020 --> 00:08:16.550
That was also a big challenge for me as a leader.
206
00:08:16.550 --> 00:08:19.650
What's the parallel difficult challenge at Walmart
207
00:08:19.650 --> 00:08:21.233
that's not execution?
208
00:08:22.170 --> 00:08:23.670
Explaining what is possible
209
00:08:23.670 --> 00:08:28.230
and deciding on the big bets: this is the power of AI.
210
00:08:28.230 --> 00:08:30.080
Because I think
211
00:08:30.080 --> 00:08:31.550
it's not unique to Walmart.
212
00:08:31.550 --> 00:08:34.120
It's common to any industry, like healthcare,
213
00:08:34.120 --> 00:08:36.730
or wherever you have this human expertise
214
00:08:36.730 --> 00:08:37.840
where humans have expertise
215
00:08:37.840 --> 00:08:40.700
because of the way we think or the way we are wired,
216
00:08:40.700 --> 00:08:43.070
and the way we can deal with uncertainty
217
00:08:43.070 --> 00:08:44.460
or unforeseen situations, right?
218
00:08:44.460 --> 00:08:47.960
Like, where the cost of making a mistake is very heavy.
219
00:08:47.960 --> 00:08:49.160
There's a penalization cost.
220
00:08:49.160 --> 00:08:50.030
Like in some sense,
221
00:08:50.030 --> 00:08:52.700
I was able to find this parallel with aerospace
222
00:08:52.700 --> 00:08:56.630
where once a rocket is launched, it has to land on Mars.
223
00:08:56.630 --> 00:08:58.380
Right. There is no coming back, right?
224
00:08:58.380 --> 00:09:00.500
And so while most of the data science training
225
00:09:00.500 --> 00:09:03.070
that I had in my previous roles or jobs was like,
226
00:09:03.070 --> 00:09:04.877
you can do millions of experiments, right?
227
00:09:04.877 --> 00:09:07.353
That's not the paradigm here.
228
00:09:08.250 --> 00:09:11.960
And your algorithm is not the only decision maker, right?
229
00:09:11.960 --> 00:09:14.280
Like a good example might be
230
00:09:14.280 --> 00:09:15.750
how we decide assortment, right?
231
00:09:15.750 --> 00:09:18.700
Or how we think about what items should go into a store.
232
00:09:18.700 --> 00:09:20.110
That involves financial planning,
233
00:09:20.110 --> 00:09:23.280
negotiation with the suppliers, costing, how do I price it?
234
00:09:23.280 --> 00:09:25.350
Those are not just algorithmic decisions.
235
00:09:25.350 --> 00:09:27.490
On the other hand, I see the point here too,
236
00:09:27.490 --> 00:09:31.450
that unless some of these sorts of mistakes are being made,
237
00:09:31.450 --> 00:09:35.720
there's a danger of slipping back into execution.
238
00:09:35.720 --> 00:09:38.420
And then it slips back into a pure execution,
239
00:09:38.420 --> 00:09:40.270
pure refinement mode
240
00:09:40.270 --> 00:09:42.960
versus I think what you might call more of an exchange mode
241
00:09:42.960 --> 00:09:44.710
where you're exchanging experiences.
242
00:09:44.710 --> 00:09:47.350
So, execution versus exchange.
243
00:09:47.350 --> 00:09:48.330
It's about the journey.
244
00:09:48.330 --> 00:09:49.910
It's not about just the end execution.
245
00:09:49.910 --> 00:09:51.510
It's about the journey you take, right.
246
00:09:51.510 --> 00:09:54.170
And the journey involves an exchange of ideas;
247
00:09:54.170 --> 00:09:55.730
like you can't execute it
248
00:09:55.730 --> 00:09:57.900
if you have not taken people along.
249
00:09:57.900 --> 00:09:59.410
And I think there's also a difference;
250
00:09:59.410 --> 00:10:01.763
it's a journey from BI to AI, right?
251
00:10:01.763 --> 00:10:03.080
Like business intelligence,
252
00:10:03.080 --> 00:10:04.270
that's what it's called, right?
253
00:10:04.270 --> 00:10:05.210
So you have to take a journey
254
00:10:05.210 --> 00:10:08.330
from business intelligence to artificial intelligence.
255
00:10:08.330 --> 00:10:10.370
Prakhar, you were a finalist for the Edelman Award.
256
00:10:10.370 --> 00:10:11.203
Yes.
257
00:10:11.203 --> 00:10:12.036
For those of you who don't know,
258
00:10:12.036 --> 00:10:14.140
the Edelman Award recognizes examples
259
00:10:14.140 --> 00:10:16.670
of outstanding operations research in practice,
260
00:10:16.670 --> 00:10:18.850
which is a big deal in the OR world.
261
00:10:18.850 --> 00:10:19.740
How does that feel?
262
00:10:19.740 --> 00:10:21.330
How did your team feel? How did you feel?
263
00:10:21.330 --> 00:10:22.177
It was a proud moment.
264
00:10:22.177 --> 00:10:23.370
It was a proud achievement
265
00:10:23.370 --> 00:10:26.040
because not only was it like a breakthrough,
266
00:10:26.040 --> 00:10:27.590
since I'm coming from aerospace and, like,
267
00:10:27.590 --> 00:10:30.860
the OR community is recognizing the work that I'm doing.
268
00:10:30.860 --> 00:10:34.100
It was also a proud moment for me and Walmart
269
00:10:34.100 --> 00:10:35.740
that, look, the work that we are doing
270
00:10:35.740 --> 00:10:37.660
is recognized by a broader community.
271
00:10:37.660 --> 00:10:41.730
So I feel very humbled, and it's like,
272
00:10:41.730 --> 00:10:43.270
yes, you are doing something right, right.
273
00:10:43.270 --> 00:10:46.930
Because my PhD, my thesis is not in this.
274
00:10:46.930 --> 00:10:48.320
And so when you're running something
275
00:10:48.320 --> 00:10:49.237
for a Fortune One company
276
00:10:49.237 --> 00:10:50.520
and a broader community recognizes you,
277
00:10:50.520 --> 00:10:52.300
it was like a stamp of assurance
278
00:10:52.300 --> 00:10:53.580
that yes, you've got it right.
279
00:10:53.580 --> 00:10:55.660
A moonshot might be possible.
280
00:10:55.660 --> 00:10:58.000
For me, one reason why I chose Walmart
281
00:10:58.000 --> 00:10:59.610
as a place to work was
282
00:10:59.610 --> 00:11:04.210
because, like, a dollar or 10 cents of price savings
283
00:11:04.210 --> 00:11:06.520
might not mean much to at least
284
00:11:06.520 --> 00:11:08.010
most of the people in Silicon Valley
285
00:11:08.010 --> 00:11:08.970
or at least that's what I always thought,
286
00:11:08.970 --> 00:11:12.720
but that 10 cents can mean the world to a consumer.
287
00:11:12.720 --> 00:11:15.059
And so that basically gave it meaning for me,
288
00:11:15.059 --> 00:11:15.892
like you know what it is,
289
00:11:15.892 --> 00:11:19.240
it's about finding the 10 cents savings or 20 cents savings.
290
00:11:19.240 --> 00:11:21.570
And you do that across many items
291
00:11:21.570 --> 00:11:22.750
that we carry in our stores
292
00:11:22.750 --> 00:11:25.220
and across our merchandising network
293
00:11:25.220 --> 00:11:28.603
and those 10 cents add up and make $10 and $20.
294
00:11:28.603 --> 00:11:30.790
I like the framing of, you know,
295
00:11:30.790 --> 00:11:31.900
you check my math on this,
296
00:11:31.900 --> 00:11:34.100
but I'm pretty sure if you save 10 cents, 10 times,
297
00:11:34.100 --> 00:11:34.933
you're gonna have a dollar.
298
00:11:34.933 --> 00:11:38.610
And you know that you keep doing that over and over.
299
00:11:38.610 --> 00:11:42.450
And I like the way you framed it from not taking the dollar,
300
00:11:42.450 --> 00:11:45.290
you're saving the dollar out of the process.
301
00:11:45.290 --> 00:11:48.690
And I think that's where your team has a lot of potential.
302
00:11:48.690 --> 00:11:49.523
Yeah.
303
00:11:49.523 --> 00:11:52.220
Tell us a bit about your role at Walmart
304
00:11:52.220 --> 00:11:53.710
and like how much of that
305
00:11:53.710 --> 00:11:58.450
is science and technical management?
306
00:11:58.450 --> 00:12:01.580
How much of that is team and stakeholder management?
307
00:12:01.580 --> 00:12:05.290
How much of that is evangelism and inspiration,
308
00:12:05.290 --> 00:12:06.750
and whatever else?
309
00:12:06.750 --> 00:12:11.350
I spend probably 30% of my time in management,
310
00:12:11.350 --> 00:12:13.430
which involves upward management,
311
00:12:13.430 --> 00:12:15.130
trying to set the expectations with the company:
312
00:12:15.130 --> 00:12:15.963
what is possible,
313
00:12:15.963 --> 00:12:18.630
what is not, and acting as a thought partner to the leadership,
314
00:12:18.630 --> 00:12:20.683
both on the technology side and the business side.
315
00:12:20.683 --> 00:12:24.450
Another 10 to 20% is hands-on technical work, because I am a firm believer
316
00:12:24.450 --> 00:12:26.850
that if you are an AI leader
317
00:12:26.850 --> 00:12:29.350
and you're leading a team and you're in management,
318
00:12:29.350 --> 00:12:31.650
you can't just be a people manager.
319
00:12:31.650 --> 00:12:34.130
What's the funnest part of your job?
320
00:12:34.130 --> 00:12:36.190
Funnest part of my job is...
321
00:12:36.190 --> 00:12:37.308
Besides talking to us.
322
00:12:37.308 --> 00:12:40.580
(laughs) Of course, stuff like this.
323
00:12:40.580 --> 00:12:42.190
I get that opportunity at Walmart.
324
00:12:42.190 --> 00:12:44.120
I think the most fun part is
325
00:12:45.480 --> 00:12:48.690
when you see somebody who is not a believer in AI
326
00:12:48.690 --> 00:12:50.570
start believing in AI.
327
00:12:50.570 --> 00:12:53.020
So that happiness that you get,
328
00:12:53.020 --> 00:12:55.090
when you see people start to believe in something,
329
00:12:55.090 --> 00:12:57.740
the passion that you share is amazing.
330
00:12:57.740 --> 00:12:59.840
And second is like,
331
00:12:59.840 --> 00:13:02.500
I'm just making the Fortune One company a better place.
332
00:13:02.500 --> 00:13:05.690
Like Walmart is an essential part of our lives.
333
00:13:05.690 --> 00:13:07.220
It's part of, like,
334
00:13:07.220 --> 00:13:09.257
I mean, during these difficult times with Covid,
335
00:13:09.257 --> 00:13:11.400
like we have to keep our stores open, right.
336
00:13:11.400 --> 00:13:13.170
We have to do it. It has a role in society.
337
00:13:13.170 --> 00:13:14.730
And so you keep that running,
338
00:13:14.730 --> 00:13:16.680
you play a teeny-tiny part in that.
339
00:13:16.680 --> 00:13:17.580
And so that is,
340
00:13:17.580 --> 00:13:21.180
it gives me a reason to come to the office every day.
341
00:13:21.180 --> 00:13:24.190
And then support from the leadership, right.
342
00:13:24.190 --> 00:13:26.190
Like those are the best parts of my job.
343
00:13:27.080 --> 00:13:28.250
And then you build a team.
344
00:13:28.250 --> 00:13:32.100
You have a team of very smart people spread across,
345
00:13:32.100 --> 00:13:34.280
who share the passion with me
346
00:13:34.280 --> 00:13:35.260
and you'll see them rising
347
00:13:35.260 --> 00:13:37.620
and delivering, working with young people, right.
348
00:13:37.620 --> 00:13:40.400
And then looking at this crazy wave
349
00:13:40.400 --> 00:13:41.620
that we are riding in AI, right?
350
00:13:41.620 --> 00:13:42.680
They're like, on one side,
351
00:13:42.680 --> 00:13:44.050
everything is possible; on the other,
352
00:13:44.050 --> 00:13:46.710
you come to the job and you're like, no, it's not possible.
353
00:13:46.710 --> 00:13:48.560
Right. There are ups and downs that you see;
354
00:13:48.560 --> 00:13:51.360
it's a roller coaster ride leading an AI team
355
00:13:51.360 --> 00:13:53.270
and a work stream.
356
00:13:53.270 --> 00:13:54.220
Where do I apply?
357
00:13:55.460 --> 00:13:59.040
But, like, Walmart was never on my radar to join,
358
00:13:59.040 --> 00:14:01.387
because I was like, why Walmart? Right.
359
00:14:01.387 --> 00:14:03.410
And, like, when you have the Googles
360
00:14:03.410 --> 00:14:05.970
and Facebooks next to where you live.
361
00:14:05.970 --> 00:14:09.350
And then you realize, like, what I realized was that
362
00:14:09.350 --> 00:14:10.870
we had to tell the story;
363
00:14:10.870 --> 00:14:12.170
somebody had to make this connection
364
00:14:12.170 --> 00:14:14.330
between the awesomeness of retail
365
00:14:14.330 --> 00:14:17.350
and how it connects to daily life,
366
00:14:17.350 --> 00:14:19.010
and the power of machine learning.
367
00:14:19.010 --> 00:14:22.170
And so I spend the rest of my time there.
368
00:14:22.170 --> 00:14:26.490
And whenever I'm not at work, I'm at home,
369
00:14:26.490 --> 00:14:29.850
playing with my daughter and figuring out life.
370
00:14:29.850 --> 00:14:30.890
Well, we don't wanna keep Prakhar
371
00:14:30.890 --> 00:14:32.270
from his family any longer.
372
00:14:32.270 --> 00:14:34.080
Thanks for taking the time to talk with us Prakhar.
373
00:14:34.080 --> 00:14:35.080
Thank you so much.
374
00:14:36.870 --> 00:14:38.830
So, Sam, let's recap what we heard from Prakhar.
375
00:14:38.830 --> 00:14:40.580
He made a lot of good points.
376
00:14:40.580 --> 00:14:42.050
Yeah. A lot of great points.
377
00:14:42.050 --> 00:14:44.630
I certainly enjoyed the conversation with him a lot.
378
00:14:44.630 --> 00:14:46.300
There was a lot of passion,
379
00:14:46.300 --> 00:14:48.640
but there was also a lot of
380
00:14:48.640 --> 00:14:51.920
understanding of sort of deep practicalities
381
00:14:51.920 --> 00:14:56.480
of what it takes to actually transform a company at scale,
382
00:14:56.480 --> 00:14:57.610
a company like Walmart.
383
00:14:57.610 --> 00:15:01.800
Because, you know, he's talking about certain processes
384
00:15:01.800 --> 00:15:06.330
where you cannot be dogmatic about it
385
00:15:06.330 --> 00:15:08.540
and say, well, this is what the engine says,
386
00:15:08.540 --> 00:15:09.387
and therefore you should do that.
387
00:15:09.387 --> 00:15:12.260
And some of these things are inventory management
388
00:15:12.260 --> 00:15:17.260
or on-shelf assortment or store operations
389
00:15:17.560 --> 00:15:20.330
and store labor and things like that, where, you know,
390
00:15:20.330 --> 00:15:23.330
he talked about this notion of exchange
391
00:15:23.330 --> 00:15:28.330
and bringing the business owners along for the ride
392
00:15:29.070 --> 00:15:29.990
and during the ride.
393
00:15:29.990 --> 00:15:32.600
And so, designing the solution with them in mind,
394
00:15:32.600 --> 00:15:35.070
so that by the time it's done, they're not surprised.
395
00:15:35.070 --> 00:15:38.330
And they've been not only involved,
396
00:15:38.330 --> 00:15:42.830
but instrumental in its design and build and incubation
397
00:15:42.830 --> 00:15:44.290
and implementation.
398
00:15:44.290 --> 00:15:45.740
And that's really, really critical.
399
00:15:45.740 --> 00:15:47.020
The tough part, too, is that, you know,
400
00:15:47.020 --> 00:15:48.413
these things aren't going to be perfect.
401
00:15:48.413 --> 00:15:51.430
I think I really heard that in his discussion
402
00:15:51.430 --> 00:15:53.990
that he knew that they won't be perfect on day one.
403
00:15:53.990 --> 00:15:55.970
And when they're not perfect,
404
00:15:55.970 --> 00:15:57.560
he's going to lose some credibility.
405
00:15:57.560 --> 00:15:59.520
And how do they build that trust
406
00:15:59.520 --> 00:16:03.290
and how do they build that credibility on an ongoing basis.
407
00:16:03.290 --> 00:16:04.890
Yeah. And that's a very good point too.
408
00:16:04.890 --> 00:16:07.910
And I'm reading between the lines of what he said,
409
00:16:07.910 --> 00:16:11.620
but an admission of sort of vulnerability
410
00:16:11.620 --> 00:16:14.980
and willingness for the AI engine
411
00:16:14.980 --> 00:16:18.850
and for his teams to learn from those experts
412
00:16:18.850 --> 00:16:20.820
and setting the right expectations.
413
00:16:20.820 --> 00:16:24.320
That just because we've built a piece of technology,
414
00:16:24.320 --> 00:16:26.550
it's not supposed to be 100% perfect.
415
00:16:26.550 --> 00:16:29.930
That is actually not how any learning system works.
416
00:16:29.930 --> 00:16:31.410
Yeah. I like your word vulnerability there
417
00:16:31.410 --> 00:16:32.900
because it came across.
418
00:16:32.900 --> 00:16:34.030
I mean, he's clearly smart.
419
00:16:34.030 --> 00:16:35.490
He clearly knows what he's doing,
420
00:16:35.490 --> 00:16:37.220
but he's still willing to learn
421
00:16:37.220 --> 00:16:39.810
and listen to what other people said
422
00:16:39.810 --> 00:16:41.800
and recognize that his algorithms
423
00:16:41.800 --> 00:16:43.960
weren't gonna be perfect right off the bat.
424
00:16:43.960 --> 00:16:45.700
That was a humility that came through.
425
00:16:45.700 --> 00:16:48.630
And I think actually the secret sauce
426
00:16:48.630 --> 00:16:52.620
of somebody like him, whether it's at Walmart
427
00:16:52.620 --> 00:16:57.620
or another person like him in a different company,
428
00:16:57.780 --> 00:17:01.860
being successful in that role, is the willingness to listen.
429
00:17:01.860 --> 00:17:04.370
The willingness to partner, the, you know,
430
00:17:04.370 --> 00:17:07.720
ability to admit vulnerability and the desire to learn.
431
00:17:07.720 --> 00:17:09.430
And that passion that he has
432
00:17:09.430 --> 00:17:13.340
that, look, when this thing works, it's fantastic.
433
00:17:13.340 --> 00:17:14.620
And when it doesn't work,
434
00:17:14.620 --> 00:17:18.160
I've already sort of set your expectations
435
00:17:18.160 --> 00:17:19.770
that it will not always work perfectly,
436
00:17:19.770 --> 00:17:22.690
but every day it will get better than the day before.
437
00:17:22.690 --> 00:17:25.500
And I think that humility and that willingness
438
00:17:25.500 --> 00:17:28.150
are real characteristics of folks
439
00:17:28.150 --> 00:17:30.100
who are increasingly in these roles,
440
00:17:30.100 --> 00:17:32.360
'cause, like, there could be some organ rejection
441
00:17:32.360 --> 00:17:36.570
when you bring an expert in AI from a different industry,
442
00:17:36.570 --> 00:17:40.130
different field, like Silicon Valley, like Uber,
443
00:17:40.130 --> 00:17:43.810
into a traditionally brick-and-mortar company,
444
00:17:43.810 --> 00:17:47.860
because there is a belief that, hey, we've already done it.
445
00:17:47.860 --> 00:17:49.020
It's the right way.
446
00:17:49.020 --> 00:17:50.970
You guys have to just listen to me
447
00:17:50.970 --> 00:17:52.250
and we know that won't work.
448
00:17:52.250 --> 00:17:55.120
And so the sort of EQ that comes along
449
00:17:55.120 --> 00:17:57.660
with a role like that is really, really super crucial.
450
00:17:57.660 --> 00:17:59.800
And he really demonstrated that too.
451
00:17:59.800 --> 00:18:02.360
One of the things that was interesting about Prakhar
452
00:18:02.360 --> 00:18:04.170
was how much it aligned
453
00:18:04.170 --> 00:18:06.640
with what we found in our research this year.
454
00:18:06.640 --> 00:18:08.980
We found that only 10% of organizations are getting
455
00:18:08.980 --> 00:18:12.370
significant financial benefits from artificial intelligence.
456
00:18:12.370 --> 00:18:15.240
And Prakhar really shows why that's so hard.
457
00:18:15.240 --> 00:18:18.180
Most of the things he talked about, weren't technical.
458
00:18:18.180 --> 00:18:19.550
You could see him almost wistful
459
00:18:19.550 --> 00:18:21.503
for the days of perfectly labeled data,
460
00:18:22.420 --> 00:18:24.400
but those weren't the problems that he was facing.
461
00:18:24.400 --> 00:18:27.040
Yeah. The problems were a lot more
462
00:18:27.040 --> 00:18:32.040
organizational, change management, bringing users along.
463
00:18:32.870 --> 00:18:37.020
That's all the human aspect and not so much the tech aspect.
464
00:18:37.020 --> 00:18:42.020
And, you know, I think part of his formula for the 10%
465
00:18:42.140 --> 00:18:46.160
is going in upfront with the admission
466
00:18:46.160 --> 00:18:47.710
that AI is not perfect.
467
00:18:47.710 --> 00:18:51.520
And AI has a ton to learn from the process,
468
00:18:51.520 --> 00:18:54.380
from the experts, from the organization.
469
00:18:54.380 --> 00:18:56.150
And sometimes it will be right.
470
00:18:56.150 --> 00:18:57.030
And when it's right,
471
00:18:57.030 --> 00:19:00.220
it will be accretive to the judgment of those people.
472
00:19:00.220 --> 00:19:02.940
And sometimes it will be wrong and when it's wrong,
473
00:19:02.940 --> 00:19:05.020
it has the ability to learn.
474
00:19:05.020 --> 00:19:07.180
And so I think actually going in with a mindset
475
00:19:07.180 --> 00:19:08.860
that AI is perfect
476
00:19:08.860 --> 00:19:11.470
is surely a recipe for disaster, right?
477
00:19:11.470 --> 00:19:15.540
And any good AI practitioner knows that that's not the case.
478
00:19:15.540 --> 00:19:16.373
Exactly.
479
00:19:18.140 --> 00:19:20.090
Shervin and I are really excited about our next episode
480
00:19:20.090 --> 00:19:21.950
with Slawek Kierner from Humana.
481
00:19:21.950 --> 00:19:22.823
Please join us.
482
00:19:27.640 --> 00:19:30.400
Thanks for listening to Me, Myself and AI.
483
00:19:30.400 --> 00:19:32.010
If you're enjoying the show,
484
00:19:32.010 --> 00:19:34.130
take a minute to write us a review.
485
00:19:34.130 --> 00:19:35.740
If you send us a screenshot,
486
00:19:35.740 --> 00:19:39.030
we'll send you a collection of MIT SMR's best articles
487
00:19:39.030 --> 00:19:42.560
on artificial intelligence, free for a limited time.
488
00:19:42.560 --> 00:19:47.343
Send your review screenshot to smrfeedback@mit.edu.