WEBVTT
1
00:00:02.830 --> 00:00:05.220
Flowers are not digital products at all
2
00:00:05.220 --> 00:00:07.920
but digital technologies like artificial intelligence
3
00:00:07.920 --> 00:00:10.130
can still offer considerable value to companies
4
00:00:10.130 --> 00:00:11.853
that sell non-digital products.
5
00:00:13.320 --> 00:00:15.840
In today's episode, Amit Shah, president
6
00:00:15.840 --> 00:00:17.997
of 1-800-Flowers, describes how AI
7
00:00:17.997 --> 00:00:19.860
and machine learning offered
8
00:00:19.860 --> 00:00:22.730
through a platform like 1-800-Flowers can empower
9
00:00:22.730 --> 00:00:26.580
small companies to compete with much larger organizations.
10
00:00:26.580 --> 00:00:29.350
Welcome to Me, Myself, and AI, a podcast
11
00:00:29.350 --> 00:00:31.770
on artificial intelligence in business.
12
00:00:31.770 --> 00:00:33.310
Each episode, we introduce you
13
00:00:33.310 --> 00:00:35.640
to someone innovating with AI.
14
00:00:35.640 --> 00:00:36.950
I'm Sam Ransbotham,
15
00:00:36.950 --> 00:00:40.180
Professor of Information Systems at Boston College.
16
00:00:40.180 --> 00:00:42.207
I'm also the Guest Editor for the AI
17
00:00:42.207 --> 00:00:43.250
and Business Strategy
18
00:00:43.250 --> 00:00:47.260
Big Idea program at MIT Sloan Management Review.
19
00:00:47.260 --> 00:00:51.150
And, I'm Shervin Khodabandeh, Senior Partner with BCG,
20
00:00:51.150 --> 00:00:54.960
and I co-lead BCG's AI practice in North America.
21
00:00:54.960 --> 00:00:59.030
And together, MIT SMR and BCG have been researching AI
22
00:00:59.030 --> 00:01:02.640
for five years, interviewing hundreds of practitioners
23
00:01:02.640 --> 00:01:05.370
and surveying thousands of companies on what it takes
24
00:01:05.370 --> 00:01:09.300
to build, deploy, and scale AI capabilities
25
00:01:09.300 --> 00:01:10.670
across the organization
26
00:01:10.670 --> 00:01:13.320
and really transform the way organizations operate.
27
00:01:15.640 --> 00:01:16.980
Today, we're joined by Amit Shah.
28
00:01:16.980 --> 00:01:19.950
Amit's the president of 1-800-Flowers.
29
00:01:19.950 --> 00:01:20.783
Amit, welcome.
30
00:01:20.783 --> 00:01:23.010
We're excited to learn more about what you're doing.
31
00:01:23.010 --> 00:01:26.600
Thank you so much, Sam and Shervin, great to be here.
32
00:01:26.600 --> 00:01:29.110
Amit, we talked earlier and discovered that we all
33
00:01:29.110 --> 00:01:30.970
had some connection to the state of Georgia,
34
00:01:30.970 --> 00:01:33.030
you and Shervin and I.
35
00:01:33.030 --> 00:01:34.470
But now you're in Long Island.
36
00:01:34.470 --> 00:01:36.920
You're the president of 1-800-Flowers.
37
00:01:36.920 --> 00:01:39.960
Tell us a little bit about your career path to get there.
38
00:01:39.960 --> 00:01:42.660
So I started out as an analyst at McKinsey,
39
00:01:42.660 --> 00:01:46.870
which is very well known for helping teach problem solving
40
00:01:46.870 --> 00:01:49.990
both individually and collectively at scale,
41
00:01:49.990 --> 00:01:54.990
and then worked at a range of startups in the Northeast area
42
00:01:55.350 --> 00:02:00.350
and ended up being part of Liberty Media ProFlowers Group
43
00:02:00.450 --> 00:02:04.370
at an early stage of my career, and got very much involved,
44
00:02:04.370 --> 00:02:08.070
I would say, in the front seat of growth hacking.
45
00:02:08.070 --> 00:02:11.590
And that's what led me to sort of the current career
46
00:02:11.590 --> 00:02:14.440
as I've seen growth hacking and sort of the mindset
47
00:02:14.440 --> 00:02:18.100
of the hacker, looking for continuous change,
48
00:02:18.100 --> 00:02:21.670
continuous upliftment, and a continuous desire
49
00:02:21.670 --> 00:02:24.030
to provide the best customer experience,
50
00:02:24.030 --> 00:02:25.950
actually both prepaid,
51
00:02:25.950 --> 00:02:29.310
and ultimately become the most critical element
52
00:02:29.310 --> 00:02:33.430
of any C-suite or boardroom at large.
53
00:02:33.430 --> 00:02:35.570
Amit, starting in a marketing role
54
00:02:35.570 --> 00:02:37.300
and then transitioning to president,
55
00:02:37.300 --> 00:02:39.840
that marketing background must've made a difference.
56
00:02:39.840 --> 00:02:40.850
How has being
57
00:02:40.850 --> 00:02:44.150
from marketing affected your role as president?
58
00:02:44.150 --> 00:02:45.590
That's a great question, Sam.
59
00:02:45.590 --> 00:02:49.360
Evolving into the role of leading, you know,
60
00:02:49.360 --> 00:02:52.020
multiple functions, departments
61
00:02:52.020 --> 00:02:55.040
and colleagues outside of marketing,
62
00:02:55.040 --> 00:02:58.610
what really stands out is two key elements.
63
00:02:58.610 --> 00:03:01.630
The first thing is that I think marketing traditionally
64
00:03:01.630 --> 00:03:06.630
was seen as one of the functional competencies in empowering
65
00:03:07.200 --> 00:03:09.690
and accelerating problem solving.
66
00:03:09.690 --> 00:03:12.620
Marketing is becoming a growth function,
67
00:03:12.620 --> 00:03:17.330
and growth is becoming the key differentiator of companies.
68
00:03:17.330 --> 00:03:21.110
So I expect to continue to see an acceleration
69
00:03:21.110 --> 00:03:25.330
of marketing leaders actually taking on much more leadership
70
00:03:25.330 --> 00:03:30.330
responsibilities and ultimately starting to steer the ship
71
00:03:30.330 --> 00:03:33.450
because growth has become the essential currency
72
00:03:33.450 --> 00:03:35.470
and essential differentiator.
73
00:03:35.470 --> 00:03:37.170
So that is one vector.
74
00:03:37.170 --> 00:03:39.860
And then I would say the second vector that informs me
75
00:03:39.860 --> 00:03:43.660
and is prescient to our AI conversation is that I feel
76
00:03:43.660 --> 00:03:46.480
like people in the marketing sphere are actually much
77
00:03:46.480 --> 00:03:51.200
more contextually aware and contextually practicing
78
00:03:51.200 --> 00:03:54.770
advanced problem solving, using machine learning
79
00:03:54.770 --> 00:03:56.600
than a lot of their peers.
80
00:03:56.600 --> 00:04:00.140
So I think there's a unique ability and a unique mindset
81
00:04:00.140 --> 00:04:02.590
that I bring to this leadership role as president
82
00:04:02.590 --> 00:04:07.130
of 1-800-Flowers, having been exposed to that quality
83
00:04:07.130 --> 00:04:09.350
and quantum of problem solving,
84
00:04:09.350 --> 00:04:10.950
which is surrounding a lot of growth
85
00:04:10.950 --> 00:04:14.240
and marketing leaders around us.
86
00:04:14.240 --> 00:04:16.990
Let's address the obvious question, flowers themselves
87
00:04:16.990 --> 00:04:18.480
aren't digital at all.
88
00:04:18.480 --> 00:04:20.760
How is 1-800-Flowers using digital data,
89
00:04:20.760 --> 00:04:21.913
and AI in particular?
90
00:04:23.090 --> 00:04:27.260
Currently, we are a platform of 15 brands.
91
00:04:27.260 --> 00:04:28.530
So if you think about it, you know,
92
00:04:28.530 --> 00:04:33.130
we are a platform that empowers engagement.
93
00:04:33.130 --> 00:04:35.750
We play along the full spectrum
94
00:04:35.750 --> 00:04:40.180
of human expression and engagement, starting from birthdays
95
00:04:40.180 --> 00:04:43.150
all the way to sympathy and everything in between.
96
00:04:43.150 --> 00:04:44.457
So that is who we are.
97
00:04:44.457 --> 00:04:48.510
You know, we think that we have built an all-star range
98
00:04:48.510 --> 00:04:53.210
of brands to really power up an engagement platform.
99
00:04:53.210 --> 00:04:54.043
So if you think
100
00:04:54.043 --> 00:04:57.450
about what differentiates modern organizations,
101
00:04:57.450 --> 00:05:00.080
it is not just the ability to adopt technologies
102
00:05:00.080 --> 00:05:03.270
which has become table stakes, but the ability
103
00:05:03.270 --> 00:05:08.180
to out-solve their competitors in facing deep problems.
104
00:05:08.180 --> 00:05:12.600
So when I think about AI, I think about our competitiveness
105
00:05:12.600 --> 00:05:15.160
on that frontier.
106
00:05:15.160 --> 00:05:17.400
Are we better problem solvers?
107
00:05:17.400 --> 00:05:19.200
And I'll give you a terrific example of that.
108
00:05:19.200 --> 00:05:22.130
You know, when I started my career 20 years ago
109
00:05:22.130 --> 00:05:24.540
as a young analyst at McKinsey,
110
00:05:24.540 --> 00:05:26.210
there was a clear differentiator
111
00:05:26.210 --> 00:05:28.370
between people who were masters
112
00:05:28.370 --> 00:05:31.440
of Excel and those who were not, right?
113
00:05:31.440 --> 00:05:35.500
It was a tool that empowered decision making
114
00:05:35.500 --> 00:05:39.600
at scale and communication of the decision making.
115
00:05:39.600 --> 00:05:41.090
When I think about AI
116
00:05:41.090 --> 00:05:43.270
and its power five years down the road,
117
00:05:43.270 --> 00:05:45.370
I think every new employee that starts
118
00:05:45.370 --> 00:05:49.120
out will actually have an AI toolkit
119
00:05:49.120 --> 00:05:52.000
like we used to get the Excel toolkit
120
00:05:52.000 --> 00:05:54.470
to both solve problems better and communicate
121
00:05:54.470 --> 00:05:59.150
that better to clients, to colleagues, to any stakeholder.
122
00:05:59.150 --> 00:06:02.920
So AI to me is not a skill-set issue,
123
00:06:02.920 --> 00:06:06.990
it is a mindset issue, and over the long term,
124
00:06:06.990 --> 00:06:11.800
companies that adopt and understand that this is a mindset
125
00:06:11.800 --> 00:06:15.390
and a skill-set game, actually will be the ones
126
00:06:15.390 --> 00:06:19.110
that are more competitive than their peer reference group.
127
00:06:19.110 --> 00:06:21.310
Yeah. That's super insightful and right on.
128
00:06:21.310 --> 00:06:24.160
And it's almost exactly the conversation
129
00:06:24.160 --> 00:06:26.310
that we were having with Will from Google.
130
00:06:26.310 --> 00:06:27.560
It is all about mindset.
131
00:06:27.560 --> 00:06:31.150
Yet, it's quite daunting, I would say,
132
00:06:31.150 --> 00:06:34.010
that so many companies
133
00:06:34.010 --> 00:06:35.310
are still viewing it
134
00:06:35.310 --> 00:06:38.660
as a technology issue, as a technology black box.
135
00:06:38.660 --> 00:06:40.520
As in, we need a lot of data.
136
00:06:40.520 --> 00:06:42.180
We need a lot of data scientists, which is,
137
00:06:42.180 --> 00:06:45.970
of course you do need those, but you need to also focus
138
00:06:45.970 --> 00:06:49.040
on the right problems and that problem definition
139
00:06:49.040 --> 00:06:52.060
and how you go about solving those problems.
140
00:06:52.060 --> 00:06:55.560
And then how do you change the mindset of the actual users?
141
00:06:55.560 --> 00:07:00.300
Because I could imagine you have been able to shift
142
00:07:00.300 --> 00:07:03.640
the mindset of two kinds of groups, like the mindset
143
00:07:03.640 --> 00:07:06.960
of the consumers, where 20 years ago, they wouldn't
144
00:07:06.960 --> 00:07:09.570
share information on birthdays
145
00:07:09.570 --> 00:07:12.410
and things like that with any digital platform.
146
00:07:12.410 --> 00:07:16.430
But of course you've changed that mindset, including myself.
147
00:07:16.430 --> 00:07:19.200
But you also must've changed the mindset
148
00:07:19.200 --> 00:07:21.160
and the ways of working of so many,
149
00:07:21.160 --> 00:07:26.070
like mom and pop local florists that are actually engaging
150
00:07:26.070 --> 00:07:28.570
with a platform to do business.
151
00:07:28.570 --> 00:07:30.100
Can you comment on that a little bit
152
00:07:30.100 --> 00:07:34.753
and how hard was it to do that?
I think, Shervin, you have
153
00:07:34.753 --> 00:07:38.600
brought up a very important crux of this issue.
154
00:07:38.600 --> 00:07:42.080
You know, when we talk about a mindset shift,
155
00:07:42.080 --> 00:07:45.670
a metamorphosis of acceptance of, you know,
156
00:07:45.670 --> 00:07:48.550
mindset over even skill-set,
157
00:07:48.550 --> 00:07:52.080
it really requires a multi-stakeholder approach.
158
00:07:52.080 --> 00:07:54.200
And certainly we are very proud
159
00:07:54.200 --> 00:07:57.120
that we support a community of 4,000
160
00:07:57.120 --> 00:08:00.840
plus florists who are powering Main Street businesses.
161
00:08:00.840 --> 00:08:03.850
And I would say one of the last remaining outposts
162
00:08:03.850 --> 00:08:05.970
of successful Main Street businesses
163
00:08:05.970 --> 00:08:08.180
in the US is the florist,
164
00:08:08.180 --> 00:08:10.770
and it plays a very important part in the community
165
00:08:10.770 --> 00:08:13.880
not just as sort of a trusted provider
166
00:08:13.880 --> 00:08:16.230
for all the important occasions
167
00:08:16.230 --> 00:08:18.240
for everyone in the community.
168
00:08:18.240 --> 00:08:21.880
But also I would say as a light post
169
00:08:21.880 --> 00:08:24.690
of how the context with AI is evolving.
170
00:08:24.690 --> 00:08:27.450
And let me give you a few examples of it.
171
00:08:27.450 --> 00:08:29.930
The question comes down to, is the context
172
00:08:29.930 --> 00:08:32.110
around me getting more competitive and evolving?
173
00:08:32.110 --> 00:08:33.670
And I would say for the small florist
174
00:08:33.670 --> 00:08:36.650
and a company like ours, being surrounded
175
00:08:36.650 --> 00:08:39.010
by platforms like Facebook and Google,
176
00:08:39.010 --> 00:08:42.720
which are auction-rich machine learning environments
177
00:08:42.720 --> 00:08:46.950
set up to extract the highest yield per click
178
00:08:47.840 --> 00:08:52.130
means that any business owner that is seeking growth,
179
00:08:52.130 --> 00:08:54.830
that is seeking to get in front of customers,
180
00:08:54.830 --> 00:08:56.770
already is being mediated
181
00:08:56.770 --> 00:08:59.960
by machine learning and artificial intelligence.
182
00:08:59.960 --> 00:09:04.640
So when I think about this multi-stakeholder empowerment,
183
00:09:04.640 --> 00:09:09.640
I think about how do we empower the smallest florist
184
00:09:09.730 --> 00:09:13.090
in the heartland of America to compete
185
00:09:13.090 --> 00:09:15.550
with this evolution of context?
186
00:09:15.550 --> 00:09:18.540
You know, how do we empower that, you know,
187
00:09:18.540 --> 00:09:21.720
small business entity to get to that strength?
188
00:09:21.720 --> 00:09:24.027
And I think that's where the mindset comes in,
189
00:09:24.027 --> 00:09:27.120
'cause what this requires is, first of all,
190
00:09:27.120 --> 00:09:31.093
understanding that the context is already rich in AI and ML.
191
00:09:32.110 --> 00:09:34.700
The second point is that unless you can
192
00:09:34.700 --> 00:09:37.360
assemble a response to it,
193
00:09:37.360 --> 00:09:39.570
you are always on the losing side.
194
00:09:39.570 --> 00:09:42.690
So our thinking is that by providing that suite
195
00:09:42.690 --> 00:09:46.330
of services, by providing and working very closely
196
00:09:46.330 --> 00:09:50.040
with our florist community, our supplier community,
197
00:09:50.040 --> 00:09:52.740
we are actually providing them relevance
198
00:09:52.740 --> 00:09:56.570
in a rapidly evolving context, where getting in front
199
00:09:56.570 --> 00:09:59.370
of their customers itself is a machine learning problem.
200
00:10:00.230 --> 00:10:02.170
How do you go about doing that?
201
00:10:02.170 --> 00:10:07.040
How much of that is technologically driven
202
00:10:07.040 --> 00:10:11.550
through the platform and how much of that is good
203
00:10:11.550 --> 00:10:15.130
old-fashioned human grease and relationship management
204
00:10:15.130 --> 00:10:20.130
and working closely with these little places?
205
00:10:20.440 --> 00:10:22.930
How much of that was technology solving the problem
206
00:10:22.930 --> 00:10:25.160
versus people and processes
207
00:10:25.160 --> 00:10:27.560
and change management and those kinds of things?
208
00:10:28.560 --> 00:10:32.240
A very strong starting point is realizing how
209
00:10:32.240 --> 00:10:37.240
can you basically collect data and make inferences at scale?
210
00:10:37.700 --> 00:10:39.453
So I'll give you a simple example.
211
00:10:40.340 --> 00:10:42.230
To set up a reminder program
212
00:10:42.230 --> 00:10:47.230
on our platform is actually a perpetual cold start problem.
213
00:10:47.910 --> 00:10:49.830
And let me explain what that means.
214
00:10:49.830 --> 00:10:52.960
It means that for example, if you come
215
00:10:52.960 --> 00:10:56.870
to our website or you come to any florist website
216
00:10:56.870 --> 00:10:58.070
and, let's say, you have come
217
00:10:58.070 --> 00:11:01.740
in to express happy birthday to your sister
218
00:11:01.740 --> 00:11:04.910
whose birthday's a week away, and you might come
219
00:11:04.910 --> 00:11:08.280
and pick an arrangement, let's say she loves, you know,
220
00:11:08.280 --> 00:11:11.190
white calla lilies, and you come and, you know,
221
00:11:11.190 --> 00:11:15.180
do some clicking on white flowers, white arrangements,
222
00:11:15.180 --> 00:11:18.080
and then pick a calla lily arrangement and send it to her.
223
00:11:18.920 --> 00:11:22.760
Most companies will take a record of that data
224
00:11:22.760 --> 00:11:26.070
and say that the next time Shervin comes to our site,
225
00:11:26.070 --> 00:11:28.930
let's show him white, for example.
226
00:11:28.930 --> 00:11:29.870
But it could be that
227
00:11:29.870 --> 00:11:33.960
your next visit is actually right before Valentine's.
228
00:11:33.960 --> 00:11:36.850
You're here to buy flowers, which are predominantly
229
00:11:36.850 --> 00:11:40.180
red or pink for Valentine's.
230
00:11:40.180 --> 00:11:42.350
And you're trying to express that.
231
00:11:42.350 --> 00:11:45.840
So your entire click history, your entire corpus
232
00:11:45.840 --> 00:11:48.930
of digital breadcrumbs that you have given us
233
00:11:48.930 --> 00:11:52.600
to solve a machine learning problem is actually irrelevant
234
00:11:52.600 --> 00:11:55.740
because you're starting again as a cold start outcome.
235
00:11:55.740 --> 00:11:59.300
And this fact of personalization, you know,
236
00:11:59.300 --> 00:12:01.830
the enormity of data, the enormity
237
00:12:01.830 --> 00:12:05.230
of decisions required to resolve this outcome
238
00:12:05.230 --> 00:12:09.470
so that you can create a better customer experience
239
00:12:09.470 --> 00:12:13.370
is what we are empowering our stakeholders
240
00:12:13.370 --> 00:12:15.400
to really realize, right?
241
00:12:15.400 --> 00:12:17.320
So that is one dimension of it.
242
00:12:17.320 --> 00:12:19.520
The second dimension of it is what we talked
243
00:12:19.520 --> 00:12:23.770
about: that currently, customers are intermediated
244
00:12:23.770 --> 00:12:27.390
by extremely expensive, I would say,
245
00:12:27.390 --> 00:12:30.430
auction-rich environments, controlled
246
00:12:30.430 --> 00:12:35.430
by a few major platforms, and to play in those platforms
247
00:12:35.930 --> 00:12:39.330
you need to have a baseline competency.
248
00:12:39.330 --> 00:12:40.690
So we employ a lot
249
00:12:40.690 --> 00:12:45.690
of advanced algorithmic trading and algorithmic models
250
00:12:46.080 --> 00:12:49.960
for example, to understand what should be your bid
251
00:12:49.960 --> 00:12:53.450
at any given time of the day, day of the week,
252
00:12:53.450 --> 00:12:58.120
and the month of the year in order to maximize your
253
00:12:58.120 --> 00:13:01.460
yield and minimize your CAC.
254
00:13:01.460 --> 00:13:04.550
And those data sets, that sophistication,
255
00:13:04.550 --> 00:13:08.990
that investment is almost outside the realm,
256
00:13:08.990 --> 00:13:13.990
I would say, of a lot of localized businesses and outcomes.
257
00:13:14.450 --> 00:13:16.510
So this question of building alliances,
258
00:13:16.510 --> 00:13:20.140
this question of trusting larger entities,
259
00:13:20.140 --> 00:13:23.130
is also going to become more important over time.
260
00:13:23.130 --> 00:13:26.340
So when we think about our mission and our vision,
261
00:13:26.340 --> 00:13:31.160
we are inspired by what part we can play
262
00:13:31.160 --> 00:13:33.270
in catalyzing those outcomes
263
00:13:33.270 --> 00:13:36.510
and empowering and accelerating those outcomes?
264
00:13:36.510 --> 00:13:37.700
Because whether we are talking
265
00:13:37.700 --> 00:13:39.990
about florists on Main Street, as one
266
00:13:39.990 --> 00:13:44.700
of the last remaining important independent businesses
267
00:13:44.700 --> 00:13:49.560
in America, or we think about someone who is trying to get
268
00:13:49.560 --> 00:13:53.760
to a funeral home to express something very personal
269
00:13:53.760 --> 00:13:56.900
to them, those moments define us
270
00:13:56.900 --> 00:13:59.980
and define the communities that we live in.
271
00:13:59.980 --> 00:14:01.560
And we think that, you know,
272
00:14:01.560 --> 00:14:03.580
we have a strong part to play
273
00:14:03.580 --> 00:14:05.840
in helping realize that vision.
274
00:14:05.840 --> 00:14:07.120
And we look at that vision
275
00:14:07.120 --> 00:14:10.630
not just as a financial or a transactional outcome,
276
00:14:10.630 --> 00:14:14.160
but we look at that as an outcome for the whole of society.
277
00:14:14.160 --> 00:14:15.410
So for example, you know,
278
00:14:15.410 --> 00:14:17.320
we have free e-cards that you can come
279
00:14:17.320 --> 00:14:19.033
to our site right now and send.
280
00:14:19.930 --> 00:14:23.500
We really want you to just literally express
281
00:14:23.500 --> 00:14:26.170
to someone that, hey, you are thinking of them,
282
00:14:26.170 --> 00:14:28.580
because we think that it's much more important
283
00:14:28.580 --> 00:14:33.580
for us to appreciate and empower that expression.
284
00:14:33.730 --> 00:14:34.870
That over time
285
00:14:34.870 --> 00:14:38.040
hopefully leads you to have a deeper connection
286
00:14:38.040 --> 00:14:42.350
with us as a brand, deeper connection with us as a platform
287
00:14:42.350 --> 00:14:45.760
and then use us to express that emotion.
288
00:14:45.760 --> 00:14:49.830
But the empowerment of emotion in and of itself
289
00:14:49.830 --> 00:14:53.940
is a very important part of our mission and our vision.
290
00:14:53.940 --> 00:14:56.380
And going back to AI, and the reason I talked
291
00:14:56.380 --> 00:15:00.140
about solving fundamental personalized problems
292
00:15:00.140 --> 00:15:02.350
at scale is that all
293
00:15:02.350 --> 00:15:05.900
of our expressions are ultimately personalized expressions.
294
00:15:05.900 --> 00:15:10.900
So unless you are employing and deploying those technologies
295
00:15:11.130 --> 00:15:15.840
and the mindset that customers are here to express
296
00:15:15.840 --> 00:15:20.800
and connect, you are not going to be looking at the problem
297
00:15:20.800 --> 00:15:23.780
or the solution in the way that empowers
298
00:15:23.780 --> 00:15:25.730
that end customer first.
299
00:15:25.730 --> 00:15:26.580
Was there something
300
00:15:26.580 --> 00:15:28.170
about your background that shaped how you think
301
00:15:28.170 --> 00:15:30.370
about customers or maybe that affects how
302
00:15:30.370 --> 00:15:33.600
you get people working in that customer-first mindset?
303
00:15:33.600 --> 00:15:37.530
I think it was a mix of my liberal arts education
304
00:15:37.530 --> 00:15:40.610
and a desire to push problem solving
305
00:15:40.610 --> 00:15:45.610
as a key characteristic and an attribute of my skill-set
306
00:15:46.240 --> 00:15:47.530
as I moved through the various
307
00:15:47.530 --> 00:15:49.810
leadership challenges and ranks.
308
00:15:49.810 --> 00:15:53.540
One of the key lessons that I took away
309
00:15:53.540 --> 00:15:57.310
from my liberal arts education at Bowdoin
310
00:15:57.310 --> 00:16:01.150
was around the importance of this learning quotient
311
00:16:01.150 --> 00:16:04.100
and having an LQ-first mindset
312
00:16:04.100 --> 00:16:08.440
because what liberal arts really forces you to do
313
00:16:08.440 --> 00:16:13.440
is adopt continuous learning and ask questions
314
00:16:13.630 --> 00:16:16.870
which are deeper than the functional competency.
315
00:16:16.870 --> 00:16:19.700
And this, I think over time, actually, you know,
316
00:16:19.700 --> 00:16:23.330
when machines start doing repetitive tasks,
317
00:16:23.330 --> 00:16:25.620
decisioning will become actually a very
318
00:16:25.620 --> 00:16:28.600
important ethical choice as well.
319
00:16:28.600 --> 00:16:32.440
When I mentor college students and I give talks,
320
00:16:32.440 --> 00:16:35.180
I always point out the primacy
321
00:16:35.180 --> 00:16:38.650
of taking your non-technical classes very seriously
322
00:16:38.650 --> 00:16:41.360
and considering a liberal arts education.
323
00:16:41.360 --> 00:16:44.100
Because I think the seminal questions faced
324
00:16:44.100 --> 00:16:48.150
by a leader 10 years hence, 15 years hence,
325
00:16:48.150 --> 00:16:52.700
are not going to be just around how competent they are,
326
00:16:52.700 --> 00:16:56.700
but how thoughtful they are, how good they are at learning.
327
00:16:56.700 --> 00:16:57.560
Well, as a professor
328
00:16:57.560 --> 00:17:00.610
at a university that focuses on liberal arts education,
329
00:17:00.610 --> 00:17:03.200
I can wholeheartedly agree with that.
330
00:17:03.200 --> 00:17:04.710
But I also want to think about,
331
00:17:04.710 --> 00:17:06.050
is there an example of a place?
332
00:17:06.050 --> 00:17:07.840
So you mentioned you were trying to learn
333
00:17:07.840 --> 00:17:11.320
about individual customers and how difficult that is,
334
00:17:11.320 --> 00:17:12.700
because in your context, it's not just
335
00:17:12.700 --> 00:17:14.020
here's what they did last time,
336
00:17:14.020 --> 00:17:16.160
and we predict that they do more of the same.
337
00:17:16.160 --> 00:17:18.470
In fact, last time may tell us exactly the opposite
338
00:17:18.470 --> 00:17:20.410
of what they're going to do this time.
339
00:17:20.410 --> 00:17:21.930
So can you give us some examples
340
00:17:21.930 --> 00:17:25.400
of how you are using AI to learn about your customers' needs
341
00:17:25.400 --> 00:17:28.010
and what kinds of things you've learned, and how you've set
342
00:17:28.010 --> 00:17:30.440
up your organization to learn those things?
343
00:17:30.440 --> 00:17:34.210
It's exceedingly hard, no matter what AI leaders
344
00:17:34.210 --> 00:17:36.720
and the ecosystem like to say about it,
345
00:17:36.720 --> 00:17:38.830
chiefly for three reasons.
346
00:17:38.830 --> 00:17:43.570
I think all business leaders face a trifecta of issues.
347
00:17:43.570 --> 00:17:45.870
When they think about AI adoption,
348
00:17:45.870 --> 00:17:46.703
the first starts
349
00:17:46.703 --> 00:17:50.170
with having cross-functional and competent teams.
350
00:17:50.170 --> 00:17:52.840
You know, generally what you find within organizations is
351
00:17:52.840 --> 00:17:56.640
that the teams are spoken for and especially data science
352
00:17:56.640 --> 00:17:59.580
and machine learning competencies are extremely
353
00:17:59.580 --> 00:18:02.210
hard to find and fund, I would say.
354
00:18:02.210 --> 00:18:04.630
The second issue is the data sets
355
00:18:04.630 --> 00:18:06.690
are noisy and incomplete.
356
00:18:06.690 --> 00:18:10.980
So when we talk about essential ingredients of AI,
357
00:18:10.980 --> 00:18:15.370
in most companies, actually that data is extremely siloed,
358
00:18:15.370 --> 00:18:20.130
extremely difficult to join and often incomplete.
359
00:18:20.130 --> 00:18:24.980
And the third, which is a much more evolving vector, is
360
00:18:24.980 --> 00:18:29.740
that it has to be explainable in its end state, right?
361
00:18:29.740 --> 00:18:33.340
It has to be trustworthy as a stack.
362
00:18:33.340 --> 00:18:36.590
So what we actually found, and this is rapidly evolving
363
00:18:36.590 --> 00:18:38.610
and I think is going to be very true
364
00:18:38.610 --> 00:18:42.730
of most organizations, is to adopt AI as a service.
365
00:18:42.730 --> 00:18:45.730
Most companies I think can get very quickly started
366
00:18:45.730 --> 00:18:50.300
to your question, Sam, by adopting AI as a service
367
00:18:50.300 --> 00:18:53.380
and then asking a very simple question,
368
00:18:53.380 --> 00:18:58.230
what two or three problems can I solve better tomorrow
369
00:18:58.230 --> 00:19:02.100
employing the stack that I'm not doing currently?
370
00:19:02.100 --> 00:19:04.910
And there are very interesting outcomes when
371
00:19:04.910 --> 00:19:06.230
you start looking under the layer.
372
00:19:06.230 --> 00:19:07.460
So one of the problems, as I said,
373
00:19:07.460 --> 00:19:09.360
is a cold start problem for us.
374
00:19:09.360 --> 00:19:11.370
So we are working on a recommendation system
375
00:19:11.370 --> 00:19:14.700
which has been very successful, that utilizes a lot
376
00:19:14.700 --> 00:19:17.650
of neural learning and sort of learning
377
00:19:17.650 --> 00:19:21.540
with very thin data sets, to make inferences, right?
378
00:19:21.540 --> 00:19:25.120
The other place that we found is forecasting for example.
379
00:19:25.120 --> 00:19:27.520
You know, forecasting is a very difficult exercise,
380
00:19:27.520 --> 00:19:30.540
especially if you can imagine that, you know, for example
381
00:19:30.540 --> 00:19:34.720
Valentine's Day actually moves by day of the week, right?
382
00:19:34.720 --> 00:19:37.850
So last year it was a Friday, this year, it was a Sunday.
383
00:19:37.850 --> 00:19:40.430
And, you know, compared to Mother's Day,
384
00:19:40.430 --> 00:19:43.300
which is always on a Sunday, right?
385
00:19:43.300 --> 00:19:46.400
And that has very deep business implications
386
00:19:46.400 --> 00:19:47.580
as an outcome, right?
387
00:19:47.580 --> 00:19:50.920
So forecasting is a perfect candidate to put towards this.
388
00:19:50.920 --> 00:19:55.140
But the mindset again is are you testing
389
00:19:55.140 --> 00:19:56.910
and learning along the way?
390
00:19:56.910 --> 00:19:58.780
You know, in some cases, the early attempts
391
00:19:58.780 --> 00:20:00.930
at machine learning will be no better
392
00:20:00.930 --> 00:20:03.150
than your decision-based engines.
393
00:20:03.150 --> 00:20:06.110
But what we have seen is that actually persistence
394
00:20:06.110 --> 00:20:11.050
over, even the medium term, has very asymmetric payoffs
395
00:20:11.050 --> 00:20:14.070
and it is extremely important to evangelize
396
00:20:14.070 --> 00:20:15.930
and understand those payoffs.
397
00:20:15.930 --> 00:20:16.880
Because as I said,
398
00:20:16.880 --> 00:20:19.770
the context that most modern companies find themselves
399
00:20:19.770 --> 00:20:23.550
in is already awash in machine learning.
400
00:20:23.550 --> 00:20:25.840
So two of the three things you mentioned involve
401
00:20:25.840 --> 00:20:28.490
cross-platform; it's the idea
402
00:20:28.490 --> 00:20:30.150
of people working together.
403
00:20:30.150 --> 00:20:32.420
You mentioned it from several angles, from the data perspective
404
00:20:32.420 --> 00:20:34.750
and also from the team perspective, the tension is
405
00:20:34.750 --> 00:20:36.910
that everyone can't work on everything all the time.
406
00:20:36.910 --> 00:20:39.250
Otherwise that's not a team, that's the whole organization.
407
00:20:39.250 --> 00:20:41.680
So how do you set that up within your organization,
408
00:20:41.680 --> 00:20:45.240
so that you've got that nice blend of cross-functional
409
00:20:45.240 --> 00:20:48.430
but not everybody involved in everything?
410
00:20:48.430 --> 00:20:51.370
I would say, you know, to be brutally honest,
411
00:20:51.370 --> 00:20:54.840
it's a field of aspirational tensions.
412
00:20:54.840 --> 00:20:55.719
You know,
413
00:20:55.719 --> 00:20:58.660
when you are trying to shift mindsets over skill-sets,
414
00:20:58.660 --> 00:21:01.870
it's not about sort of how do you assemble teams
415
00:21:01.870 --> 00:21:04.060
and how do you get to a solution,
416
00:21:04.060 --> 00:21:07.740
but how do you ultimately sell your vision and how do you,
417
00:21:07.740 --> 00:21:11.070
how do you get people enthusiastically believing
418
00:21:11.070 --> 00:21:12.180
in that vision?
419
00:21:12.180 --> 00:21:14.470
So I would say our early attempts
420
00:21:14.470 --> 00:21:17.880
at sort of organizing were a lot more command
421
00:21:17.880 --> 00:21:20.527
and control. We were sort of saying that,
422
00:21:20.527 --> 00:21:22.580
"Hey, if you have data science background
423
00:21:22.580 --> 00:21:24.840
or you have analytics background
424
00:21:24.840 --> 00:21:26.780
maybe you are primed for this."
425
00:21:26.780 --> 00:21:27.700
I think over time
426
00:21:27.700 --> 00:21:30.600
what we have realized is that learning systems
427
00:21:30.600 --> 00:21:33.960
have a self-organizing principle at their core.
428
00:21:33.960 --> 00:21:36.640
So now we are thinking more about, as I was saying,
429
00:21:36.640 --> 00:21:40.250
the early days of just rolling out Excel to everyone.
430
00:21:40.250 --> 00:21:44.280
What if we rolled out AI as a service to everyone?
431
00:21:44.280 --> 00:21:46.810
You know, if someone is just making a schedule
432
00:21:46.810 --> 00:21:51.810
of meetings, do they get more empowered by AI as a service?
433
00:21:52.250 --> 00:21:56.510
You know, will they themselves find out some novel solutions
434
00:21:56.510 --> 00:21:59.950
to something that was completely not thought of
435
00:21:59.950 --> 00:22:02.670
as an important-enough problem, right?
436
00:22:02.670 --> 00:22:06.360
And the reason I say that, Sam, is not to suggest
437
00:22:06.360 --> 00:22:09.610
that there is not a cohesive sort of listing
438
00:22:09.610 --> 00:22:12.480
of problems that can be solved by AI
439
00:22:12.480 --> 00:22:15.500
and assembling cross-functional teams and doing that.
440
00:22:15.500 --> 00:22:19.610
I think that's the easier part, but what I'm suggesting,
441
00:22:19.610 --> 00:22:22.500
and you know, egging on my peer reference group
442
00:22:22.500 --> 00:22:27.060
to really think about is that the real empowerment
443
00:22:27.060 --> 00:22:30.330
and the real transformation in the mindset will come
444
00:22:30.330 --> 00:22:34.570
when you roll out AI to every end point, right?
445
00:22:34.570 --> 00:22:37.140
Like we don't think twice about rolling out email
446
00:22:37.140 --> 00:22:39.010
to every new employee.
447
00:22:39.010 --> 00:22:42.960
Why do we constrain and sort of self-limit ourselves
448
00:22:42.960 --> 00:22:47.700
to think about AI as only the domain of specialists, right?
449
00:22:47.700 --> 00:22:50.230
It's a problem solving methodology.
450
00:22:50.230 --> 00:22:52.040
It's a problem solving mindset.
451
00:22:52.040 --> 00:22:54.130
It's an operating system, we build apps on it.
452
00:22:54.130 --> 00:22:54.963
Exactly.
453
00:22:54.963 --> 00:22:55.890
And I think that's quite insightful
454
00:22:55.890 --> 00:22:59.900
because whatever you make available
455
00:22:59.900 --> 00:23:04.540
to smart and inquisitive people
456
00:23:04.540 --> 00:23:06.570
ends up becoming better.
457
00:23:06.570 --> 00:23:07.403
And that's
458
00:23:07.403 --> 00:23:09.680
a very good challenge to any organization.
459
00:23:09.680 --> 00:23:14.050
You know, why not have the suite of AI products self-service
460
00:23:14.050 --> 00:23:17.550
for the layman user to be able to do things with?
461
00:23:17.550 --> 00:23:18.500
To your point.
462
00:23:18.500 --> 00:23:19.940
One other thing that comes
463
00:23:19.940 --> 00:23:24.940
to mind is the importance of appreciating failures
464
00:23:25.420 --> 00:23:28.170
as essential input to better learning.
465
00:23:28.170 --> 00:23:29.350
I think what I find
466
00:23:29.350 --> 00:23:34.350
in adopting an AI-first mindset is a deep respect
467
00:23:34.720 --> 00:23:39.720
and celebration of failure as an organizational currency.
468
00:23:39.930 --> 00:23:41.610
If you think about the history
469
00:23:41.610 --> 00:23:46.600
of employees within an organization, all the origin stories
470
00:23:46.600 --> 00:23:50.110
and the stories thereafter are around successes
471
00:23:50.110 --> 00:23:51.770
but an AI-first mindset
472
00:23:51.770 --> 00:23:56.640
in my mind is how do you actually collectively embrace,
473
00:23:56.640 --> 00:23:59.500
you know, not by putting up posters that, you know, say
474
00:23:59.500 --> 00:24:01.720
run fast and fail fast.
475
00:24:01.720 --> 00:24:02.553
You know,
476
00:24:02.553 --> 00:24:06.380
those don't really change people's activities,
477
00:24:06.380 --> 00:24:08.550
their behaviors, and their acceptance
478
00:24:08.550 --> 00:24:13.550
of their career trajectory as much as celebrating failures.
479
00:24:13.630 --> 00:24:14.930
And the reason I say that is
480
00:24:14.930 --> 00:24:17.550
that all machine learning, all learning
481
00:24:17.550 --> 00:24:21.180
in the future actually has a very healthy equilibrium
482
00:24:21.180 --> 00:24:23.530
between outcomes that are successful and outcomes
483
00:24:23.530 --> 00:24:26.900
that fail, because outcomes that fail actually
484
00:24:26.900 --> 00:24:30.510
teach the system as much as outcomes that succeed.
485
00:24:30.510 --> 00:24:33.090
And I think it's a very important point on failure.
486
00:24:33.090 --> 00:24:35.140
How do you operationalize that?
487
00:24:35.140 --> 00:24:38.300
I pray a lot and I toss a coin a lot,
488
00:24:38.300 --> 00:24:40.860
but, you know, it's a very important question.
489
00:24:40.860 --> 00:24:43.490
I think it has to start from the leadership.
490
00:24:43.490 --> 00:24:45.440
I think it has to start from a very,
491
00:24:45.440 --> 00:24:49.410
very human manifestation
492
00:24:49.410 --> 00:24:53.600
of how decisions are extremely difficult.
493
00:24:53.600 --> 00:24:57.850
And even for leaders to be very open about when
494
00:24:57.850 --> 00:25:02.410
their decisions did not lead to successful outcomes.
495
00:25:02.410 --> 00:25:04.980
So I think one of the key learnings
496
00:25:04.980 --> 00:25:08.360
in my life, which I've tried to follow very deeply,
497
00:25:08.360 --> 00:25:10.630
is around radical transparency,
498
00:25:10.630 --> 00:25:13.500
around making sure that people appreciate,
499
00:25:13.500 --> 00:25:16.810
that these were the reasons I took a certain decision
500
00:25:16.810 --> 00:25:20.390
and that I'm open enough at the end of it
501
00:25:20.390 --> 00:25:22.560
for any inputs, right?
502
00:25:22.560 --> 00:25:25.500
Whether it went successfully or it didn't go successfully.
503
00:25:25.500 --> 00:25:27.610
So that is one way of operationalizing it when
504
00:25:27.610 --> 00:25:30.510
the leadership starts living out that outcome.
505
00:25:30.510 --> 00:25:32.580
The second, I think very important part
506
00:25:32.580 --> 00:25:35.940
of it is how do you incentivize that outcome?
507
00:25:35.940 --> 00:25:37.280
So for example, you know,
508
00:25:37.280 --> 00:25:41.000
we have a constant red team, as we call it internally,
509
00:25:41.000 --> 00:25:44.000
that runs up against the main growth team, for example.
510
00:25:44.000 --> 00:25:45.980
So if the main growth team has a $100 million
511
00:25:45.980 --> 00:25:48.610
to spend on marketing, I give 10%
512
00:25:48.610 --> 00:25:50.920
to a red team that is actually going
513
00:25:50.920 --> 00:25:53.360
against the conventional wisdom.
514
00:25:53.360 --> 00:25:54.480
And the reason it is going
515
00:25:54.480 --> 00:25:56.760
against the conventional wisdom is actually to build
516
00:25:56.760 --> 00:26:00.240
up a corpus of failures that then can act
517
00:26:00.240 --> 00:26:02.750
as a foil to what we learned
518
00:26:02.750 --> 00:26:04.350
from spending that $100 million.
519
00:26:05.370 --> 00:26:07.500
And this is a very important part again
520
00:26:07.500 --> 00:26:11.880
of increasing the collective LQ of the team, right?
521
00:26:11.880 --> 00:26:14.450
Because if everything is done by consensus,
522
00:26:14.450 --> 00:26:17.880
we know from behavioral economics and a lot of studies done,
523
00:26:17.880 --> 00:26:21.610
it is not the best decision-making outcome, right?
524
00:26:21.610 --> 00:26:23.290
So that is one example of it.
525
00:26:23.290 --> 00:26:26.010
So how do you set up team structures and incentives?
526
00:26:26.010 --> 00:26:27.640
And then the last thing I would say,
527
00:26:27.640 --> 00:26:30.890
which has been a learning mode of late to me,
528
00:26:30.890 --> 00:26:33.040
is how do you actually translate
529
00:26:33.040 --> 00:26:37.330
that into ESG or ethical goals?
530
00:26:37.330 --> 00:26:40.980
Because what I have seen with the newer cohort
531
00:26:40.980 --> 00:26:44.490
of employees, of stakeholders, that we have had
532
00:26:44.490 --> 00:26:48.450
is that it is not so much just about learning,
533
00:26:48.450 --> 00:26:52.170
but learning within a context that I believe in.
534
00:26:52.170 --> 00:26:55.040
So my newer understanding more and more
535
00:26:55.040 --> 00:26:59.640
has to be around, like, hey, if we ingest AI models,
536
00:26:59.640 --> 00:27:03.490
are they explainable, are they de-biased?
537
00:27:03.490 --> 00:27:04.550
Can I make sure
538
00:27:04.550 --> 00:27:07.620
that the team appreciates that certain choices
539
00:27:07.620 --> 00:27:11.910
that we may make may not have the immediate business payoff
540
00:27:11.910 --> 00:27:13.970
but are actually much better aligned
541
00:27:13.970 --> 00:27:15.653
with our vision and our mission?
542
00:27:16.880 --> 00:27:18.940
Well, we started this discussion talking about mom
543
00:27:18.940 --> 00:27:22.380
and pop flower shops, and that resonates with me, actually.
544
00:27:22.380 --> 00:27:24.650
I didn't mention it, but my mom owned a flower shop.
545
00:27:24.650 --> 00:27:27.580
So mom and pop is actually literal in this case.
546
00:27:27.580 --> 00:27:29.380
Amit, we really appreciate you taking the time
547
00:27:29.380 --> 00:27:31.000
to talk with us today.
548
00:27:31.000 --> 00:27:32.810
Thanks for spending some time with us.
549
00:27:32.810 --> 00:27:33.840
Yeah. Thank you so much.
550
00:27:33.840 --> 00:27:35.100
This has been very insightful.
551
00:27:35.100 --> 00:27:35.933
Thank you.
552
00:27:35.933 --> 00:27:36.880
I love this conversation.
553
00:27:36.880 --> 00:27:37.730
Thank you guys.
554
00:27:37.730 --> 00:27:38.664
Appreciate it.
555
00:27:38.664 --> 00:27:42.164
(relaxed technical music)
556
00:27:44.240 --> 00:27:45.073
Sure enough.
557
00:27:45.073 --> 00:27:46.810
It was quite interesting.
558
00:27:46.810 --> 00:27:48.940
One thing that struck me was how, you know,
559
00:27:48.940 --> 00:27:49.773
we talk about, "Oh yeah
560
00:27:49.773 --> 00:27:52.140
the machines can learn from the past, et cetera, et cetera."
561
00:27:52.140 --> 00:27:54.100
But how, for them, every scenario is a bit
562
00:27:54.100 --> 00:27:55.690
of a cold start problem
563
00:27:55.690 --> 00:27:58.460
because, you know, every holiday is different.
564
00:27:58.460 --> 00:28:01.060
Every time someone comes to them, they're getting something
565
00:28:01.060 --> 00:28:03.670
for a different reason and it wouldn't be a cold start
566
00:28:03.670 --> 00:28:06.660
if they knew the underlying reasons, but they don't always.
567
00:28:06.660 --> 00:28:08.050
When we go to, you know,
568
00:28:08.050 --> 00:28:10.390
any of the normal collaborative filtering platforms
569
00:28:10.390 --> 00:28:13.660
like a Netflix or other places, or even transportation
570
00:28:13.660 --> 00:28:16.080
like Uber and Lyft, those people have a much
571
00:28:16.080 --> 00:28:18.460
better ability to build off our history
572
00:28:18.460 --> 00:28:20.670
than 1-800-Flowers does.
573
00:28:20.670 --> 00:28:22.500
It is at once, you know, a cold start,
574
00:28:22.500 --> 00:28:26.400
and tying that to how emotionally aware they need to be.
575
00:28:26.400 --> 00:28:28.420
Because by definition,
576
00:28:28.420 --> 00:28:32.010
these are very human experiences that they're involved in.
577
00:28:32.010 --> 00:28:34.430
If they screw that up, that's not good.
578
00:28:34.430 --> 00:28:37.590
Yeah, also it's a cold start, which by definition
579
00:28:37.590 --> 00:28:39.920
means it's a learning opportunity
580
00:28:39.920 --> 00:28:41.760
'cause a cold start problem is the same
581
00:28:41.760 --> 00:28:43.270
as a learning problem.
582
00:28:43.270 --> 00:28:45.390
And if you have many cold start problems,
583
00:28:45.390 --> 00:28:47.050
then isn't that another way
584
00:28:47.050 --> 00:28:50.760
of saying you basically have to be comfortable
585
00:28:50.760 --> 00:28:54.160
with a very accelerated rate of learning
586
00:28:54.160 --> 00:28:57.090
and that's your success, 'cause otherwise, yes
587
00:28:57.090 --> 00:29:00.900
everything is a sympathy or everything is
588
00:29:00.900 --> 00:29:04.220
that one demographic that I really really know.
589
00:29:04.220 --> 00:29:08.410
rather than being adaptive to all those situations.
590
00:29:08.410 --> 00:29:10.460
The other thing that is an emerging theme
591
00:29:10.460 --> 00:29:11.800
in our research that he talked
592
00:29:11.800 --> 00:29:16.140
about a lot was the notion of learning quotient, right?
593
00:29:16.140 --> 00:29:20.590
You know, we asked him about teams and he said,
594
00:29:20.590 --> 00:29:25.590
what they care about a lot is an individual's desire
595
00:29:25.930 --> 00:29:29.240
and willingness and ability to want to learn.
596
00:29:29.240 --> 00:29:33.740
And that fits so well with what AI itself is
597
00:29:33.740 --> 00:29:37.760
which is it's all about learning and the notion
598
00:29:37.760 --> 00:29:41.150
of human and machine learning from each other
599
00:29:41.150 --> 00:29:43.040
which is also the theme of our work.
600
00:29:43.040 --> 00:29:45.610
I found it quite insightful that he picked up on that.
601
00:29:45.610 --> 00:29:48.760
And in many ways it sort of also fits
602
00:29:48.760 --> 00:29:51.590
into his point around mindset
603
00:29:51.590 --> 00:29:55.320
and culture change because he also talked about, you know,
604
00:29:55.320 --> 00:29:58.650
it's not so much about the skill-set or the tech,
605
00:29:58.650 --> 00:30:01.720
it's much more about changing the ways of working,
606
00:30:01.720 --> 00:30:05.130
and changing the operating model and changing the mindset
607
00:30:05.130 --> 00:30:10.130
of what you can and should do with AI, with this tool
608
00:30:10.340 --> 00:30:14.640
and capability that he thought would just be as commonplace
609
00:30:14.640 --> 00:30:17.950
as an Excel spreadsheet, that is now pretty commonplace.
610
00:30:17.950 --> 00:30:18.783
Exactly.
611
00:30:18.783 --> 00:30:21.950
And so the importance of learning and ongoing learning
612
00:30:21.950 --> 00:30:24.633
and adaptability, I thought was quite elegant
613
00:30:24.633 --> 00:30:26.030
in what he said.
614
00:30:26.030 --> 00:30:27.620
Well, you're not gonna get an argument from me.
615
00:30:27.620 --> 00:30:29.830
I mean, I'm, I'm a professor, I'm an academic.
616
00:30:29.830 --> 00:30:30.663
So I think that.
617
00:30:30.663 --> 00:30:32.230
Yeah, you love learning, right.
618
00:30:32.230 --> 00:30:35.130
I'm biased to think that learning is kind of a big thing
619
00:30:36.030 --> 00:30:38.090
but even more than that, he also mentioned a little bit
620
00:30:38.090 --> 00:30:40.810
the importance of liberal arts thinking in that learning.
621
00:30:40.810 --> 00:30:44.490
You and I, we make fun of our engineering backgrounds a lot
622
00:30:44.490 --> 00:30:46.630
but as we're seeing these technologies get easier
623
00:30:46.630 --> 00:30:48.130
and easier to use,
624
00:30:48.130 --> 00:30:50.660
it's really highlighting the importance of the human,
625
00:30:50.660 --> 00:30:53.580
and importance of the human working with the machine.
626
00:30:53.580 --> 00:30:55.640
I think, you know, if we go back 20 or 30 years ago,
627
00:30:55.640 --> 00:30:57.860
there was so much talk about the death of IT,
628
00:30:57.860 --> 00:31:00.720
and "IT doesn't matter," you remember that phase.
629
00:31:00.720 --> 00:31:01.553
Yeah.
630
00:31:01.553 --> 00:31:03.310
But, nope, that didn't happen one bit.
631
00:31:03.310 --> 00:31:05.850
I mean, as IT became easier to use,
632
00:31:05.850 --> 00:31:07.800
companies just wanted more and more of it.
633
00:31:07.800 --> 00:31:10.320
And this is the natural extension of that.
634
00:31:10.320 --> 00:31:12.380
Yeah, and I think there's this notion
635
00:31:12.380 --> 00:31:17.380
of technology raising the playing field
636
00:31:17.380 --> 00:31:20.750
so that humans can operate at a higher level,
637
00:31:20.750 --> 00:31:23.410
and then humans inventing better technology
638
00:31:23.410 --> 00:31:26.910
so that, that level again keeps, you know, getting raised.
639
00:31:26.910 --> 00:31:29.860
I think that's sort of a common theme
640
00:31:29.860 --> 00:31:31.930
that's happened with technology.
641
00:31:31.930 --> 00:31:33.773
You know, actually chess is a great example of that,
642
00:31:33.773 --> 00:31:37.120
because if you look at how 20 years ago,
643
00:31:37.120 --> 00:31:39.280
you talked about, Sam, 20 years ago, the death of IT.
644
00:31:39.280 --> 00:31:42.220
Like 20 years ago, 25, 30 years ago,
645
00:31:42.220 --> 00:31:44.870
it was almost like the death of the computer in chess
646
00:31:44.870 --> 00:31:47.810
because it was like argued that there's no way.
647
00:31:47.810 --> 00:31:48.643
Yeah. Right.
648
00:31:48.643 --> 00:31:51.720
Like no way a human could be beaten by a computer.
649
00:31:51.720 --> 00:31:54.300
And then the game changed when, yeah.
650
00:31:54.300 --> 00:31:58.170
When Kasparov lost to Deep Blue,
651
00:31:58.170 --> 00:32:02.760
but then what happened is chess players got smarter.
652
00:32:02.760 --> 00:32:06.650
So the chess Elo rating, the highest rating
653
00:32:06.650 --> 00:32:07.960
of the top chess players
654
00:32:07.960 --> 00:32:10.480
has been, you know, steadily increasing.
655
00:32:10.480 --> 00:32:11.313
Right.
656
00:32:11.313 --> 00:32:12.480
Because of how, you know,
657
00:32:12.480 --> 00:32:15.203
AI has helped humans get smarter.
658
00:32:16.630 --> 00:32:17.620
Thanks for listening.
659
00:32:17.620 --> 00:32:19.310
Next time, we'll talk with JoAnn Stonier,
660
00:32:19.310 --> 00:32:20.930
Chief Data Officer at MasterCard,
661
00:32:20.930 --> 00:32:23.920
about how MasterCard uses design thinking to ensure its use
662
00:32:23.920 --> 00:32:26.563
of AI supports its overall business strategy.
663
00:32:27.660 --> 00:32:30.460
Thanks for listening to Me, Myself, and AI.
664
00:32:30.460 --> 00:32:32.050
If you're enjoying the show,
665
00:32:32.050 --> 00:32:34.180
take a minute to write us a review.
666
00:32:34.180 --> 00:32:35.800
If you send us a screenshot,
667
00:32:35.800 --> 00:32:39.100
we'll send you a collection of MIT SMR's best articles
668
00:32:39.100 --> 00:32:42.650
on artificial intelligence free for a limited time.
669
00:32:42.650 --> 00:32:47.353
Send your review screenshot to smrfeedback@mit.edu.