WEBVTT
1
00:00:01.460 --> 00:00:05.060
Of course managers can change processes to use AI,
2
00:00:05.060 --> 00:00:08.030
but how does adopting AI change organizations?
3
00:00:08.030 --> 00:00:10.145
AI is a force of change, but
4
00:00:10.145 --> 00:00:14.490
change is not easy, and it's got to be a learning process.
5
00:00:14.490 --> 00:00:16.363
In this episode, Gina Chung of DHL
6
00:00:16.363 --> 00:00:19.800
relates how adopting AI can shift a corporate culture
7
00:00:19.800 --> 00:00:21.780
to embrace innovation.
8
00:00:21.780 --> 00:00:23.630
Welcome to Me, Myself, and AI,
9
00:00:23.630 --> 00:00:26.550
a podcast on artificial intelligence and business.
10
00:00:26.550 --> 00:00:29.750
Each week we introduce you to someone innovating with AI.
11
00:00:29.750 --> 00:00:32.152
I'm Sam Ransbotham, professor of information systems
12
00:00:32.152 --> 00:00:33.900
at Boston College,
13
00:00:33.900 --> 00:00:35.587
and I'm also the guest editor for the AI
14
00:00:35.587 --> 00:00:38.130
and Business Strategy Big Idea program
15
00:00:38.130 --> 00:00:39.988
at MIT Sloan Management Review.
16
00:00:39.988 --> 00:00:41.820
And I'm Shervin Khodabandeh,
17
00:00:41.820 --> 00:00:45.947
senior partner with BCG, and I co-lead BCG's AI practice
18
00:00:45.947 --> 00:00:47.670
in North America.
19
00:00:47.670 --> 00:00:52.670
And together, BCG and MIT SMR have been researching AI
20
00:00:52.670 --> 00:00:56.350
for four years, interviewing hundreds of practitioners,
21
00:00:56.350 --> 00:00:58.480
and surveying thousands of companies
22
00:00:58.480 --> 00:01:01.810
on what it takes to build and deploy
23
00:01:01.810 --> 00:01:04.350
and scale AI capabilities
24
00:01:04.350 --> 00:01:07.960
and really transform the way organizations operate.
25
00:01:07.960 --> 00:01:09.360
So in the last couple of episodes,
26
00:01:09.360 --> 00:01:12.320
we've talked with Walmart, we've talked with Humana,
27
00:01:12.320 --> 00:01:13.470
and I'm pretty excited today
28
00:01:13.470 --> 00:01:15.583
we're talking with Gina Chung from DHL.
29
00:01:16.840 --> 00:01:18.193
Hi, Gina, welcome to the show.
30
00:01:18.193 --> 00:01:20.120
Can you take a minute and introduce yourself
31
00:01:20.120 --> 00:01:22.170
and tell us a little bit about your role?
32
00:01:22.170 --> 00:01:23.420
Hi, I'm Gina Chung.
33
00:01:23.420 --> 00:01:26.820
I head up innovation for DHL in the Americas region,
34
00:01:26.820 --> 00:01:27.930
and as a part of this role,
35
00:01:27.930 --> 00:01:31.440
I also operate our innovation center out here in Chicago
36
00:01:31.440 --> 00:01:33.890
that's focused on helping supply chain leaders
37
00:01:33.890 --> 00:01:36.970
leverage technologies like AI, robotics,
38
00:01:36.970 --> 00:01:39.850
and wearables in our global operations.
39
00:01:39.850 --> 00:01:40.683
So how did you get there?
40
00:01:40.683 --> 00:01:42.920
How did you end up at that position?
41
00:01:42.920 --> 00:01:44.410
I might actually answer this
42
00:01:44.410 --> 00:01:46.181
by starting back in college.
43
00:01:46.181 --> 00:01:49.104
So I started college wanting to be an investment banker
44
00:01:49.104 --> 00:01:51.899
and very quickly figured out that's not for me.
45
00:01:51.899 --> 00:01:53.930
But I took a supply chain course
46
00:01:53.930 --> 00:01:55.560
and ended up becoming fascinated
47
00:01:55.560 --> 00:01:57.046
by how things get manufactured
48
00:01:57.046 --> 00:01:59.920
and how things get distributed.
49
00:01:59.920 --> 00:02:01.500
I think it's something to do with the fact that
50
00:02:01.500 --> 00:02:04.124
I'm from New Zealand and grew up in a pretty isolated part
51
00:02:04.124 --> 00:02:05.660
of the world.
52
00:02:05.660 --> 00:02:08.032
But anyway, after college I joined DHL
53
00:02:08.032 --> 00:02:10.010
at their headquarters in Germany.
54
00:02:10.010 --> 00:02:11.950
I helped launch, eight years ago,
55
00:02:11.950 --> 00:02:14.620
some of our very first projects working with startups
56
00:02:14.620 --> 00:02:15.960
in our operations.
57
00:02:15.960 --> 00:02:19.440
And a few years ago, I then had the pleasure
58
00:02:19.440 --> 00:02:22.680
of launching our third innovation center
59
00:02:22.680 --> 00:02:25.603
that serves the Americas region out here in Chicago.
60
00:02:26.600 --> 00:02:28.160
Actually, I think we could just end right there.
61
00:02:28.160 --> 00:02:29.810
I love it when someone gets converted
62
00:02:29.810 --> 00:02:33.040
from investment banker to supply chain and operations.
63
00:02:33.040 --> 00:02:34.440
I think that's great.
64
00:02:34.440 --> 00:02:37.290
So can you give us an example of a project
65
00:02:37.290 --> 00:02:40.730
that your team has applied a technology like AI to?
66
00:02:40.730 --> 00:02:43.280
Yeah, so one project that we've completed
67
00:02:43.280 --> 00:02:47.160
using AI and computer vision is automating
68
00:02:47.160 --> 00:02:49.640
the inspection of pallets in our world.
69
00:02:49.640 --> 00:02:53.080
So today, our operators have to see whether
70
00:02:53.080 --> 00:02:55.310
you can stack one pallet on top of another
71
00:02:55.310 --> 00:02:58.070
and that might seem very trivial, but actually,
72
00:02:58.070 --> 00:02:59.941
it's sometimes very difficult to identify
73
00:02:59.941 --> 00:03:02.691
whether that bottom pallet's going to be damaged,
74
00:03:02.691 --> 00:03:05.280
and you have to look for certain markers,
75
00:03:05.280 --> 00:03:06.660
certain indications.
76
00:03:06.660 --> 00:03:09.170
And through combining a camera vision system
77
00:03:09.170 --> 00:03:12.285
with AI software, we're able to automate that process
78
00:03:12.285 --> 00:03:14.960
and reduce potential damages
79
00:03:14.960 --> 00:03:19.580
as well as optimize utilization in our aircraft.
80
00:03:19.580 --> 00:03:21.410
So who uses this system?
81
00:03:21.410 --> 00:03:23.013
Who uses it? It's our operations.
82
00:03:23.013 --> 00:03:25.860
So people on the shop floor that are helping
83
00:03:25.860 --> 00:03:27.480
to load our aircraft.
84
00:03:27.480 --> 00:03:29.170
The pallets pass through our system,
85
00:03:29.170 --> 00:03:31.570
it flags if the pallet can't be stacked,
86
00:03:31.570 --> 00:03:33.770
and then our operators are able to see that
87
00:03:33.770 --> 00:03:36.541
and then take that pallet out and give it the right marker
88
00:03:36.541 --> 00:03:38.420
to say that it can't be stacked.
89
00:03:38.420 --> 00:03:41.210
And then there are some other steps in that process
90
00:03:41.210 --> 00:03:43.910
to deal with a pallet that can't be stacked.
91
00:03:43.910 --> 00:03:45.940
Before, somebody would have to be trained
92
00:03:45.940 --> 00:03:48.690
on how to identify whether a pallet can be stacked or not.
93
00:03:48.690 --> 00:03:50.490
So they had to be trained to look out for these
94
00:03:50.490 --> 00:03:53.590
kinds of markers, these kinds of indentations,
95
00:03:53.590 --> 00:03:55.470
and then each pallet as it comes through,
96
00:03:55.470 --> 00:03:56.850
you'd have to kind of walk around it
97
00:03:56.850 --> 00:03:58.965
and make a note and type it into the system.
98
00:03:58.965 --> 00:04:01.687
But now, we can actually automate that process
99
00:04:01.687 --> 00:04:05.030
using AI and computer vision.
100
00:04:05.030 --> 00:04:08.640
This is a great example of how AI
101
00:04:08.640 --> 00:04:11.773
is taking unnecessary human roles away,
102
00:04:11.773 --> 00:04:15.820
probably even increasing the accuracy and precision,
103
00:04:15.820 --> 00:04:18.390
I would assume, even picking up on things
104
00:04:18.390 --> 00:04:20.380
that humans might've missed.
105
00:04:20.380 --> 00:04:25.100
Can you comment on the process that you guys go through
106
00:04:25.100 --> 00:04:27.430
to make that AI engine intelligent?
107
00:04:27.430 --> 00:04:29.190
I'm assuming there were humans involved
108
00:04:29.190 --> 00:04:30.540
as it was being designed.
109
00:04:30.540 --> 00:04:34.173
Can you comment on how that worked?
110
00:04:34.173 --> 00:04:37.610
Absolutely, so, you know, I always say AI
111
00:04:37.610 --> 00:04:39.830
is a very broad term, so, you know, you can use AI
112
00:04:39.830 --> 00:04:41.370
in robotics.
113
00:04:41.370 --> 00:04:42.860
You can use AI with computer vision.
114
00:04:42.860 --> 00:04:44.757
You can use AI algorithmically.
115
00:04:44.757 --> 00:04:47.769
For this particular use case, we worked with a partner,
116
00:04:47.769 --> 00:04:51.292
a startup actually, and together with our operations
117
00:04:51.292 --> 00:04:53.679
and the startup, we developed the algorithm
118
00:04:53.679 --> 00:04:56.060
for this particular use case,
119
00:04:56.060 --> 00:04:58.185
and it was designed with a lot of images initially.
120
00:04:58.185 --> 00:05:00.382
So just collecting image after image of pallets,
121
00:05:00.382 --> 00:05:03.870
working with our operations to train that algorithm
122
00:05:03.870 --> 00:05:07.187
to look out for these specific markers and indentations.
123
00:05:07.187 --> 00:05:08.629
And then after it's deployed,
124
00:05:08.629 --> 00:05:10.810
there's, you know, a recommendation, right,
125
00:05:10.810 --> 00:05:12.865
that this pallet is not stackable,
126
00:05:12.865 --> 00:05:16.110
and it's also up to our workers to trust in that.
127
00:05:16.110 --> 00:05:19.201
If they think it's inaccurate, they can make a marking,
128
00:05:19.201 --> 00:05:21.850
and we'll look at it and see if the AI,
129
00:05:21.850 --> 00:05:23.700
the algorithm, needs to be improved.
130
00:05:23.700 --> 00:05:25.227
So we have that loop in the process
131
00:05:25.227 --> 00:05:27.840
to continuously train the algorithm
132
00:05:27.840 --> 00:05:29.789
because our shipments come in all different
133
00:05:29.789 --> 00:05:31.870
shapes and sizes.
134
00:05:31.870 --> 00:05:33.040
That is a great point,
135
00:05:33.040 --> 00:05:35.950
because as we know, without that loop,
136
00:05:35.950 --> 00:05:37.210
it's going to make mistakes
137
00:05:37.210 --> 00:05:38.490
and those mistakes will compound.
138
00:05:38.490 --> 00:05:41.084
So really interesting how that loop is made.
139
00:05:41.084 --> 00:05:44.368
So on day one, when you turn the system on,
140
00:05:44.368 --> 00:05:46.304
what kind of reaction did you get from people?
141
00:05:46.304 --> 00:05:47.540
How did people feel?
142
00:05:47.540 --> 00:05:50.430
Yeah, we always like to say the first day for AI
143
00:05:50.430 --> 00:05:51.410
is the worst day.
144
00:05:51.410 --> 00:05:54.234
And what we mean by that is, you know, the algorithm,
145
00:05:54.234 --> 00:05:56.770
it only gets more accurate over time as you
146
00:05:56.770 --> 00:05:58.210
ingest more and more data and more
147
00:05:58.210 --> 00:05:59.810
and more different exceptions.
148
00:05:59.810 --> 00:06:02.589
So when we turn on the AI, especially during a pilot,
149
00:06:02.589 --> 00:06:04.700
the accuracy looks pretty low
150
00:06:04.700 --> 00:06:06.337
and people start to question,
151
00:06:06.337 --> 00:06:09.310
"Hey, I don't think that the AI can actually do this."
152
00:06:09.310 --> 00:06:11.709
But then as the pilot goes on week after week
153
00:06:11.709 --> 00:06:15.520
and ingests more and more data, and also learns from our
154
00:06:15.520 --> 00:06:17.857
workers, the accuracy drastically increases.
155
00:06:17.857 --> 00:06:20.142
And then people start to really believe in it
156
00:06:20.142 --> 00:06:23.010
and that starts to make their lives easier.
157
00:06:23.010 --> 00:06:25.717
So we try to focus on automating those activities
158
00:06:25.717 --> 00:06:28.444
that are really tedious, repetitive.
159
00:06:28.444 --> 00:06:32.070
We try to also build the accuracy to such a confidence level
160
00:06:32.070 --> 00:06:34.100
that people trust it and embrace it rather than,
161
00:06:34.100 --> 00:06:37.410
you know, the AI spitting out recommendations
162
00:06:37.410 --> 00:06:39.718
that people know aren't correct, yeah.
163
00:06:39.718 --> 00:06:41.813
I think that's a great point, Gina.
164
00:06:41.813 --> 00:06:46.120
The idea that it's not being forced on people by brute force,
165
00:06:46.120 --> 00:06:48.720
but the users are involved from the beginning
166
00:06:48.720 --> 00:06:52.250
in the design, and they actually see it doing better than
167
00:06:52.250 --> 00:06:53.830
their own judgment.
168
00:06:53.830 --> 00:06:56.955
I could imagine if you guys did it differently,
169
00:06:56.955 --> 00:07:00.062
maybe in a more sort of old-school way of saying,
170
00:07:00.062 --> 00:07:02.547
"Well, this is the best thing because it's using
171
00:07:02.547 --> 00:07:05.637
all the algorithms and all the signals
172
00:07:05.637 --> 00:07:07.087
and it knows more than you.
173
00:07:07.087 --> 00:07:10.147
And if you don't use that, we're going to take points off
174
00:07:10.147 --> 00:07:11.930
of you or whatever,"
175
00:07:11.930 --> 00:07:14.120
the kind of backlash you would have gotten.
176
00:07:14.120 --> 00:07:16.020
So it's really great to hear that.
177
00:07:16.020 --> 00:07:18.028
Yeah, I think it's very important
178
00:07:18.028 --> 00:07:22.300
to have that option available for the end-users, right?
179
00:07:22.300 --> 00:07:24.080
So that, you know, you have a lot of people
180
00:07:24.080 --> 00:07:26.320
in your workforce that are experts at what they do,
181
00:07:26.320 --> 00:07:28.440
and they've been doing it for years and years.
182
00:07:28.440 --> 00:07:31.270
So the tools that we're introducing are there to aid
183
00:07:31.270 --> 00:07:33.460
our workforce and our employees.
184
00:07:33.460 --> 00:07:36.340
So that's something that we have always kept front of mind
185
00:07:36.340 --> 00:07:39.098
as we drive our AI agenda at DHL.
186
00:07:39.098 --> 00:07:40.590
So you mentioned the users,
187
00:07:40.590 --> 00:07:43.380
how much do users need to understand that this is
188
00:07:43.380 --> 00:07:46.718
AI versus just a computer program?
189
00:07:46.718 --> 00:07:49.050
How do you get people to accept
190
00:07:49.050 --> 00:07:50.859
some sort of recommendations?
191
00:07:50.859 --> 00:07:53.770
Do you do a lot of training or how does that work?
192
00:07:53.770 --> 00:07:55.020
Do they need to know if it's AI,
193
00:07:55.020 --> 00:07:56.280
I guess is one way of phrasing that?
194
00:07:56.280 --> 00:07:59.510
I think people are interested to know if it's AI,
195
00:07:59.510 --> 00:08:01.290
but a lot of the time people just want
196
00:08:01.290 --> 00:08:02.900
to get on with it, right?
197
00:08:02.900 --> 00:08:05.719
So our customers, you know, they ask us for a solution.
198
00:08:05.719 --> 00:08:07.927
They don't want to understand in detail
199
00:08:07.927 --> 00:08:09.547
how that model was developed,
200
00:08:09.547 --> 00:08:11.300
how we developed that algorithm,
201
00:08:11.300 --> 00:08:12.870
what type of techniques were used.
202
00:08:12.870 --> 00:08:14.410
They just want something that works,
203
00:08:14.410 --> 00:08:16.840
that's reliable and that, you know, is at the price point
204
00:08:16.840 --> 00:08:18.190
that meets their needs.
205
00:08:18.190 --> 00:08:20.250
And the same goes for our operations,
206
00:08:20.250 --> 00:08:22.698
our kind of end-users of some of our AI tools.
207
00:08:22.698 --> 00:08:25.220
They just want to have something that's easy
208
00:08:25.220 --> 00:08:27.250
to use that makes their lives easier
209
00:08:27.250 --> 00:08:28.740
and that they can trust.
210
00:08:28.740 --> 00:08:31.230
And if all those things are there they don't want to,
211
00:08:31.230 --> 00:08:34.940
you know, dig into understanding what latest ML technique
212
00:08:34.940 --> 00:08:37.060
was deployed to make that happen.
213
00:08:37.060 --> 00:08:40.320
A key part of the success is change management.
214
00:08:40.320 --> 00:08:42.208
Many of the technologies that we're introducing
215
00:08:42.208 --> 00:08:44.090
into our operations,
216
00:08:44.090 --> 00:08:46.911
they're designed to make the lives of our workforce easier.
217
00:08:46.911 --> 00:08:49.610
So I think in the past, you know, change management,
218
00:08:49.610 --> 00:08:50.500
yes, it's important.
219
00:08:50.500 --> 00:08:54.307
Yes, we need to focus on culture, changing communications,
220
00:08:54.307 --> 00:08:55.627
changing our processes,
221
00:08:55.627 --> 00:08:58.580
but over time, I think we've learned as a company,
222
00:08:58.580 --> 00:09:01.851
just how important and how critical change management is.
223
00:09:01.851 --> 00:09:04.955
Especially when you're introducing cutting-edge AI,
224
00:09:04.955 --> 00:09:06.490
cutting-edge robotics,
225
00:09:06.490 --> 00:09:07.970
these are completely new forms of
226
00:09:07.970 --> 00:09:10.447
human-machine interaction and collaboration.
227
00:09:10.447 --> 00:09:13.990
So that is something that is always top of mind
228
00:09:13.990 --> 00:09:15.919
in our innovation initiatives.
229
00:09:15.919 --> 00:09:18.199
Could you share some other examples
230
00:09:18.199 --> 00:09:21.390
of use cases with AI?
231
00:09:21.390 --> 00:09:23.170
Yes, so I have a couple of ones
232
00:09:23.170 --> 00:09:25.820
that are really exciting, actually, that tie in a bit
233
00:09:25.820 --> 00:09:28.210
to keeping humans in the loop.
234
00:09:28.210 --> 00:09:29.910
So we're also working with a startup
235
00:09:29.910 --> 00:09:33.560
on implementing AI-driven route optimization
236
00:09:33.560 --> 00:09:35.356
in last-mile delivery and pickup.
237
00:09:35.356 --> 00:09:38.239
So there, again, we look at leveraging the data,
238
00:09:38.239 --> 00:09:42.023
looking into the route, looking into other external factors
239
00:09:42.023 --> 00:09:44.927
to optimize the best path for pickup and delivery,
240
00:09:44.927 --> 00:09:47.080
and these pickup and delivery requests
241
00:09:47.080 --> 00:09:48.520
come throughout the day.
242
00:09:48.520 --> 00:09:51.150
So it's constantly optimizing the route,
243
00:09:51.150 --> 00:09:53.375
and our drivers then actually get the recommendation
244
00:09:53.375 --> 00:09:57.418
on their kind of tablet or on their phones in the vehicle.
245
00:09:57.418 --> 00:10:00.870
They can either choose to follow that recommendation,
246
00:10:00.870 --> 00:10:03.341
or they can actually choose to not follow the recommendation
247
00:10:03.341 --> 00:10:06.030
because, you know, they've driven these routes for years
248
00:10:06.030 --> 00:10:09.010
and years, and some of them will just know the best way
249
00:10:09.010 --> 00:10:10.330
for various different reasons.
250
00:10:10.330 --> 00:10:13.025
So they're able to follow it or not follow it.
251
00:10:13.025 --> 00:10:15.634
If they don't, we can then try to understand why,
252
00:10:15.634 --> 00:10:18.730
and again, improve that algorithm for maybe a driver
253
00:10:18.730 --> 00:10:21.570
that's brand new and doesn't have that tribal knowledge.
254
00:10:21.570 --> 00:10:24.800
That's really a great example, yep.
255
00:10:24.800 --> 00:10:27.810
Somebody was making that routing decision before,
256
00:10:27.810 --> 00:10:30.120
and then you've introduced an AI element to it.
257
00:10:30.120 --> 00:10:32.780
I'm guessing a lot of that may have been automated before,
258
00:10:32.780 --> 00:10:35.737
but so how do people react when you say,
259
00:10:35.737 --> 00:10:37.687
"All right now, I want you to follow what this computer
260
00:10:37.687 --> 00:10:38.990
is telling you to do"?
261
00:10:38.990 --> 00:10:41.110
Are people thrilled?
262
00:10:41.110 --> 00:10:42.280
Are they angry?
263
00:10:42.280 --> 00:10:44.850
What's the reaction with people?
264
00:10:44.850 --> 00:10:46.770
I always like to say, "You cannot trivialize,
265
00:10:46.770 --> 00:10:48.223
you know, the people aspect."
266
00:10:48.223 --> 00:10:50.360
I mean, the AI can make a recommendation,
267
00:10:50.360 --> 00:10:51.460
but it's actually people
268
00:10:51.460 --> 00:10:53.374
that are going to take the action, right?
269
00:10:53.374 --> 00:10:55.800
That pallet is not going to just move by itself
270
00:10:55.800 --> 00:10:58.170
somewhere now that it's come up with this recommendation.
271
00:10:58.170 --> 00:11:01.040
So with a lot of these projects that our team do,
272
00:11:01.040 --> 00:11:03.550
we try to make sure that we have the right people
273
00:11:03.550 --> 00:11:04.383
at the table.
274
00:11:04.383 --> 00:11:07.680
So it's not just the innovation team and the leadership,
275
00:11:07.680 --> 00:11:09.880
but it's also people on the shop floor
276
00:11:09.880 --> 00:11:13.760
that'll actually be using the AI as end-users.
277
00:11:13.760 --> 00:11:16.041
So we try to get their buy-in very early on,
278
00:11:16.041 --> 00:11:18.147
and then we also give that option to say,
279
00:11:18.147 --> 00:11:20.597
"Actually, you know, the recommendation is incorrect,
280
00:11:20.597 --> 00:11:23.180
or I think this is the better way of doing it."
281
00:11:23.180 --> 00:11:26.150
So we allow that option so that we're not forcing everybody
282
00:11:26.150 --> 00:11:27.770
to follow that recommendation,
283
00:11:27.770 --> 00:11:29.690
but we still give the freedom to, you know,
284
00:11:29.690 --> 00:11:32.782
have people make their own choices as well.
285
00:11:32.782 --> 00:11:35.674
Gina, I want to build on that comment around,
286
00:11:35.674 --> 00:11:38.258
you know, your customers want it to work.
287
00:11:38.258 --> 00:11:41.530
They don't want to necessarily understand all the details
288
00:11:41.530 --> 00:11:43.590
and then tie it to the teams
289
00:11:43.590 --> 00:11:46.626
you have at your innovation hubs.
290
00:11:46.626 --> 00:11:51.626
What are the kind of attributes or sort of personality types
291
00:11:51.770 --> 00:11:54.180
that you're finding your technical folks
292
00:11:54.180 --> 00:11:56.640
must have to be able to thrive
293
00:11:56.640 --> 00:11:58.518
in this kind of an environment?
294
00:11:58.518 --> 00:12:01.194
I always say for our innovation managers,
295
00:12:01.194 --> 00:12:03.818
they're the ones that go into our operations
296
00:12:03.818 --> 00:12:05.661
and work with customers and partners
297
00:12:05.661 --> 00:12:07.710
to bring these projects to life.
298
00:12:07.710 --> 00:12:10.040
There are three kinds of success factors.
299
00:12:10.040 --> 00:12:13.790
One is that they have a deep understanding
300
00:12:13.790 --> 00:12:15.010
of technology.
301
00:12:15.010 --> 00:12:16.850
I'm not a technical person myself,
302
00:12:16.850 --> 00:12:18.560
so I always say that as a disclaimer,
303
00:12:18.560 --> 00:12:20.460
but I really make an effort to keep up
304
00:12:20.460 --> 00:12:21.830
with the pace of technology
305
00:12:21.830 --> 00:12:24.290
and try to understand different concepts
306
00:12:24.290 --> 00:12:26.542
and learn them pretty quickly.
307
00:12:26.542 --> 00:12:29.630
The second part is understanding our operations.
308
00:12:29.630 --> 00:12:32.740
So one of the big kind of no-nos of corporate innovation
309
00:12:32.740 --> 00:12:34.660
is just sitting at an innovation center
310
00:12:34.660 --> 00:12:37.321
and being disconnected from the realities of your business.
311
00:12:37.321 --> 00:12:39.720
So we really make sure that we're out there
312
00:12:39.720 --> 00:12:42.110
at the operations, learning about some of the challenges,
313
00:12:42.110 --> 00:12:44.470
talking to our people on the front line.
314
00:12:44.470 --> 00:12:47.709
And the third one is being close to our customers,
315
00:12:47.709 --> 00:12:50.776
being able to communicate some of these complex ideas
316
00:12:50.776 --> 00:12:54.094
and concepts in a way that our customers can digest them
317
00:12:54.094 --> 00:12:57.694
and translate that into business value drivers
318
00:12:57.694 --> 00:13:01.350
that our customers will embrace and want us to embark on
319
00:13:01.350 --> 00:13:02.590
and work together with them on.
320
00:13:02.590 --> 00:13:04.860
So it's kind of, I would say, a mix of an individual
321
00:13:04.860 --> 00:13:06.850
that has a good technical background
322
00:13:06.850 --> 00:13:09.170
but can really clearly communicate these concepts,
323
00:13:09.170 --> 00:13:12.226
get buy-in, and is also down to earth enough that they can work
324
00:13:12.226 --> 00:13:15.073
in our operations on some of these projects.
325
00:13:16.140 --> 00:13:17.070
So, a quick follow-up:
326
00:13:17.070 --> 00:13:19.060
who initiates these sorts of projects?
327
00:13:19.060 --> 00:13:20.798
Is your innovation team looking for them,
328
00:13:20.798 --> 00:13:22.640
scouring, trying to like, you know,
329
00:13:22.640 --> 00:13:23.850
we've got some cool tools,
330
00:13:23.850 --> 00:13:25.380
where can we use them?
331
00:13:25.380 --> 00:13:27.167
Or do you have people approaching you, saying,
332
00:13:27.167 --> 00:13:28.600
"We've got a problem,"
333
00:13:28.600 --> 00:13:30.840
which directions are these flowing?
334
00:13:30.840 --> 00:13:32.950
I would say it's pretty organic at DHL.
335
00:13:32.950 --> 00:13:35.900
So sometimes it might start with a use case, right?
336
00:13:35.900 --> 00:13:37.977
So it might be a business unit saying,
337
00:13:37.977 --> 00:13:40.347
"Hey, we have this specific challenge,
338
00:13:40.347 --> 00:13:42.750
like what are some solutions to solve this?"
339
00:13:42.750 --> 00:13:46.230
Other times, we work very deeply with a whole host
340
00:13:46.230 --> 00:13:48.827
of different startups and there's a new one that comes by,
341
00:13:48.827 --> 00:13:51.220
and we know that it makes things more efficient
342
00:13:51.220 --> 00:13:53.549
and then we can find a problem to solve there.
343
00:13:53.549 --> 00:13:55.630
So it comes in different directions.
344
00:13:55.630 --> 00:13:56.660
Sometimes it's our team,
345
00:13:56.660 --> 00:13:58.060
sometimes it's the business unit itself,
346
00:13:58.060 --> 00:13:59.810
sometimes it's our customers,
347
00:13:59.810 --> 00:14:02.130
and sometimes it's just a partner that's come up
348
00:14:02.130 --> 00:14:04.290
with a really groundbreaking solution
349
00:14:04.290 --> 00:14:07.698
that we know holds a lot of potential in our business.
350
00:14:07.698 --> 00:14:09.710
As a professor, we call that the "E,
351
00:14:09.710 --> 00:14:10.827
all of the above" answer,
352
00:14:10.827 --> 00:14:13.280
(Gina and Sam laugh)
353
00:14:13.280 --> 00:14:14.520
coming from everywhere.
354
00:14:14.520 --> 00:14:16.574
So what's exciting about this to you?
355
00:14:16.574 --> 00:14:17.700
What's fun?
356
00:14:17.700 --> 00:14:20.183
What makes you dread getting up and going into a project?
357
00:14:20.183 --> 00:14:22.410
What makes you excited about going into a project?
358
00:14:22.410 --> 00:14:24.473
What's fun about it or exciting, if anything?
359
00:14:25.470 --> 00:14:28.010
The exciting part of AI, robotics,
360
00:14:28.010 --> 00:14:29.710
and some of these other topics that I work on
361
00:14:29.710 --> 00:14:32.851
is that it is truly shaping the future of logistics.
362
00:14:32.851 --> 00:14:36.120
So, you know, some of the first robotics projects
363
00:14:36.120 --> 00:14:39.037
we did back in 2016, it was among the first handful
364
00:14:39.037 --> 00:14:42.210
of robots we were putting into our warehouses,
365
00:14:42.210 --> 00:14:44.310
and then four years later, it's, you know,
366
00:14:44.310 --> 00:14:46.765
one of the top topics on the agenda
367
00:14:46.765 --> 00:14:47.860
of our business units.
368
00:14:47.860 --> 00:14:50.610
It's all about, you know, how can we leverage new automation
369
00:14:50.610 --> 00:14:52.040
in our warehouses.
370
00:14:52.040 --> 00:14:53.670
So I think that's always exciting that
371
00:14:53.670 --> 00:14:56.478
some of these early proofs of concept and pilots we do,
372
00:14:56.478 --> 00:14:58.940
they might be the first for the industry.
373
00:14:58.940 --> 00:15:01.190
And then several years later, they become the norm
374
00:15:01.190 --> 00:15:02.770
and it's just the way of doing business,
375
00:15:02.770 --> 00:15:05.350
so that always keeps it really exciting.
376
00:15:05.350 --> 00:15:08.138
And then of course, working with some brilliant individuals,
377
00:15:08.138 --> 00:15:10.864
both within the company and also with our partners,
378
00:15:10.864 --> 00:15:13.990
that always makes life exciting day to day.
379
00:15:13.990 --> 00:15:15.710
I feel like some of that's the curse of AI:
380
00:15:15.710 --> 00:15:17.510
it's all shiny and new,
381
00:15:17.510 --> 00:15:19.310
and then suddenly it's just what everybody's supposed
382
00:15:19.310 --> 00:15:20.143
to be doing
383
00:15:20.143 --> 00:15:22.770
and it's normal, and you always have to be searching out
384
00:15:22.770 --> 00:15:25.496
for that next cool thing.
385
00:15:25.496 --> 00:15:26.329
Yeah.
386
00:15:26.329 --> 00:15:28.290
I think if we went back to the 17th century
387
00:15:28.290 --> 00:15:29.987
and showed someone spell check, they'd think,
388
00:15:29.987 --> 00:15:31.977
"Oh man, my quill will actually check, you know,
389
00:15:31.977 --> 00:15:34.529
underline with red when I misspell a word,
390
00:15:34.529 --> 00:15:36.270
that would be sorcery."
391
00:15:36.270 --> 00:15:39.280
But now, you know, if the paper doesn't practically
392
00:15:39.280 --> 00:15:40.810
write itself, we're bored with it,
393
00:15:40.810 --> 00:15:43.640
and it doesn't really seem like cool technology.
394
00:15:43.640 --> 00:15:45.070
Is there any cool technology coming
395
00:15:45.070 --> 00:15:46.010
that you're fired up about,
396
00:15:46.010 --> 00:15:48.780
or you think that you can apply to do something,
397
00:15:48.780 --> 00:15:51.210
you know, what's short term on the horizon that's exciting?
398
00:15:51.210 --> 00:15:56.210
When it comes to AI, I think the evolution
399
00:15:56.290 --> 00:15:58.740
of some of our analytics services
400
00:15:58.740 --> 00:16:02.236
into more advanced AI will be really exciting.
401
00:16:02.236 --> 00:16:05.387
So to give you one example, we developed some years ago
402
00:16:05.387 --> 00:16:08.852
a supply chain risk management tool called Resilience 360,
403
00:16:08.852 --> 00:16:11.284
which is very timely now because of everything
404
00:16:11.284 --> 00:16:13.300
that's happened this year, right?
405
00:16:13.300 --> 00:16:16.710
So with Resilience 360, it's a tool that alerts you
406
00:16:16.710 --> 00:16:18.530
if there's a risk that's going to disrupt
407
00:16:18.530 --> 00:16:19.689
your supply chain.
408
00:16:19.689 --> 00:16:23.147
And it's a true kind of big data analytics lighthouse tool
409
00:16:23.147 --> 00:16:26.950
at DHL, but we could never really predict the risk
410
00:16:26.950 --> 00:16:29.670
and then kind of quantify what that risk would do
411
00:16:29.670 --> 00:16:31.070
to your supply chain.
412
00:16:31.070 --> 00:16:34.420
And there, we're now working on leveraging AI
413
00:16:34.420 --> 00:16:36.390
and taking that to the next level.
414
00:16:36.390 --> 00:16:39.020
So I think that's a really exciting space.
415
00:16:39.020 --> 00:16:40.070
Thank you for taking the time
416
00:16:40.070 --> 00:16:41.253
to talk with us today, Gina.
417
00:16:41.253 --> 00:16:42.680
This was really great.
418
00:16:42.680 --> 00:16:43.830
Thanks, Sam; thanks, Shervin,
419
00:16:43.830 --> 00:16:45.857
and it was great to talk to you.
420
00:16:45.857 --> 00:16:48.585
(gentle music)
421
00:16:48.585 --> 00:16:50.800
We really enjoyed talking to Gina.
422
00:16:50.800 --> 00:16:52.345
Shervin, let's recap a minute and talk about
423
00:16:52.345 --> 00:16:53.630
what we learned.
424
00:16:53.630 --> 00:16:55.872
One point is about AI being a big change.
425
00:16:55.872 --> 00:16:58.100
It's not just a tech thing.
426
00:16:58.100 --> 00:17:01.400
It's about change management, and she emphasized that.
427
00:17:01.400 --> 00:17:02.570
Yeah, I like that point,
428
00:17:02.570 --> 00:17:05.360
and I'm going to borrow that quote from her
429
00:17:05.360 --> 00:17:07.970
that the first day for AI is the worst day
430
00:17:07.970 --> 00:17:12.970
because it very much speaks to how difficult it can be.
431
00:17:13.010 --> 00:17:15.733
And so yes, change management is critical,
432
00:17:15.733 --> 00:17:17.820
but it's also going to be difficult.
433
00:17:17.820 --> 00:17:21.370
And she talked about the process that she follows
434
00:17:21.370 --> 00:17:24.688
or she's created that brings the users
435
00:17:24.688 --> 00:17:29.090
and the operators into the design phase early on,
436
00:17:29.090 --> 00:17:34.090
so that they're not surprised by what AI ends up creating
437
00:17:34.630 --> 00:17:35.700
down the line,
438
00:17:35.700 --> 00:17:38.860
but they're very integral to the creation of it,
439
00:17:38.860 --> 00:17:41.335
they evolve it, they have the right expectations
440
00:17:41.335 --> 00:17:43.430
that it's not going to be perfect.
441
00:17:43.430 --> 00:17:46.050
We work with this, we deploy, we test,
442
00:17:46.050 --> 00:17:47.450
we see how it goes.
443
00:17:47.450 --> 00:17:50.240
And so I thought that was really, really elegant,
444
00:17:50.240 --> 00:17:53.600
how she talked about it's got to be a learning process,
445
00:17:53.600 --> 00:17:56.820
and it's got to be with the right expectation setting.
446
00:17:56.820 --> 00:17:59.264
But more importantly, the users have to be involved
447
00:17:59.264 --> 00:18:00.642
on an ongoing basis,
448
00:18:00.642 --> 00:18:04.710
not just at the end when the technical folks
449
00:18:04.710 --> 00:18:08.350
have built something and they're forcing it down
450
00:18:08.350 --> 00:18:09.470
to the users.
451
00:18:09.470 --> 00:18:10.930
I think it requires patience.
452
00:18:10.930 --> 00:18:12.630
There's a learning process involved,
453
00:18:12.630 --> 00:18:15.710
and if the organization is expecting things to run well
454
00:18:15.710 --> 00:18:16.950
the first day
455
00:18:16.950 --> 00:18:18.732
then there's going to be a lot of disappointment.
456
00:18:18.732 --> 00:18:20.190
Yeah, and I love that point
457
00:18:20.190 --> 00:18:25.190
because without patience and without that interactive,
458
00:18:25.445 --> 00:18:28.630
ongoing engagement of the users,
459
00:18:28.630 --> 00:18:32.100
the innovation center, and the evolution of AI,
460
00:18:32.100 --> 00:18:33.500
there can't be a transformation.
461
00:18:33.500 --> 00:18:36.716
But if that process is being done cohesively,
462
00:18:36.716 --> 00:18:39.800
then, you know, the sky's the limit.
463
00:18:39.800 --> 00:18:43.520
That's when the innovation center can begin
464
00:18:43.520 --> 00:18:47.220
to actually completely change processes
465
00:18:47.220 --> 00:18:51.012
and create new ones and really change the way people work.
466
00:18:51.012 --> 00:18:53.470
Setting that expectation that it will get better
467
00:18:53.470 --> 00:18:54.610
from that worst day,
468
00:18:54.610 --> 00:18:56.440
maybe that's a good benchmark to start with
469
00:18:56.440 --> 00:18:58.930
that, we hope, it doesn't get worse.
470
00:18:58.930 --> 00:19:00.972
It's going to get better after that first day.
471
00:19:00.972 --> 00:19:03.236
But the other part of that is the idea that AI itself
472
00:19:03.236 --> 00:19:05.060
can change the organization.
473
00:19:05.060 --> 00:19:08.830
AI can be a force for change.
474
00:19:08.830 --> 00:19:10.680
Gina's in the innovation group,
475
00:19:10.680 --> 00:19:15.267
and their charge is not to put AI across the organization.
476
00:19:15.267 --> 00:19:16.794
Their charge is to innovate,
477
00:19:16.794 --> 00:19:19.560
and it sounds like AI has been really instrumental
478
00:19:19.560 --> 00:19:22.480
in changing the culture around innovation at DHL.
479
00:19:22.480 --> 00:19:25.210
Yeah, no, I totally picked up on that as well
480
00:19:25.210 --> 00:19:26.240
on several dimensions.
481
00:19:26.240 --> 00:19:31.050
One is about the users of AI:
482
00:19:31.050 --> 00:19:32.640
you know, if you asked them 10 years ago
483
00:19:32.640 --> 00:19:35.176
how it was being done, well, it was being done manually,
484
00:19:35.176 --> 00:19:37.772
and how were people going down those routes
485
00:19:37.772 --> 00:19:41.210
well, they were all deciding based on their own judgment.
486
00:19:41.210 --> 00:19:43.947
Of course, today, they have the AI telling them something,
487
00:19:43.947 --> 00:19:45.680
it's not being forced on them.
488
00:19:45.680 --> 00:19:48.390
But they have the benefit of that signal
489
00:19:48.390 --> 00:19:50.330
and that recommendation,
490
00:19:50.330 --> 00:19:52.901
and if they think they could do better,
491
00:19:52.901 --> 00:19:55.290
they will know whether they did better or not.
492
00:19:55.290 --> 00:19:57.800
And if they did better, then the AI will know
493
00:19:57.800 --> 00:19:59.630
that it could do better,
494
00:19:59.630 --> 00:20:02.520
and that process itself is introducing this change
495
00:20:02.520 --> 00:20:03.770
you're talking about, Sam.
496
00:20:03.770 --> 00:20:06.549
So I thought that's really an interesting way
497
00:20:06.549 --> 00:20:08.140
that they've set that up,
498
00:20:08.140 --> 00:20:11.506
and they are scaling it across different use cases.
499
00:20:11.506 --> 00:20:13.371
She used the word recommendation a lot,
500
00:20:13.371 --> 00:20:14.410
and that was nice.
501
00:20:14.410 --> 00:20:17.370
She didn't talk about the solution that the system offers.
502
00:20:17.370 --> 00:20:18.660
She talked about recommendation,
503
00:20:18.660 --> 00:20:23.480
which is a very collaborative, working together approach.
504
00:20:23.480 --> 00:20:25.037
I thought it was a great conversation.
505
00:20:25.037 --> 00:20:26.412
(gentle music)
506
00:20:26.412 --> 00:20:28.330
Looking forward to our next episode,
507
00:20:28.330 --> 00:20:30.280
with Mattias Ulbrich from Porsche.
508
00:20:30.280 --> 00:20:31.880
Please take the time to join us.
509
00:20:36.233 --> 00:20:39.510
Thanks for listening to Me, Myself, and AI.
510
00:20:39.510 --> 00:20:41.110
If you're enjoying the show,
511
00:20:41.110 --> 00:20:43.240
take a minute to write us a review.
512
00:20:43.240 --> 00:20:44.840
If you send us a screenshot,
513
00:20:44.840 --> 00:20:47.743
we'll send you a collection of MIT SMR's best articles
514
00:20:47.743 --> 00:20:51.670
on artificial intelligence free for a limited time.
515
00:20:51.670 --> 00:20:56.540
Send your review screenshot to smrfeedback@mit.edu.
516
00:20:56.540 --> 00:20:59.123
(gentle music)