WEBVTT
1
00:00:02.940 --> 00:00:05.340
Things like brake fluid and chemical manufacturing
2
00:00:05.340 --> 00:00:08.200
may not seem like gee-whiz artificial intelligence,
3
00:00:08.200 --> 00:00:10.680
but we all may be benefiting from AI already
4
00:00:10.680 --> 00:00:11.733
and just not know it.
5
00:00:12.610 --> 00:00:14.450
Today, we're talking with Chris Couch,
6
00:00:14.450 --> 00:00:16.750
senior vice president and chief technology officer
7
00:00:16.750 --> 00:00:17.930
of Cooper Standard,
8
00:00:17.930 --> 00:00:20.250
about how we're all benefiting indirectly
9
00:00:20.250 --> 00:00:22.200
from artificial intelligence every day.
10
00:00:23.310 --> 00:00:25.320
Welcome to Me, Myself, and AI,
11
00:00:25.320 --> 00:00:28.480
a podcast on artificial intelligence in business.
12
00:00:28.480 --> 00:00:30.490
Each episode, we introduce you to someone
13
00:00:30.490 --> 00:00:32.360
innovating with AI.
14
00:00:32.360 --> 00:00:33.680
I'm Sam Ransbotham,
15
00:00:33.680 --> 00:00:36.587
professor of information systems at Boston College.
16
00:00:36.587 --> 00:00:37.979
I'm also the guest editor
17
00:00:37.979 --> 00:00:41.210
for the AI and Business Strategy Big Ideas program
18
00:00:41.210 --> 00:00:43.416
at MIT Sloan Management Review.
19
00:00:43.416 --> 00:00:46.150
And I'm Shervin Khodabandeh,
20
00:00:46.150 --> 00:00:47.860
senior partner with BCG,
21
00:00:47.860 --> 00:00:51.910
and I colead BCG's AI practice in North America.
22
00:00:51.910 --> 00:00:54.590
Together, MIT SMR and BCG
23
00:00:54.590 --> 00:00:57.200
have been researching AI for five years,
24
00:00:57.200 --> 00:00:59.360
interviewing hundreds of practitioners
25
00:00:59.360 --> 00:01:01.400
and surveying thousands of companies
26
00:01:01.400 --> 00:01:02.738
on what it takes to build
27
00:01:02.738 --> 00:01:06.010
and to deploy and scale AI capabilities
28
00:01:06.010 --> 00:01:07.380
across the organization.
29
00:01:07.380 --> 00:01:09.930
And really transform the way organizations operate.
30
00:01:14.600 --> 00:01:16.290
Today we're talking with Chris Couch.
31
00:01:16.290 --> 00:01:18.920
Chris is the SVP and the chief technology officer
32
00:01:18.920 --> 00:01:20.210
for Cooper Standard.
33
00:01:20.210 --> 00:01:21.880
Chris, thanks for taking the time to talk with us.
34
00:01:21.880 --> 00:01:22.713
Welcome.
35
00:01:22.713 --> 00:01:23.990
You bet. Thank you very much.
36
00:01:23.990 --> 00:01:24.890
Why don't we start by
37
00:01:24.890 --> 00:01:27.330
learning a little bit about your role at Cooper Standard.
38
00:01:27.330 --> 00:01:28.830
What do you do now?
39
00:01:28.830 --> 00:01:30.770
I am the CTO of Cooper Standard.
40
00:01:30.770 --> 00:01:33.400
We're a tier-one global automotive supplier.
41
00:01:33.400 --> 00:01:36.250
I'm also the founder and CEO of an AI startup
42
00:01:36.250 --> 00:01:38.300
called Liveline Technologies
43
00:01:38.300 --> 00:01:40.810
that has come out of some work that we did
44
00:01:40.810 --> 00:01:42.920
as R&D within Cooper Standard.
45
00:01:42.920 --> 00:01:46.560
We provide components in the spaces of
46
00:01:46.560 --> 00:01:48.280
vehicle sealing and enclosures,
47
00:01:48.280 --> 00:01:50.350
as well as fluid handling,
48
00:01:50.350 --> 00:01:52.540
whether it's brake fluid or coolant,
49
00:01:52.540 --> 00:01:54.840
all the fluid systems in the vehicle.
50
00:01:54.840 --> 00:01:58.200
We also invest in material science technologies
51
00:01:58.200 --> 00:01:59.680
that we believe can have an impact
52
00:01:59.680 --> 00:02:01.380
beyond automotive.
53
00:02:01.380 --> 00:02:03.460
Many of our products may not be visible
54
00:02:03.460 --> 00:02:04.550
to the average consumer.
55
00:02:04.550 --> 00:02:06.520
In fact, some of our products,
56
00:02:06.520 --> 00:02:08.700
hopefully nobody has to worry about them
57
00:02:08.700 --> 00:02:10.771
when we're moving fuel around your vehicle,
58
00:02:10.771 --> 00:02:14.330
but they're critically important to the driving experience
59
00:02:14.330 --> 00:02:17.040
and having a safe and reliable vehicle.
60
00:02:17.040 --> 00:02:18.160
For example,
61
00:02:18.160 --> 00:02:20.870
we developed a brand-new
62
00:02:20.870 --> 00:02:22.800
category of polymer
63
00:02:22.800 --> 00:02:25.063
that we have called Fortrex.
64
00:02:26.500 --> 00:02:27.712
Fortrex provides a much
65
00:02:27.712 --> 00:02:29.230
better seal
66
00:02:29.230 --> 00:02:31.400
around the doors and windows in your vehicle.
67
00:02:31.400 --> 00:02:32.460
Why is that important?
68
00:02:32.460 --> 00:02:33.920
It's important especially
69
00:02:33.920 --> 00:02:37.500
as we move into an electrified-vehicle world.
70
00:02:37.500 --> 00:02:39.711
As engine and transmission noises
71
00:02:39.711 --> 00:02:43.360
decrease, because there's no more gasoline engine,
72
00:02:43.360 --> 00:02:45.750
other sources of noise become more prevalent,
73
00:02:45.750 --> 00:02:46.760
and the largest one of those
74
00:02:46.760 --> 00:02:47.898
is the noise coming in
75
00:02:47.898 --> 00:02:50.971
due to wind around your doors and windows.
76
00:02:50.971 --> 00:02:54.260
By providing an enhanced sealing package for those,
77
00:02:54.260 --> 00:02:56.840
we believe we've got the right products
78
00:02:56.840 --> 00:02:59.650
to service an electrifying world.
79
00:02:59.650 --> 00:03:02.110
How is artificial intelligence involved in that
80
00:03:02.110 --> 00:03:03.850
development of the polymer?
81
00:03:03.850 --> 00:03:05.550
We spent a lot of time and money
82
00:03:05.550 --> 00:03:07.710
coming up with advanced polymer formulations.
83
00:03:07.710 --> 00:03:10.490
A lot of it historically has been trial and error.
84
00:03:10.490 --> 00:03:13.433
That's what industrial chemists often do.
85
00:03:14.328 --> 00:03:18.400
We used AI to develop a system
86
00:03:18.400 --> 00:03:20.870
that advises our chemists on
87
00:03:20.870 --> 00:03:23.680
the next set of recipes to try
88
00:03:23.680 --> 00:03:27.000
as they iterate toward a final solution.
89
00:03:27.000 --> 00:03:29.058
We've found dramatic reductions,
90
00:03:29.058 --> 00:03:31.746
in many cases, with that approach.
91
00:03:31.746 --> 00:03:33.250
Dramatic means reducing
92
00:03:33.250 --> 00:03:36.440
those R&D loops 70% or 80%.
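NOTE
A minimal sketch, in Python, of the kind of iterative recipe adviser described above, assuming a Gaussian-process surrogate with an expected-improvement rule; the episode does not specify the actual algorithm, and all data here is hypothetical.
# Sketch: suggest the next polymer recipe to test by fitting a surrogate
# model to past trials and ranking untried candidates by expected improvement.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
def suggest_next_recipe(X_tried, y_scores, candidates):
    """X_tried: past recipes (rows of ingredient fractions);
    y_scores: measured quality score per tried recipe (higher is better);
    candidates: untried recipes to rank. Returns the most promising one."""
    gp = GaussianProcessRegressor(normalize_y=True).fit(X_tried, y_scores)
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)  # guard against zero predictive variance
    z = (mu - y_scores.max()) / sigma
    ei = (mu - y_scores.max()) * norm.cdf(z) + sigma * norm.pdf(z)
    return candidates[int(np.argmax(ei))]
# Usage: each loop, chemists test the suggested recipe and append the result.
rng = np.random.default_rng(0)
X = rng.random((8, 3))             # 8 past trials, 3 ingredient fractions
y = -((X - 0.4) ** 2).sum(axis=1)  # stand-in "quality" measurement
print(suggest_next_recipe(X, y, rng.random((200, 3))))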
93
00:03:36.440 --> 00:03:37.273
Got it,
94
00:03:37.273 --> 00:03:39.290
but before we talk more about Cooper Standard's
95
00:03:39.290 --> 00:03:40.490
success with AI,
96
00:03:40.490 --> 00:03:41.450
can you tell us a bit more about
97
00:03:41.450 --> 00:03:43.690
your own background and career path?
98
00:03:43.690 --> 00:03:44.900
Well, I think
99
00:03:44.900 --> 00:03:46.250
the best way to describe myself
100
00:03:46.250 --> 00:03:48.430
is a lifelong manufacturing addict,
101
00:03:48.430 --> 00:03:49.263
first of all.
102
00:03:49.263 --> 00:03:52.160
As a kid, I took apart everything in the house,
103
00:03:52.160 --> 00:03:54.400
probably got hit with wall current more than once.
104
00:03:54.400 --> 00:03:56.450
Explains a lot about me today, I suppose.
105
00:03:57.370 --> 00:04:00.250
I was the kid that built their first car out of a kit.
106
00:04:00.250 --> 00:04:02.490
I was a hard-core mechanical engineer
107
00:04:02.490 --> 00:04:05.990
with a focus on manufacturing and controls in school.
108
00:04:05.990 --> 00:04:07.360
My side projects include things
109
00:04:07.360 --> 00:04:10.110
like building autonomous drones that
110
00:04:10.110 --> 00:04:11.400
fly at high altitude.
111
00:04:11.400 --> 00:04:14.470
I'm just a manufacturing nerd.
112
00:04:14.470 --> 00:04:17.670
That has really served me well in my career.
113
00:04:17.670 --> 00:04:21.110
I spent the first third of my working life
114
00:04:21.110 --> 00:04:22.390
in a Japanese company.
115
00:04:22.390 --> 00:04:24.100
I worked for Toyota
116
00:04:24.100 --> 00:04:27.160
and went and joined them in Japan.
117
00:04:27.160 --> 00:04:29.200
Spent a dozen years with them
118
00:04:29.200 --> 00:04:31.150
designing and building and
119
00:04:31.150 --> 00:04:34.140
ultimately being involved in plant operations.
120
00:04:34.140 --> 00:04:36.340
I spent the next third of my career
121
00:04:36.340 --> 00:04:38.040
running a P&L
122
00:04:38.040 --> 00:04:39.690
for an automotive supplier
123
00:04:39.690 --> 00:04:41.290
on the business side,
124
00:04:41.290 --> 00:04:42.850
mostly based out of Asia,
125
00:04:42.850 --> 00:04:45.113
so I have a business bent as well,
126
00:04:45.113 --> 00:04:47.610
which may color a lot of what I say today.
127
00:04:47.610 --> 00:04:50.420
Then, the last third or so of my career has been
128
00:04:50.420 --> 00:04:51.510
in CTO gigs.
129
00:04:51.510 --> 00:04:53.200
I'm in my second one here.
130
00:04:53.200 --> 00:04:55.413
I'm again at an automotive supplier,
131
00:04:55.413 --> 00:04:58.980
but we get our fingers into all kinds of interesting
132
00:04:58.980 --> 00:05:01.570
tech domains, just given what's happening
133
00:05:01.570 --> 00:05:02.700
in the world today,
134
00:05:02.700 --> 00:05:05.580
whether it's material science or AI.
135
00:05:05.580 --> 00:05:07.169
So here I am.
136
00:05:07.169 --> 00:05:09.750
Didn't really expect to be doing
137
00:05:09.750 --> 00:05:10.830
my second job here,
138
00:05:10.830 --> 00:05:12.240
if you would have asked me two years ago,
139
00:05:12.240 --> 00:05:14.020
but it's certainly been
140
00:05:14.020 --> 00:05:14.853
a lot of fun,
141
00:05:14.853 --> 00:05:16.810
and we're excited about delivering some impact
142
00:05:16.810 --> 00:05:18.317
with these technologies.
143
00:05:18.317 --> 00:05:21.330
Chris, tell us about the open innovation
144
00:05:21.330 --> 00:05:22.280
at Cooper Standard.
145
00:05:22.280 --> 00:05:24.100
What is that all about?
146
00:05:24.100 --> 00:05:24.933
You know,
147
00:05:24.933 --> 00:05:26.530
as we looked around at our tech portfolio
148
00:05:26.530 --> 00:05:29.760
a couple years ago when I joined the company,
149
00:05:29.760 --> 00:05:31.430
I was, first of all, overwhelmed
150
00:05:31.430 --> 00:05:33.620
by the different domains that we really
151
00:05:33.620 --> 00:05:35.390
had to compete in.
152
00:05:35.390 --> 00:05:37.470
I mentioned materials science earlier,
153
00:05:37.470 --> 00:05:38.890
but there's different aspects
154
00:05:38.890 --> 00:05:41.872
of manufacturing technology and product design.
155
00:05:41.872 --> 00:05:44.160
The whole topic of analytics and AI
156
00:05:44.160 --> 00:05:46.270
right, that we're going to talk about today.
157
00:05:46.270 --> 00:05:48.460
And I was very convinced
158
00:05:48.460 --> 00:05:50.560
that there was no way that we could do it all ourselves.
159
00:05:50.560 --> 00:05:52.840
Cooper Standard isn't a small company.
160
00:05:52.840 --> 00:05:55.690
We're just shy of three billion in revenue,
161
00:05:55.690 --> 00:05:57.903
but we're not the biggest.
162
00:05:57.903 --> 00:05:59.410
And so open innovation
163
00:05:59.410 --> 00:06:01.030
was really an attempt to reach out
164
00:06:01.030 --> 00:06:02.440
and build a pipeline
165
00:06:02.440 --> 00:06:04.639
to draw ideas and technology,
166
00:06:04.639 --> 00:06:06.389
and maybe even talent,
167
00:06:06.389 --> 00:06:08.060
from the outside world.
168
00:06:08.060 --> 00:06:11.360
And so through that, we engage with
169
00:06:11.360 --> 00:06:13.790
universities, with consortia
170
00:06:13.790 --> 00:06:14.890
around the world.
171
00:06:14.890 --> 00:06:17.770
We engage heavily with startup companies
172
00:06:17.770 --> 00:06:21.090
and use that as a source of ideas.
173
00:06:21.090 --> 00:06:23.170
In fact, our first
174
00:06:23.170 --> 00:06:26.700
proper AI project, if you will, really came through that
175
00:06:26.700 --> 00:06:29.360
open-innovation pipeline.
176
00:06:29.360 --> 00:06:31.810
We partnered up with a brand-new startup
177
00:06:31.810 --> 00:06:33.620
that was called Uncountable,
178
00:06:33.620 --> 00:06:35.240
out of the Bay Area,
179
00:06:35.240 --> 00:06:37.720
and they helped us develop a system
180
00:06:37.720 --> 00:06:41.280
that would serve effectively as an adviser
181
00:06:41.280 --> 00:06:43.630
for our chemists that make new formulations
182
00:06:43.630 --> 00:06:45.220
for materials that we use all the time.
183
00:06:45.220 --> 00:06:49.300
That wound up being a great accelerator for our R&D process,
184
00:06:49.300 --> 00:06:51.090
cutting iterations out of those
185
00:06:51.090 --> 00:06:53.743
design-and-test loops, if you will.
186
00:06:53.743 --> 00:06:56.200
That was one of those big "aha" moments, right?
187
00:06:56.200 --> 00:06:58.820
That there is a huge
188
00:06:58.820 --> 00:07:00.490
potential to
189
00:07:00.490 --> 00:07:02.470
accelerate ourselves in many domains.
190
00:07:02.470 --> 00:07:04.144
We can't do it all ourselves,
191
00:07:04.144 --> 00:07:08.236
and so how do we really build that external pipeline?
192
00:07:08.236 --> 00:07:11.410
We now call it CS Open Innovation,
193
00:07:11.410 --> 00:07:12.988
but that was the impetus.
194
00:07:12.988 --> 00:07:16.380
Sounds like a very, sort of
195
00:07:16.380 --> 00:07:18.003
unique way of bringing
196
00:07:18.003 --> 00:07:20.240
folks with different backgrounds
197
00:07:20.240 --> 00:07:21.520
and different talents
198
00:07:21.520 --> 00:07:23.650
and getting them all to work together.
199
00:07:23.650 --> 00:07:25.020
What did you find
200
00:07:25.020 --> 00:07:27.770
was the secret sauce of making that happen?
201
00:07:27.770 --> 00:07:28.603
I think
202
00:07:28.603 --> 00:07:30.640
whether it's AI, whether it's materials science,
203
00:07:30.640 --> 00:07:31.870
whether it's other domains,
204
00:07:31.870 --> 00:07:33.130
my answer is the same:
205
00:07:33.130 --> 00:07:36.398
It really is all about the ability to focus.
206
00:07:36.398 --> 00:07:39.980
The reason that we, like many other companies,
207
00:07:39.980 --> 00:07:41.380
have put in place
208
00:07:41.380 --> 00:07:43.620
innovation pipelines and processes
209
00:07:43.620 --> 00:07:45.687
and stage-gate processes
210
00:07:45.687 --> 00:07:49.030
that govern innovation is because of the focus.
211
00:07:49.030 --> 00:07:50.950
How do we quickly narrow down
212
00:07:50.950 --> 00:07:53.360
where we're going to allocate our precious
213
00:07:53.360 --> 00:07:54.320
R&D dollars,
214
00:07:54.320 --> 00:07:56.370
and how do we govern
215
00:07:56.370 --> 00:07:57.340
those correctly?
216
00:07:57.340 --> 00:07:59.050
So we think like a startup.
217
00:07:59.050 --> 00:08:01.140
We're doing the minimal
218
00:08:01.140 --> 00:08:03.210
investment to sort of answer the next
219
00:08:03.210 --> 00:08:05.100
most important question
220
00:08:05.100 --> 00:08:07.240
and either wind up killing things quickly
221
00:08:07.240 --> 00:08:09.620
or take them to fruition.
222
00:08:09.620 --> 00:08:12.180
And a fair amount of fail fast, and test
223
00:08:12.180 --> 00:08:13.013
and learn,
224
00:08:13.013 --> 00:08:16.330
and sort of go big behind things that are working
225
00:08:16.330 --> 00:08:18.430
and shut down things that aren't, right?
226
00:08:18.430 --> 00:08:19.609
Did I hear that correctly?
227
00:08:19.609 --> 00:08:20.442
Exactly,
228
00:08:20.442 --> 00:08:21.324
and that's not unique.
229
00:08:21.324 --> 00:08:22.950
I think that
230
00:08:22.950 --> 00:08:25.830
there's nothing special about AI-based projects, right?
231
00:08:25.830 --> 00:08:27.960
We sort of think in the same way
232
00:08:27.960 --> 00:08:30.540
and very quickly try to
233
00:08:30.540 --> 00:08:32.220
motivate those with
234
00:08:32.220 --> 00:08:34.123
a clear-eyed view of ROI.
235
00:08:35.420 --> 00:08:37.570
Frankly, one of the things
236
00:08:37.570 --> 00:08:39.560
that I think we've seen over the years
237
00:08:39.560 --> 00:08:42.790
when it comes to analytics and AI,
238
00:08:42.790 --> 00:08:45.940
especially coupled with manufacturing and Industry 4.0,
239
00:08:45.940 --> 00:08:50.154
ROI has sometimes been hard to come by.
240
00:08:50.154 --> 00:08:52.060
A lot of creative ideas,
241
00:08:52.060 --> 00:08:54.587
a lot of interesting things to do with data,
242
00:08:54.587 --> 00:08:55.720
but the question is,
243
00:08:55.720 --> 00:08:57.500
how does it translate to the bottom line?
244
00:08:57.500 --> 00:08:58.610
And if that
245
00:08:58.610 --> 00:09:01.830
story can't be told, even as a hypothesis
246
00:09:01.830 --> 00:09:03.830
that we're going to prove through the
247
00:09:03.830 --> 00:09:05.750
innovation project, then it's
248
00:09:05.750 --> 00:09:07.883
hard to justify working on it.
249
00:09:08.720 --> 00:09:09.990
It seems like the opposite,
250
00:09:09.990 --> 00:09:11.050
though, is that,
251
00:09:11.050 --> 00:09:12.910
just to push back a little bit,
252
00:09:12.910 --> 00:09:15.080
if you get too focused on ROI,
253
00:09:15.080 --> 00:09:16.210
where are you going to get something
254
00:09:16.210 --> 00:09:17.930
weird and big and unusual?
255
00:09:17.930 --> 00:09:19.030
Absolutely.
256
00:09:19.030 --> 00:09:21.160
How are you balancing that
257
00:09:21.160 --> 00:09:24.650
tension between focusing on ROI and also
258
00:09:24.650 --> 00:09:26.150
trying not to miss out on,
259
00:09:26.150 --> 00:09:28.230
or trying not to be too incremental?
260
00:09:28.230 --> 00:09:30.540
I think the stage-gate mentality is useful here.
261
00:09:30.540 --> 00:09:31.740
I think in the early state,
262
00:09:31.740 --> 00:09:33.370
we look at a lot of crazy stuff.
263
00:09:33.370 --> 00:09:36.430
We have crazy ideas that come in through open innovation.
264
00:09:36.430 --> 00:09:38.080
We have crazy ideas from our own
265
00:09:38.080 --> 00:09:40.220
teams, and that's fantastic.
266
00:09:40.220 --> 00:09:42.070
We don't hesitate to look at them
267
00:09:42.070 --> 00:09:44.300
and maybe even
268
00:09:44.300 --> 00:09:46.740
spend a little pocket money to chase them down
269
00:09:46.740 --> 00:09:47.573
to some degree.
270
00:09:47.573 --> 00:09:49.140
The question then is,
271
00:09:49.140 --> 00:09:52.260
what are we going to invest in to try to productize?
272
00:09:52.260 --> 00:09:54.660
That's really the next gate, if you will.
273
00:09:54.660 --> 00:09:55.560
So absolutely,
274
00:09:55.560 --> 00:09:57.520
the exploration is important.
275
00:09:57.520 --> 00:09:59.460
We certainly do some of that.
276
00:09:59.460 --> 00:10:00.780
I hesitate to say it almost,
277
00:10:00.780 --> 00:10:03.090
but it's having some space to play
278
00:10:03.090 --> 00:10:05.260
with ideas and technologies,
279
00:10:05.260 --> 00:10:07.430
but then, when it's time to go productize,
280
00:10:07.430 --> 00:10:09.620
right, you have to be clear-eyed on what
281
00:10:09.620 --> 00:10:11.360
you're going to get out of it.
282
00:10:11.360 --> 00:10:13.230
That seems like something that might differ for
283
00:10:13.230 --> 00:10:14.063
an AI approach.
284
00:10:14.063 --> 00:10:15.200
I mean, you said, "Well,
285
00:10:15.200 --> 00:10:17.280
AI's no different," just a second ago,
286
00:10:17.280 --> 00:10:21.250
but it seems like, I guess I wonder if there is something
287
00:10:21.250 --> 00:10:22.230
different about
288
00:10:22.230 --> 00:10:24.970
these new technologies that may require
289
00:10:24.970 --> 00:10:26.660
a little more freedom up front
290
00:10:26.660 --> 00:10:28.260
to do something weird
291
00:10:28.260 --> 00:10:30.477
than perhaps some others.
292
00:10:30.477 --> 00:10:31.970
I think that's fair.
293
00:10:31.970 --> 00:10:32.803
In our experience,
294
00:10:32.803 --> 00:10:35.210
I think one of the differences with AI is that
295
00:10:35.210 --> 00:10:36.900
you probably have less
296
00:10:36.900 --> 00:10:41.290
familiarity with the tools and the applications among
297
00:10:41.290 --> 00:10:42.730
the general technical population.
298
00:10:42.730 --> 00:10:44.650
Right, so if you're talking to design engineers,
299
00:10:44.650 --> 00:10:47.300
or talking to manufacturing process engineers,
300
00:10:47.300 --> 00:10:48.750
they may have read some things
301
00:10:48.750 --> 00:10:51.380
and maybe seen an interesting demo somewhere
302
00:10:51.380 --> 00:10:52.810
but may not be so versed
303
00:10:52.810 --> 00:10:55.560
in the nuts and bolts of how it works,
304
00:10:55.560 --> 00:10:56.940
much less the nuts and bolts
305
00:10:56.940 --> 00:10:58.640
of what would it take to scale that
306
00:10:58.640 --> 00:11:00.280
at an enterprise level.
307
00:11:00.280 --> 00:11:02.640
Because getting models running in a Jupyter Notebook
308
00:11:02.640 --> 00:11:03.994
off of a
309
00:11:03.994 --> 00:11:05.450
CSV file on your hard drive
310
00:11:05.450 --> 00:11:07.230
is a whole different story from
311
00:11:07.230 --> 00:11:08.676
production on a global scale.
312
00:11:08.676 --> 00:11:09.509
Exactly.
313
00:11:09.509 --> 00:11:11.510
And so I think just that lack
314
00:11:11.510 --> 00:11:14.690
of exposure to the technologies makes it a bit different.
315
00:11:14.690 --> 00:11:16.160
If we're talking about
316
00:11:16.160 --> 00:11:18.120
traditional robotics, or
317
00:11:18.120 --> 00:11:21.250
maybe simpler types of IoT concepts,
318
00:11:21.250 --> 00:11:22.930
plenty of engineers have a good clue
319
00:11:22.930 --> 00:11:25.110
and maybe have used some things in their career,
320
00:11:25.110 --> 00:11:27.730
but much less so when it comes to AI.
321
00:11:27.730 --> 00:11:29.260
That is a difference, I would agree.
322
00:11:29.260 --> 00:11:31.010
The good news is,
323
00:11:31.010 --> 00:11:33.630
I am very convinced that
324
00:11:33.630 --> 00:11:35.860
one of the wonderful things about AI
325
00:11:35.860 --> 00:11:37.210
is that it is
326
00:11:37.210 --> 00:11:39.040
cheap to pilot.
327
00:11:39.040 --> 00:11:40.750
I was just sort of
328
00:11:40.750 --> 00:11:44.530
making up a silly example about Jupyter Notebooks and
329
00:11:44.530 --> 00:11:45.440
CSV files,
330
00:11:45.440 --> 00:11:47.810
but that's a great way to explore some concepts,
331
00:11:47.810 --> 00:11:49.460
and the cost of that
332
00:11:49.460 --> 00:11:51.480
is very close to zero,
333
00:11:51.480 --> 00:11:53.844
other than acquiring the knowledge
334
00:11:53.844 --> 00:11:54.677
to do it.
335
00:11:54.677 --> 00:11:58.200
And even then, I think that we've proven
336
00:11:58.200 --> 00:11:59.930
over and over in our internal teams
337
00:11:59.930 --> 00:12:02.750
that even the knowledge acquisition
338
00:12:02.750 --> 00:12:05.710
is reasonably priced, if you will.
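NOTE
In the spirit of the near-zero-cost pilot described above, a notebook-style sketch in Python; the CSV file name and column names are hypothetical placeholders, not anything from Cooper Standard.
# Sketch: a quick model off a local CSV, the "Jupyter Notebook" pilot above.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
df = pd.read_csv("line_data.csv")  # hypothetical local process extract
X = df[["zone_temp", "screw_rpm", "line_speed"]]  # hypothetical sensor columns
y = df["wall_thickness"]           # hypothetical quality measurement
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("holdout R^2:", r2_score(y_te, model.predict(X_te)))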
339
00:12:05.710 --> 00:12:08.070
Chris, I want to build on that point you said,
340
00:12:08.070 --> 00:12:09.990
that AI is
341
00:12:09.990 --> 00:12:12.150
relatively inexpensive to pilot,
342
00:12:12.150 --> 00:12:14.170
and I agree with that because
343
00:12:14.170 --> 00:12:16.220
we see, of course,
344
00:12:16.220 --> 00:12:17.800
a proliferation of
345
00:12:17.800 --> 00:12:19.800
proofs of concept and
346
00:12:19.800 --> 00:12:21.500
different teams trying different approaches,
347
00:12:21.500 --> 00:12:22.870
different ideas.
348
00:12:22.870 --> 00:12:25.260
It also seems to be a fact
349
00:12:25.260 --> 00:12:28.020
that AI is quite hard to scale.
350
00:12:28.020 --> 00:12:28.853
Right.
351
00:12:28.853 --> 00:12:31.380
And so why not sort of get your
352
00:12:31.380 --> 00:12:33.210
reactions to this:
353
00:12:33.210 --> 00:12:36.130
Something's easy to pilot,
354
00:12:36.130 --> 00:12:38.120
quite hard to scale,
355
00:12:38.120 --> 00:12:40.790
and the real meaningful ROI will come
356
00:12:40.790 --> 00:12:42.470
after you scale it.
357
00:12:42.470 --> 00:12:44.910
So how do you make that transition?
358
00:12:44.910 --> 00:12:47.560
And how do you sort of make
359
00:12:47.560 --> 00:12:49.500
things that are really easy to pilot
360
00:12:49.500 --> 00:12:51.220
and get excitement around
361
00:12:51.220 --> 00:12:54.290
but then harder down the line to actually
362
00:12:54.290 --> 00:12:57.640
embed into business processes and ways of working?
363
00:12:57.640 --> 00:12:59.933
How do you envision that transition working?
364
00:13:00.770 --> 00:13:02.630
Right. Yeah, it's a great question,
365
00:13:02.630 --> 00:13:04.530
and it's definitely not
366
00:13:04.530 --> 00:13:07.080
easy and maybe not for the faint of heart, right?
367
00:13:07.080 --> 00:13:08.850
Because sometimes it does
368
00:13:08.850 --> 00:13:10.740
take a leap of faith
369
00:13:10.740 --> 00:13:13.290
in the ability to scale, ultimately.
370
00:13:13.290 --> 00:13:16.290
The best I can say from our experience with Liveline is this:
371
00:13:16.290 --> 00:13:19.030
We did some very early prototyping;
372
00:13:19.030 --> 00:13:20.660
we thought we understood
373
00:13:20.660 --> 00:13:22.300
the data science aspect,
374
00:13:22.300 --> 00:13:24.030
but that was only the beginning.
375
00:13:24.030 --> 00:13:26.113
That was nearly two years ago.
376
00:13:27.480 --> 00:13:30.260
Only in the past months have we begun
377
00:13:30.260 --> 00:13:32.510
to go to a global scale-out.
378
00:13:32.510 --> 00:13:34.710
The only insight I have there is,
379
00:13:34.710 --> 00:13:37.040
as you prototype, as you pilot,
380
00:13:37.040 --> 00:13:39.620
you've just got to try to be as judicious
381
00:13:39.620 --> 00:13:41.080
as you can
382
00:13:41.080 --> 00:13:45.050
about selecting use cases that are realistic
383
00:13:45.050 --> 00:13:47.140
and that everybody can get their
384
00:13:47.140 --> 00:13:49.890
heads around and connect the dots to the ROI
385
00:13:49.890 --> 00:13:51.730
at the end of the day.
386
00:13:51.730 --> 00:13:53.560
How are you getting people,
387
00:13:53.560 --> 00:13:55.520
you know, once you've got these solutions in place,
388
00:13:55.520 --> 00:13:57.620
what about the adoption within the organization?
389
00:13:57.620 --> 00:14:00.520
How are you getting people to work on teams that
390
00:14:00.520 --> 00:14:02.050
used to have human partners
391
00:14:02.050 --> 00:14:04.260
and now have machine partners?
392
00:14:04.260 --> 00:14:07.140
With Liveline, the basic concept
393
00:14:07.140 --> 00:14:10.110
is to automate the creation of automation
394
00:14:10.110 --> 00:14:12.710
for complex manufacturing environments.
395
00:14:12.710 --> 00:14:14.990
We're using machine learning techniques to design
396
00:14:14.990 --> 00:14:16.580
the control policies
397
00:14:16.580 --> 00:14:18.170
that we deploy
398
00:14:18.170 --> 00:14:20.750
onto the lines to control machine parameters
399
00:14:20.750 --> 00:14:21.753
in real time.
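NOTE
A hedged sketch, in Python, of the real-time loop just described: a learned policy reads line sensors and writes machine setpoints each cycle. Liveline's actual architecture is not detailed in the episode; the I/O functions and numbers below are hypothetical stubs.
# Sketch: a learned control policy in a real-time loop (illustrative only).
import time
def read_sensors():
    # Hypothetical stub: return current measurements from the line.
    return {"puller_speed": 14.2, "profile_width": 21.9}
def write_setpoints(setpoints):
    # Hypothetical stub: push adjusted parameters back to the machine controllers.
    print("setpoints ->", setpoints)
def policy(obs):
    # Stand-in for a trained model: nudge puller speed toward a target width.
    error = 22.0 - obs["profile_width"]  # positive means profile too narrow
    return {"puller_speed": obs["puller_speed"] - 0.5 * error}
for _ in range(3):      # a real deployment loops continuously
    write_setpoints(policy(read_sensors()))
    time.sleep(1.0)     # illustrative control-cycle period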
400
00:14:22.790 --> 00:14:24.320
We think this is very useful for
401
00:14:24.320 --> 00:14:27.760
attacking a diverse range of processes that have been
402
00:14:27.760 --> 00:14:30.690
too complex or too costly to automate
403
00:14:30.690 --> 00:14:32.840
otherwise, and our early successes
404
00:14:32.840 --> 00:14:34.900
have been in continuous-flow
405
00:14:34.900 --> 00:14:36.870
manufacturing processes,
406
00:14:36.870 --> 00:14:39.300
chemical conversion, polymer extrusion,
407
00:14:39.300 --> 00:14:41.780
and we think there's a broad applicability to this
408
00:14:41.780 --> 00:14:45.460
to areas like oil and gas, wire and cable, etc.
409
00:14:45.460 --> 00:14:47.360
One of my fears when we
410
00:14:47.360 --> 00:14:51.440
first got into plants to do live production trials
411
00:14:51.440 --> 00:14:54.130
is that the plant personnel might view this as
412
00:14:54.130 --> 00:14:56.730
sort of a threat, right? We're automating;
413
00:14:56.730 --> 00:14:58.680
that has some negative connotations
414
00:14:58.680 --> 00:15:02.040
sometimes in terms of impacts on people's jobs
415
00:15:02.040 --> 00:15:03.460
and so forth.
416
00:15:03.460 --> 00:15:06.580
But there's a couple of things I think really
417
00:15:06.580 --> 00:15:07.890
gained us some traction,
418
00:15:07.890 --> 00:15:10.280
and the reception has been quite warm.
419
00:15:10.280 --> 00:15:12.920
In fact, the plants are pulling very hard now
420
00:15:12.920 --> 00:15:14.310
to roll this out.
421
00:15:14.310 --> 00:15:18.220
My attitude is to really democratize the information
422
00:15:18.220 --> 00:15:20.253
and what's happening with the tool.
423
00:15:21.678 --> 00:15:23.700
For example, we spent
424
00:15:23.700 --> 00:15:25.640
quite some effort to make sure that
425
00:15:25.640 --> 00:15:27.640
operators in the plant environment
426
00:15:27.640 --> 00:15:29.620
had screens where they can
427
00:15:29.620 --> 00:15:32.320
see data streams in real time
428
00:15:32.320 --> 00:15:34.340
that they couldn't before.
429
00:15:34.340 --> 00:15:36.900
Sometimes they were data streams that we had created
430
00:15:36.900 --> 00:15:39.460
for the sake of the machine learning.
431
00:15:39.460 --> 00:15:41.510
We give them visibility into it.
432
00:15:41.510 --> 00:15:43.680
We give them visibility, if they want,
433
00:15:43.680 --> 00:15:47.120
into the decisions that the system is making.
434
00:15:47.120 --> 00:15:49.390
We also give them the ability to turn it off,
435
00:15:49.390 --> 00:15:51.610
ha ha, the big red button, right,
436
00:15:51.610 --> 00:15:53.097
if they're not comfortable with what the
437
00:15:53.097 --> 00:15:56.380
HAL 9000's doing on their production line.
438
00:15:56.380 --> 00:15:59.920
Also, we give them the ability to bias it.
439
00:15:59.920 --> 00:16:01.410
If they feel,
440
00:16:01.410 --> 00:16:02.420
based on their experience,
441
00:16:02.420 --> 00:16:04.390
that the system is making parts
442
00:16:04.390 --> 00:16:05.343
that are a little,
443
00:16:06.500 --> 00:16:08.140
let's just say too thin or too thick,
444
00:16:08.140 --> 00:16:10.210
they can bias it down a little bit.
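NOTE
A minimal sketch, in Python, of the operator controls just described: the "big red button" disengages the system entirely, and a bias knob nudges its output without touching the model. Names and scaling are hypothetical.
# Sketch: operator override layered over a learned controller's output.
def apply_operator_controls(model_setpoint, enabled, bias, manual_setpoint):
    """enabled: False means the red button is pressed; run manually instead.
    bias: small operator-chosen offset, e.g. to run parts a bit thicker."""
    if not enabled:
        return manual_setpoint
    return model_setpoint + bias
# System engaged, operator biases the output down a little.
print(apply_operator_controls(14.5, enabled=True, bias=-0.2, manual_setpoint=14.0))
# Red button pressed: the model's setpoint is ignored entirely.
print(apply_operator_controls(14.5, enabled=False, bias=-0.2, manual_setpoint=14.0))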
445
00:16:10.210 --> 00:16:12.040
I think that sort of exposure
446
00:16:12.040 --> 00:16:14.040
and opening up the black box,
447
00:16:14.040 --> 00:16:15.550
at least in the plant environment,
448
00:16:15.550 --> 00:16:17.270
is very critical to
449
00:16:17.270 --> 00:16:20.960
people buying in and believing in what's going on.
450
00:16:20.960 --> 00:16:23.570
One of our learnings at Liveline with that
451
00:16:23.570 --> 00:16:25.900
was the enhanced feedback that we get.
452
00:16:25.900 --> 00:16:29.290
We have received several very
453
00:16:29.290 --> 00:16:31.700
influential and useful ideas
454
00:16:31.700 --> 00:16:34.340
from people that were really doing nothing more
455
00:16:34.340 --> 00:16:37.040
than looking at data streams and watching
456
00:16:37.040 --> 00:16:39.360
the system making decisions.
457
00:16:39.360 --> 00:16:41.580
They asked good questions back to us
458
00:16:41.580 --> 00:16:43.440
and gave us good insights.
459
00:16:43.440 --> 00:16:44.870
Suggested
460
00:16:44.870 --> 00:16:46.300
new types of data that
461
00:16:46.300 --> 00:16:48.610
we could be tagging that might be useful
462
00:16:48.610 --> 00:16:50.880
once they really began to get a little intuition
463
00:16:50.880 --> 00:16:53.770
about what we were trying to do with the data science.
464
00:16:53.770 --> 00:16:56.360
I think that sort of democratization,
465
00:16:56.360 --> 00:16:58.140
if you will, of the system and
466
00:16:58.140 --> 00:17:00.930
opening up and exposing the guts as it
467
00:17:00.930 --> 00:17:03.590
does its thing has been, at least in this case,
468
00:17:03.590 --> 00:17:05.700
one of the success factors.
469
00:17:05.700 --> 00:17:08.600
That's a great example. It covers,
470
00:17:08.600 --> 00:17:09.433
Exactly. (laughs)
471
00:17:09.433 --> 00:17:10.460
Sam, I feel like it covers
472
00:17:10.460 --> 00:17:11.460
a lot of what we've talked
473
00:17:11.460 --> 00:17:13.940
about in our report,
474
00:17:13.940 --> 00:17:18.770
in terms of different modes of human-AI interaction.
475
00:17:18.770 --> 00:17:19.770
No black box.
476
00:17:19.770 --> 00:17:23.370
Allowing the human to override or bias,
477
00:17:23.370 --> 00:17:25.470
but also, I was going to ask you,
478
00:17:25.470 --> 00:17:27.180
Chris, you hit the point before
479
00:17:27.180 --> 00:17:28.570
I got a chance to ask you,
480
00:17:28.570 --> 00:17:30.070
which is the feedback loop.
481
00:17:30.070 --> 00:17:32.600
I guess my follow-on question is,
482
00:17:32.600 --> 00:17:34.890
how has that feedback loop been working
483
00:17:34.890 --> 00:17:35.920
in terms of
484
00:17:35.920 --> 00:17:37.290
maybe skeptics
485
00:17:37.290 --> 00:17:40.450
having become more sort of AI-friendly,
486
00:17:40.450 --> 00:17:42.360
or more trust having been formed
487
00:17:42.360 --> 00:17:45.540
between humans and AI?
488
00:17:45.540 --> 00:17:48.490
Any anecdotes you can comment on that?
489
00:17:48.490 --> 00:17:50.030
Absolutely.
490
00:17:50.030 --> 00:17:51.630
I'll give you a great anecdote from
491
00:17:51.630 --> 00:17:54.040
one of our plants in the southern U.S.
492
00:17:54.040 --> 00:17:56.890
In fact, this was the plant where we did our final
493
00:17:56.890 --> 00:17:58.360
pilot for Liveline
494
00:17:58.360 --> 00:18:01.220
before we made the decision as a company to go
495
00:18:01.220 --> 00:18:02.743
do a global rollout.
496
00:18:04.010 --> 00:18:06.460
We first had the line running in
497
00:18:06.460 --> 00:18:08.260
what we call automatic mode.
498
00:18:08.260 --> 00:18:11.203
Gosh, I think it was about Q3 of last year.
499
00:18:13.130 --> 00:18:15.360
One of the criteria for the pilot
500
00:18:15.360 --> 00:18:18.340
was that we would do some A-versus-B runs.
501
00:18:18.340 --> 00:18:19.920
Concept's very simple.
502
00:18:19.920 --> 00:18:21.690
For these four hours, we're going to run
503
00:18:21.690 --> 00:18:24.300
with the system engaged in automatic mode.
504
00:18:24.300 --> 00:18:25.990
For these four hours, we're going to turn it off,
505
00:18:25.990 --> 00:18:29.950
and you all can run the plant like you always do.
506
00:18:29.950 --> 00:18:31.780
Then, over a series of days and weeks,
507
00:18:31.780 --> 00:18:34.090
we'll add up the statistics about
508
00:18:34.090 --> 00:18:36.080
scrap rates and quality
509
00:18:36.080 --> 00:18:38.520
and unplanned line stops,
510
00:18:38.520 --> 00:18:41.923
and we will quantify exactly what the value was.
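NOTE
A sketch, in Python, of the A-versus-B bookkeeping described above: tally scrap and unplanned stops per run block, then compare system-on against system-off hours. All numbers are made up for illustration.
# Sketch: aggregate per-block statistics for the A-vs-B comparison.
blocks = [  # (mode, parts_made, parts_scrapped, unplanned_stops)
    ("on", 4000, 40, 1), ("off", 3900, 95, 3),
    ("on", 4100, 35, 0), ("off", 4050, 88, 2),
]
def summarize(mode):
    made = sum(b[1] for b in blocks if b[0] == mode)
    scrap = sum(b[2] for b in blocks if b[0] == mode)
    stops = sum(b[3] for b in blocks if b[0] == mode)
    return scrap / made, stops
(on_rate, on_stops), (off_rate, off_stops) = summarize("on"), summarize("off")
print(f"scrap rate, system on vs. off: {on_rate:.2%} vs. {off_rate:.2%}")
print(f"unplanned stops, system on vs. off: {on_stops} vs. {off_stops}")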
511
00:18:42.920 --> 00:18:44.500
We came to the first review point
512
00:18:44.500 --> 00:18:46.270
a few weeks into that,
513
00:18:46.270 --> 00:18:48.010
and as I sat with the team,
514
00:18:48.010 --> 00:18:49.280
they sort of
515
00:18:49.280 --> 00:18:50.890
pulled up chairs and looked at their shoes
516
00:18:50.890 --> 00:18:52.340
and said, "Hey, we have an issue.
517
00:18:52.340 --> 00:18:55.103
We don't have the B data with the system off."
518
00:18:56.060 --> 00:18:57.070
I said, "Why is that?"
519
00:18:57.070 --> 00:18:58.680
They said, "Because once the plant turned on,
520
00:18:58.680 --> 00:19:00.380
they refused to turn it off again."
521
00:19:02.520 --> 00:19:04.770
They don't want to run with the system disengaged anymore
522
00:19:04.770 --> 00:19:07.600
because the impact was so significant to them
523
00:19:07.600 --> 00:19:10.830
and helped them to operate the lines better
524
00:19:10.830 --> 00:19:12.340
that they don't want to run with it off anymore.
525
00:19:12.340 --> 00:19:14.960
That was very consistent with the type of
526
00:19:14.960 --> 00:19:16.440
reaction we saw in other
527
00:19:16.440 --> 00:19:19.800
pilots in Canada and our tech center in Michigan.
528
00:19:19.800 --> 00:19:20.640
That is great.
529
00:19:20.640 --> 00:19:23.410
Yeah, that sort of feedback is very reassuring,
530
00:19:23.410 --> 00:19:25.090
but again, I think that
531
00:19:25.090 --> 00:19:27.480
from the get-go, having a philosophy of really
532
00:19:27.480 --> 00:19:30.930
just opening up and showing people what's going on,
533
00:19:30.930 --> 00:19:32.200
letting them look at data,
534
00:19:32.200 --> 00:19:33.860
be participants in
535
00:19:33.860 --> 00:19:36.820
problem-solving and tuning and enhancement,
536
00:19:36.820 --> 00:19:39.440
really sets the stage for that emotional
537
00:19:39.440 --> 00:19:41.990
connection and commitment to the project.
538
00:19:41.990 --> 00:19:43.300
Those seem like some very different
539
00:19:43.300 --> 00:19:45.930
ways of getting feedback to a system,
540
00:19:45.930 --> 00:19:47.400
and then the other one you mentioned
541
00:19:47.400 --> 00:19:48.445
was the idea of
542
00:19:48.445 --> 00:19:52.030
suggesting new tags or new data to come back in.
543
00:19:52.030 --> 00:19:52.863
Right.
544
00:19:52.863 --> 00:19:53.880
I can see, for example,
545
00:19:53.880 --> 00:19:56.080
adjusting the bias being a real-time
546
00:19:56.080 --> 00:19:57.360
sort of feedback, and clearly,
547
00:19:57.360 --> 00:19:59.460
pressing the red button would happen immediately,
548
00:19:59.460 --> 00:20:02.073
I hope. That's the point of a red button.
549
00:20:03.070 --> 00:20:06.300
How do the processes work for some of this
550
00:20:06.300 --> 00:20:07.860
non-immediate feedback?
551
00:20:07.860 --> 00:20:09.237
What do you do with the suggestions
552
00:20:09.237 --> 00:20:11.090
for new data and tags?
553
00:20:11.090 --> 00:20:12.800
Is there a process around those?
554
00:20:12.800 --> 00:20:13.760
This is, by the way,
555
00:20:13.760 --> 00:20:16.400
sorry to interrupt your response,
556
00:20:16.400 --> 00:20:19.370
this is Sam's and, to some extent,
557
00:20:19.370 --> 00:20:22.369
my chemical engineering background coming out.
558
00:20:22.369 --> 00:20:23.710
And you can think of it this way:
559
00:20:23.710 --> 00:20:24.910
at least for Cooper Standard,
560
00:20:24.910 --> 00:20:27.450
the majority of our lines are chemical processing lines.
561
00:20:27.450 --> 00:20:28.730
We're taking
562
00:20:28.730 --> 00:20:30.060
different types of compounds,
563
00:20:30.060 --> 00:20:32.030
and we're extruding them and,
564
00:20:32.030 --> 00:20:33.630
in the case of thermosets,
565
00:20:33.630 --> 00:20:35.790
putting them through oven stages,
566
00:20:35.790 --> 00:20:37.357
200 meters of processes,
567
00:20:37.357 --> 00:20:39.200
and a lot of it is chemistry, as it goes.
568
00:20:39.200 --> 00:20:42.043
Yeah, so you guys are in your sweet spot.
569
00:20:43.379 --> 00:20:45.750
What's the process for new data tags?
570
00:20:45.750 --> 00:20:47.620
How do you formalize that process
571
00:20:47.620 --> 00:20:50.268
in something that's less real time?
572
00:20:50.268 --> 00:20:52.380
I'll give you a real example.
573
00:20:52.380 --> 00:20:55.240
We were doing a pilot maybe a year ago,
574
00:20:55.240 --> 00:20:57.980
and we had a process engineer who's not
575
00:20:57.980 --> 00:21:00.350
a machine learning expert,
576
00:21:00.350 --> 00:21:02.780
watching the system run,
577
00:21:02.780 --> 00:21:04.040
looking at the data,
578
00:21:04.040 --> 00:21:07.610
looking at the analysis
579
00:21:07.610 --> 00:21:09.730
that the machine learning had generated
580
00:21:09.730 --> 00:21:12.570
and how predictable the outcomes
581
00:21:12.570 --> 00:21:14.663
from the line were.
582
00:21:16.431 --> 00:21:17.400
At that stage,
583
00:21:17.400 --> 00:21:19.790
we weren't getting the results that we wanted,
584
00:21:19.790 --> 00:21:22.720
and we were seeing variation in output in the real world
585
00:21:22.720 --> 00:21:24.610
that we weren't picking up and predicting
586
00:21:24.610 --> 00:21:26.313
in the silicon world.
587
00:21:27.580 --> 00:21:28.940
As he was watching the line
588
00:21:28.940 --> 00:21:30.180
he said, "Look, I have a theory.
589
00:21:30.180 --> 00:21:33.100
I have a theory that there's something going on
590
00:21:33.100 --> 00:21:36.110
with one of the raw materials we're feeding the line.
591
00:21:36.110 --> 00:21:40.720
My theory is that material is more susceptible
592
00:21:40.720 --> 00:21:44.290
to the history of temperature and humidity that
593
00:21:44.290 --> 00:21:46.970
it's experienced as it was shipped to the plant.
594
00:21:46.970 --> 00:21:49.320
Why don't we throw some data-logging
595
00:21:49.320 --> 00:21:51.140
devices on those pallets
596
00:21:51.140 --> 00:21:53.250
as we ship it around the country
597
00:21:53.250 --> 00:21:55.328
and be able to look at that
598
00:21:55.328 --> 00:21:57.600
time and temperature history
599
00:21:57.600 --> 00:21:59.130
and integrate that into the analytics and
600
00:21:59.130 --> 00:22:02.520
see if it would help us be more predictive?"
601
00:22:02.520 --> 00:22:04.360
Lo and behold, that was actually helpful.
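NOTE
A hedged sketch, in Python, of the engineer's idea: reduce each pallet's in-transit temperature and humidity log to summary features that can be joined to the line data. Thresholds and field names are illustrative, not from the episode.
# Sketch: turn a pallet's shipment log into model features.
def shipment_features(log):
    """log: list of (temp_C, humidity_pct) samples from the pallet's
    data logger, assumed here to be hourly readings in transit."""
    temps = [t for t, _ in log]
    hums = [h for _, h in log]
    return {
        "max_temp_C": max(temps),
        "hours_above_30C": sum(1 for t in temps if t > 30.0),
        "mean_humidity_pct": sum(hums) / len(hums),
    }
# One pallet's features, ready to join to that material batch's line data.
print(shipment_features([(21.0, 48.0), (33.5, 60.0), (31.2, 74.0), (24.8, 52.0)]))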
602
00:22:04.360 --> 00:22:09.360
That's a real-life example of a non-AI expert
603
00:22:09.520 --> 00:22:10.880
interacting with the system
604
00:22:10.880 --> 00:22:13.600
and using their human judgment to suggest
605
00:22:13.600 --> 00:22:14.710
ways to improve it,
606
00:22:14.710 --> 00:22:18.023
even though they can't write AI code.
607
00:22:19.237 --> 00:22:21.260
Once we had exposed enough
608
00:22:21.260 --> 00:22:22.400
of them to what's going on that they were
609
00:22:22.400 --> 00:22:24.930
able to get some human intuition about
610
00:22:24.930 --> 00:22:25.763
what's happening here,
611
00:22:25.763 --> 00:22:27.600
then they were able to participate in the process.
612
00:22:27.600 --> 00:22:29.450
That's a very powerful thing.
613
00:22:29.450 --> 00:22:32.040
Chris, I want to ask you about talent.
614
00:22:32.040 --> 00:22:33.770
You've been talking about a lot of innovation,
615
00:22:33.770 --> 00:22:35.840
a lot of cool ideas.
616
00:22:35.840 --> 00:22:38.670
Different groups, internal and external,
617
00:22:38.670 --> 00:22:40.530
coming together to really
618
00:22:40.530 --> 00:22:41.870
experiment new things,
619
00:22:41.870 --> 00:22:43.310
try new things,
620
00:22:43.310 --> 00:22:46.100
make a really game-changing impact.
621
00:22:46.100 --> 00:22:49.850
What do you think it takes to get the right talent,
622
00:22:49.850 --> 00:22:50.760
motivate them,
623
00:22:50.760 --> 00:22:52.580
keep them excited,
624
00:22:52.580 --> 00:22:55.970
and sort of get that virtuous cycle of
625
00:22:55.970 --> 00:22:59.513
excitement and energy and innovation going?
626
00:23:00.410 --> 00:23:01.790
That's a great question.
627
00:23:01.790 --> 00:23:04.650
I think the answer may be a little different depending on
628
00:23:04.650 --> 00:23:07.313
what sort of technical talent you're talking about.
629
00:23:08.320 --> 00:23:10.620
The way that we would think about a
630
00:23:10.620 --> 00:23:13.330
manufacturing process engineer or controls engineer
631
00:23:13.330 --> 00:23:15.540
may be a bit different from how we think about
632
00:23:15.540 --> 00:23:16.590
folks with different
633
00:23:16.590 --> 00:23:18.890
skills in the world of AI,
634
00:23:18.890 --> 00:23:20.510
and sometimes the talent is
635
00:23:20.510 --> 00:23:22.260
in different places in the country.
636
00:23:23.550 --> 00:23:25.720
I'm not sure there's a one-size-fits-all answer.
637
00:23:25.720 --> 00:23:26.940
I think,
638
00:23:26.940 --> 00:23:29.030
in general, when
639
00:23:29.030 --> 00:23:31.650
we find people that we would like to
640
00:23:31.650 --> 00:23:32.860
bring into the company,
641
00:23:32.860 --> 00:23:35.730
I think if we can show them
642
00:23:35.730 --> 00:23:39.320
that the sustained commitment to innovation
643
00:23:39.320 --> 00:23:41.110
and doing cool stuff
644
00:23:41.110 --> 00:23:42.060
is real,
645
00:23:42.060 --> 00:23:43.890
that helps a lot.
646
00:23:43.890 --> 00:23:46.050
I think being able to prove to people
647
00:23:46.050 --> 00:23:47.270
that you're willing to
648
00:23:47.270 --> 00:23:50.700
stay the course in what you're investing in
649
00:23:50.700 --> 00:23:52.223
is part of the story.
650
00:23:53.343 --> 00:23:55.080
Then the second thing that I think is important
651
00:23:55.080 --> 00:23:56.660
is just the culture.
652
00:23:56.660 --> 00:23:58.810
Having people believe that,
653
00:23:58.810 --> 00:24:00.720
in addition to investments
654
00:24:00.720 --> 00:24:03.020
in resource availability,
655
00:24:03.020 --> 00:24:04.920
we're just serious about being innovative.
656
00:24:04.920 --> 00:24:06.300
We're just serious
657
00:24:06.300 --> 00:24:08.150
about doing things better.
658
00:24:08.150 --> 00:24:10.970
We're serious about winning through technology,
659
00:24:10.970 --> 00:24:13.800
from the boardroom to the shop floor.
660
00:24:13.800 --> 00:24:16.360
If that culture is real, people know it,
661
00:24:16.360 --> 00:24:18.080
and if it's not real and you're faking it,
662
00:24:18.080 --> 00:24:19.890
I think people know it.
663
00:24:19.890 --> 00:24:22.240
You can't earn that in a quarter;
664
00:24:22.240 --> 00:24:25.597
you've got to earn it over a couple or several years.
665
00:24:25.597 --> 00:24:27.740
I like to think that we've done
666
00:24:27.740 --> 00:24:30.520
a pretty good job with that, but that's really key
667
00:24:30.520 --> 00:24:31.353
in my mind.
668
00:24:32.250 --> 00:24:33.550
Chris, many thanks for taking
669
00:24:33.550 --> 00:24:34.640
the time to talk with us today.
670
00:24:34.640 --> 00:24:37.240
You brought out some quite interesting points.
671
00:24:37.240 --> 00:24:38.490
Thanks for taking the time.
672
00:24:38.490 --> 00:24:40.130
Yeah, Chris, thank you so much.
673
00:24:40.130 --> 00:24:41.390
You're more than welcome.
674
00:24:41.390 --> 00:24:43.240
Hopefully, you can tell I'm excited about
675
00:24:43.240 --> 00:24:45.150
AI. I'm excited about what it can do
676
00:24:45.150 --> 00:24:47.920
in manufacturing as well as other industries.
677
00:24:47.920 --> 00:24:50.080
I think it's going to be a fun future,
678
00:24:50.080 --> 00:24:51.685
and I'm looking forward to helping build it.
679
00:24:51.685 --> 00:24:54.352
(upbeat melody)
680
00:24:57.970 --> 00:25:00.120
Shervin, Chris covered quite a few key points.
681
00:25:00.120 --> 00:25:02.386
What struck you as particularly noteworthy?
682
00:25:02.386 --> 00:25:05.190
I thought that was really, really insightful.
683
00:25:05.190 --> 00:25:07.030
Obviously, they've done a ton with AI
684
00:25:07.030 --> 00:25:08.830
and a lot of innovation
685
00:25:08.830 --> 00:25:10.430
and cool ideas, and they've put
686
00:25:10.430 --> 00:25:12.400
many of those into production.
687
00:25:12.400 --> 00:25:16.120
I felt a lot of the key sort of
688
00:25:16.120 --> 00:25:18.180
steps in getting value from AI
689
00:25:18.180 --> 00:25:19.920
that we've been talking about
690
00:25:19.920 --> 00:25:22.040
were echoed in what he talked about.
691
00:25:22.040 --> 00:25:23.220
The notion of
692
00:25:23.220 --> 00:25:25.715
experimentation and test and learn.
693
00:25:25.715 --> 00:25:30.190
The culture and importance of allowing folks to try ideas
694
00:25:30.190 --> 00:25:32.540
and fail fast and then moving on.
695
00:25:32.540 --> 00:25:35.570
The notion of focusing on
696
00:25:35.570 --> 00:25:37.490
a few things to scale,
697
00:25:37.490 --> 00:25:40.740
focusing on a lot to sort of test and prototype,
698
00:25:40.740 --> 00:25:43.830
but a few to scale and invest behind.
699
00:25:43.830 --> 00:25:45.460
I thought that was really interesting.
700
00:25:45.460 --> 00:25:47.370
I thought Chris also had a nice blend
701
00:25:47.370 --> 00:25:49.563
of both excitement and patience.
702
00:25:50.636 --> 00:25:51.469
I mean, clearly excited
703
00:25:51.469 --> 00:25:52.750
about some of the things they're doing,
704
00:25:52.750 --> 00:25:54.470
but at the same time,
705
00:25:54.470 --> 00:25:55.670
some of the initiatives were taking
706
00:25:55.670 --> 00:25:57.930
two years or so to come to fruition.
707
00:25:57.930 --> 00:25:59.120
That has to be hard to balance.
708
00:25:59.120 --> 00:26:00.220
Being excited about something
709
00:26:00.220 --> 00:26:02.740
and then also waiting two years for it to come out.
710
00:26:02.740 --> 00:26:04.430
I thought that was a nice blend.
711
00:26:04.430 --> 00:26:06.480
Yeah, and also, to that point,
712
00:26:06.480 --> 00:26:08.060
the importance of focus, right?
713
00:26:08.060 --> 00:26:09.960
I mean, once you've picked it
714
00:26:09.960 --> 00:26:12.920
and you've decided that this is the right thing to do,
715
00:26:12.920 --> 00:26:14.820
and you're sort of seeing it
716
00:26:14.820 --> 00:26:16.760
progressing toward that,
717
00:26:16.760 --> 00:26:19.680
realizing that it's not time to give up now,
718
00:26:19.680 --> 00:26:22.790
you just have to mobilize and double down on it.
719
00:26:22.790 --> 00:26:24.830
The one thing that really struck me a lot was
720
00:26:24.830 --> 00:26:27.180
the importance of culture
721
00:26:27.180 --> 00:26:29.260
and how he said,
722
00:26:29.260 --> 00:26:30.810
from boardroom
723
00:26:30.810 --> 00:26:33.330
all the way to middle management,
724
00:26:33.330 --> 00:26:34.820
they have to believe
725
00:26:36.150 --> 00:26:37.530
that we're behind it,
726
00:26:37.530 --> 00:26:40.540
and we're investing, and this is not just a fad.
727
00:26:40.540 --> 00:26:42.250
That has to be sort of
728
00:26:42.250 --> 00:26:44.590
permeating across the entire organization
729
00:26:44.590 --> 00:26:47.410
to keep talent really excited and interested.
730
00:26:47.410 --> 00:26:48.950
And it went down even into the
731
00:26:48.950 --> 00:26:50.370
people who were using the systems.
732
00:26:50.370 --> 00:26:52.490
I thought that was a beautiful example of
733
00:26:52.490 --> 00:26:54.700
people who may have been so busy
734
00:26:54.700 --> 00:26:56.980
trying to just get their job done
735
00:26:56.980 --> 00:26:58.330
that they couldn't
736
00:26:58.330 --> 00:26:59.730
step back and think a little bit.
737
00:26:59.730 --> 00:27:02.060
He gave a great example of how
738
00:27:02.060 --> 00:27:05.200
that freedom of having the machine do some of the work
739
00:27:05.200 --> 00:27:07.730
let the human do things that humans are good at.
740
00:27:07.730 --> 00:27:09.960
He covered almost all of the steps in our prior report
741
00:27:09.960 --> 00:27:11.780
about the different ways of people
742
00:27:11.780 --> 00:27:13.120
working with machines.
743
00:27:13.120 --> 00:27:14.700
We didn't prompt him for that.
744
00:27:14.700 --> 00:27:16.290
The other thing I really liked
745
00:27:16.290 --> 00:27:18.700
was his view on talent.
746
00:27:18.700 --> 00:27:22.180
I asked him, what does it take to recruit and
747
00:27:22.180 --> 00:27:24.120
cultivate and retain good talent?
748
00:27:24.120 --> 00:27:26.430
He said, it's not a one-size-fits-all.
749
00:27:26.430 --> 00:27:28.120
That recognition that
750
00:27:28.120 --> 00:27:30.630
not all talent is cut
751
00:27:30.630 --> 00:27:32.010
from the same cloth,
752
00:27:32.010 --> 00:27:33.950
and different people, different skill sets,
753
00:27:33.950 --> 00:27:34.783
have different
754
00:27:34.783 --> 00:27:35.750
sensibilities, and they're
755
00:27:35.750 --> 00:27:37.210
looking for different things,
756
00:27:37.210 --> 00:27:39.020
but the common theme is that
757
00:27:40.090 --> 00:27:43.130
people who go there want a continuous
758
00:27:43.130 --> 00:27:46.603
focus and commitment to innovation,
759
00:27:47.723 --> 00:27:48.830
and they want to see that,
760
00:27:48.830 --> 00:27:50.490
and maybe that's the common thing.
761
00:27:50.490 --> 00:27:53.210
Then, data scientists and technologists
762
00:27:53.210 --> 00:27:55.959
and chemists and engineers might have different
763
00:27:55.959 --> 00:27:58.750
career paths and career aspirations,
764
00:27:58.750 --> 00:28:01.610
but they all share in that common
765
00:28:01.610 --> 00:28:02.900
striving for innovation.
766
00:28:02.900 --> 00:28:04.490
I don't think he mentioned it,
767
00:28:04.490 --> 00:28:06.910
but Chris is a Techstars mentor,
768
00:28:06.910 --> 00:28:08.440
and I'm sure that some of that background
769
00:28:08.440 --> 00:28:10.840
also influences the way he thinks about
770
00:28:10.840 --> 00:28:12.240
different people and different ideas
771
00:28:12.240 --> 00:28:14.420
and how that talent can come together.
772
00:28:14.420 --> 00:28:15.910
Yup, that's right.
773
00:28:15.910 --> 00:28:18.245
He didn't mention it, but that's true.
774
00:28:18.245 --> 00:28:19.078
(upbeat melody)
775
00:28:19.078 --> 00:28:20.640
Thanks for joining us today.
776
00:28:20.640 --> 00:28:22.460
Next time, we'll talk with Huiming Qu
777
00:28:22.460 --> 00:28:24.460
about how The Home Depot continues to build
778
00:28:24.460 --> 00:28:26.100
its AI capabilities.
779
00:28:26.100 --> 00:28:26.953
Please join us.
780
00:28:29.570 --> 00:28:32.320
Thanks for listening to Me, Myself, and AI.
781
00:28:32.320 --> 00:28:33.950
If you're enjoying the show,
782
00:28:33.950 --> 00:28:35.757
take a minute to write us a review.
783
00:28:35.757 --> 00:28:37.680
If you send us a screenshot,
784
00:28:37.680 --> 00:28:39.030
we'll send you a collection of
785
00:28:39.030 --> 00:28:42.580
MIT SMR's best articles on artificial intelligence,
786
00:28:42.580 --> 00:28:44.510
free for a limited time.
787
00:28:44.510 --> 00:28:46.130
Send your review screenshot to
788
00:28:46.130 --> 00:28:49.253
smrfeedback@mit.edu.