WEBVTT
1
00:00:03.330 --> 00:00:05.410
Can you get to the moon without first getting
2
00:00:05.410 --> 00:00:07.250
to your own roof?
3
00:00:07.250 --> 00:00:09.180
This will be the topic of our conversation
4
00:00:09.180 --> 00:00:11.963
with Will Grannis, Google Cloud CTO.
5
00:00:13.820 --> 00:00:15.810
Welcome to Me, Myself, and AI,
6
00:00:15.810 --> 00:00:19.020
a podcast on artificial intelligence in business.
7
00:00:19.020 --> 00:00:21.420
Each episode, we introduce you to someone innovating
8
00:00:21.420 --> 00:00:22.870
with AI.
9
00:00:22.870 --> 00:00:25.730
I'm Sam Ransbotham, Professor of Information Systems
10
00:00:25.730 --> 00:00:27.400
at Boston College.
11
00:00:27.400 --> 00:00:30.460
I'm also the Guest Editor for the AI and Business Strategy
12
00:00:30.460 --> 00:00:34.500
Big Idea Program at MIT Sloan Management Review.
13
00:00:34.500 --> 00:00:38.380
And I'm Shervin Khodabandeh, Senior Partner with
14
00:00:38.380 --> 00:00:42.190
BCG, and I co-lead BCG's AI practice in North America.
15
00:00:42.190 --> 00:00:46.260
And together, MIT SMR and BCG have been researching AI
16
00:00:46.260 --> 00:00:49.870
for five years, interviewing hundreds of practitioners
17
00:00:49.870 --> 00:00:53.820
and surveying thousands of companies on what it takes to
18
00:00:53.820 --> 00:00:56.520
build, deploy, and scale AI capabilities
19
00:00:56.520 --> 00:00:58.800
across the organization and really transform
20
00:00:58.800 --> 00:01:00.323
the way organizations operate.
21
00:01:01.768 --> 00:01:04.300
(upbeat music)
22
00:01:04.300 --> 00:01:05.550
We're talking with Will Grannis today.
23
00:01:05.550 --> 00:01:08.210
He's the founder and leader of the Office of the CTO
24
00:01:08.210 --> 00:01:09.580
at Google Cloud.
25
00:01:09.580 --> 00:01:11.580
Thank you for joining us today, Will.
26
00:01:11.580 --> 00:01:12.610
Yeah, great to be here.
27
00:01:12.610 --> 00:01:13.990
Thanks for having me.
28
00:01:13.990 --> 00:01:17.352
So it's quite a difference between being at
29
00:01:17.352 --> 00:01:19.380
Google Cloud and your background.
30
00:01:19.380 --> 00:01:21.360
So can you tell us a little bit about how you ended up
31
00:01:21.360 --> 00:01:22.920
where you are?
32
00:01:22.920 --> 00:01:26.010
Call it maybe a mix of formal education
33
00:01:26.010 --> 00:01:28.020
and informal education,
34
00:01:28.020 --> 00:01:32.210
formally, the Arizona public school system.
35
00:01:32.210 --> 00:01:35.180
And then later on West Point,
36
00:01:35.180 --> 00:01:39.190
math and engineering undergrad, and then later on UPenn,
37
00:01:39.190 --> 00:01:43.410
the University of Pennsylvania's Wharton School, for my MBA.
38
00:01:43.410 --> 00:01:44.710
Now maybe the more interesting part
39
00:01:44.710 --> 00:01:46.240
is the informal education.
40
00:01:46.240 --> 00:01:48.610
And this started in the third grade.
41
00:01:48.610 --> 00:01:51.500
And back then I think it was gaming
42
00:01:51.500 --> 00:01:54.420
that originally sparked my curiosity in technology.
43
00:01:54.420 --> 00:01:58.800
So this was Pong, Oregon Trail, Intellivision, Nintendo,
44
00:01:58.800 --> 00:02:00.360
all the gaming platforms.
45
00:02:00.360 --> 00:02:02.890
I was just fascinated that you could turn a disc
46
00:02:02.890 --> 00:02:06.560
on a handset, and you could see Tron move around on a screen.
47
00:02:06.560 --> 00:02:08.240
That was like the coolest thing ever.
48
00:02:08.240 --> 00:02:12.460
So today's manifestation is Khan Academy, edX, Codecademy,
49
00:02:12.460 --> 00:02:16.250
platforms like that, this entire online catalog of knowledge,
50
00:02:16.250 --> 00:02:18.400
thanks to my current employer, Google.
51
00:02:18.400 --> 00:02:20.740
And just as an example like this week,
52
00:02:20.740 --> 00:02:23.840
I'm porting some machine learning code to a microcontroller
53
00:02:23.840 --> 00:02:27.310
and brushing up on my C, thanks to these, what I'd call
54
00:02:27.310 --> 00:02:29.100
informal education platforms.
55
00:02:29.100 --> 00:02:32.220
So a journey that started with formal education,
56
00:02:32.220 --> 00:02:36.990
but was really accelerated by others, by curiosity,
57
00:02:36.990 --> 00:02:40.120
and by these informal platforms, where I could go explore
58
00:02:40.120 --> 00:02:42.380
the things I was really interested in.
59
00:02:42.380 --> 00:02:44.210
Something particular about artificial intelligence:
60
00:02:44.210 --> 00:02:47.870
we're so focused on games and whether or not the machine
61
00:02:47.870 --> 00:02:50.530
has beaten a human at this game or that game,
62
00:02:50.530 --> 00:02:53.860
when there seems to be such a difference between games
63
00:02:53.860 --> 00:02:55.710
and business scenarios.
64
00:02:55.710 --> 00:02:57.970
So how can we make that connection?
65
00:02:57.970 --> 00:03:01.080
How can we move from what we can learn from games
66
00:03:01.080 --> 00:03:04.493
to what businesses can learn from artificial intelligence?
67
00:03:05.470 --> 00:03:08.100
Gaming is exciting, and it is interesting.
68
00:03:08.100 --> 00:03:10.080
But let's take a foundational element of games.
69
00:03:10.080 --> 00:03:13.610
So understanding the environment that you're in and defining
70
00:03:13.610 --> 00:03:15.260
the problem you want to solve.
71
00:03:15.260 --> 00:03:17.800
What's the objective function, if you will?
72
00:03:17.800 --> 00:03:20.350
That is exactly the same question that every manufacturer,
73
00:03:20.350 --> 00:03:22.620
every retailer, or every financial services organization
74
00:03:22.620 --> 00:03:24.490
asks themselves when they're first starting to apply
75
00:03:24.490 --> 00:03:25.920
machine learning.
76
00:03:25.920 --> 00:03:28.030
And so in games, the objective functions tend
77
00:03:28.030 --> 00:03:30.740
to be a little bit more fun; that could be an adversarial
78
00:03:30.740 --> 00:03:34.510
game where you're trying to win and beat others.
79
00:03:34.510 --> 00:03:37.230
But those underpinnings of how to win in a game
80
00:03:37.230 --> 00:03:39.930
actually are very, very relevant to how you design
81
00:03:39.930 --> 00:03:43.770
machine learning in the real world to maximize
82
00:03:43.770 --> 00:03:45.860
any other type of objective function that you have.
83
00:03:45.860 --> 00:03:49.270
So for example in retail, if you're trying to decrease
84
00:03:49.270 --> 00:03:52.110
the friction of a consumer's online experience,
85
00:03:52.110 --> 00:03:54.690
you actually have some objectives that you're trying
86
00:03:54.690 --> 00:03:57.210
to optimize, and thinking about it like a game
87
00:03:57.210 --> 00:03:59.360
is actually a useful construct at the beginning
88
00:03:59.360 --> 00:04:01.250
of problem definition.
89
00:04:01.250 --> 00:04:03.340
What is it that we really want to achieve?
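NOTE
A hedged illustration of the "objective function" idea Will describes, sketched in Python. Every name here (checkout_friction, steps_to_purchase, abandonment_rate, the weights) is a hypothetical example added for this transcript, not anything from Google.
def checkout_friction(steps_to_purchase, abandonment_rate, w_steps=1.0, w_abandon=5.0):
    # Lower is better: fold the number of steps a shopper must take
    # and the observed cart-abandonment rate into a single score.
    return w_steps * steps_to_purchase + w_abandon * abandonment_rate
baseline = checkout_friction(6, 0.30)  # 6 steps, 30% abandonment -> 7.5
variant = checkout_friction(3, 0.22)   # 3 steps, 22% abandonment -> 4.1
A model or experiment "wins" if it drives this score down, just as a game agent wins against its scoring rule.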
90
00:04:03.340 --> 00:04:05.530
And I'll tell you that being around AI and machine learning
91
00:04:05.530 --> 00:04:08.220
now for a couple decades, when it was cool,
92
00:04:08.220 --> 00:04:10.770
when it wasn't cool, I can tell you that the problem
93
00:04:10.770 --> 00:04:13.500
definition and really getting a rich sense of the problem
94
00:04:13.500 --> 00:04:15.620
you're trying to solve is absolutely the number one most
95
00:04:15.620 --> 00:04:17.747
important criterion for being successful with AI
96
00:04:17.747 --> 00:04:19.310
and machine learning.
97
00:04:19.310 --> 00:04:21.270
Yeah, I think that's quite insightful, Will.
98
00:04:21.270 --> 00:04:24.320
And it's probably a very good segue to my question.
99
00:04:24.320 --> 00:04:27.090
That is, it feels like in almost any sector,
100
00:04:27.090 --> 00:04:31.210
what we're seeing is that there are winners and losers
101
00:04:31.210 --> 00:04:34.040
in terms of getting impact from AI.
102
00:04:34.040 --> 00:04:37.270
There are a lot fewer winners than there are losers.
103
00:04:37.270 --> 00:04:42.220
And I'm sure that many CEOs are looking at this wondering
104
00:04:42.220 --> 00:04:43.800
what is going on.
105
00:04:43.800 --> 00:04:47.400
And I deeply believe a lot of it is what you said,
106
00:04:47.400 --> 00:04:49.470
which is, it absolutely has to start
107
00:04:49.470 --> 00:04:54.030
with the problem definition and getting the perspective
108
00:04:54.030 --> 00:04:58.340
of business users and process owners and line managers
109
00:04:58.340 --> 00:05:01.870
into that problem definition, which is critical.
110
00:05:01.870 --> 00:05:05.010
And since we're talking about this, it would be interesting
111
00:05:05.010 --> 00:05:08.070
to get your views on what are some of the success factors
112
00:05:08.070 --> 00:05:11.200
from where you're sitting and where you're observing
113
00:05:11.200 --> 00:05:13.173
to get maximum impact from AI?
114
00:05:14.290 --> 00:05:16.910
Well, I can't speak to exactly why every company
115
00:05:16.910 --> 00:05:18.410
is successful or unsuccessful with AI.
116
00:05:18.410 --> 00:05:21.060
But I can give you a couple of principles
117
00:05:21.060 --> 00:05:24.480
that we try to apply, that I try to apply generally.
118
00:05:24.480 --> 00:05:27.970
I think today we hear and we see a lot about AI,
119
00:05:27.970 --> 00:05:29.623
and the magic that it creates.
120
00:05:30.550 --> 00:05:32.780
And I think sometimes that does a disservice
121
00:05:32.780 --> 00:05:35.020
to people who are trying to implement it in production.
122
00:05:35.020 --> 00:05:36.670
I'll give you an example.
123
00:05:36.670 --> 00:05:39.680
Where did we start with AI at Google?
124
00:05:39.680 --> 00:05:42.120
Well, it was in a place where we already had really
125
00:05:42.120 --> 00:05:45.790
well-constructed data pipelines, or we had already exhausted
126
00:05:45.790 --> 00:05:50.190
the heuristics that we were using to determine performance.
127
00:05:50.190 --> 00:05:53.800
And instead, we looked at machine learning as one option
128
00:05:53.800 --> 00:05:58.570
to improve our lift on advertising,
129
00:05:58.570 --> 00:06:00.120
for example.
130
00:06:00.120 --> 00:06:03.000
And it was only because we already had all the foundational
131
00:06:03.000 --> 00:06:07.490
work done, we understood how to curate,
132
00:06:07.490 --> 00:06:10.280
extract, transform, load data, how to share it,
133
00:06:10.280 --> 00:06:12.530
how to think about what that data might yield
134
00:06:12.530 --> 00:06:14.880
in terms of outcomes, how to construct experiments,
135
00:06:14.880 --> 00:06:17.680
design of experiments, and utilize that data effectively
136
00:06:17.680 --> 00:06:20.150
and efficiently, that we were able to test the frontier
137
00:06:20.150 --> 00:06:21.690
of machine learning within our organization.
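NOTE
A toy sketch of the extract-transform-load groundwork Will describes as a precondition for ML, assuming a hypothetical CSV of ad events; this shows the general pattern in Python, not Google's tooling.
import csv
def extract_transform_load(path):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))           # extract: read the raw event rows
    rows = [r for r in rows if r.get("clicks")]  # transform: drop incomplete rows
    for r in rows:
        r["clicks"] = int(r["clicks"])           # normalize types for analysis
    return rows                                  # load: hand the clean table to BI or ML
Only once a pipeline like this is trustworthy does it make sense to layer experiments and models on top of it.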
138
00:06:21.690 --> 00:06:24.260
And maybe to your question, maybe one of the biggest
139
00:06:24.260 --> 00:06:26.619
opportunities for most organizations today,
140
00:06:26.619 --> 00:06:28.560
maybe it will be machine learning.
141
00:06:28.560 --> 00:06:32.060
But maybe today, it's actually in how they leverage data,
142
00:06:32.060 --> 00:06:34.500
how they share, how they collaborate around data.
143
00:06:34.500 --> 00:06:37.390
How they enrich it, how they make it easy to share
144
00:06:37.390 --> 00:06:39.760
with groups that have high sophistication levels,
145
00:06:39.760 --> 00:06:40.593
like data scientists.
146
00:06:40.593 --> 00:06:44.710
But also, analysts and business intelligence professionals
147
00:06:44.710 --> 00:06:47.170
who are trying to answer a difficult question
148
00:06:47.170 --> 00:06:49.100
in a short period of time for the head of the line
149
00:06:49.100 --> 00:06:49.970
of business.
150
00:06:49.970 --> 00:06:53.530
And unless you have that level of data sophistication,
151
00:06:53.530 --> 00:06:55.390
machine learning will probably be out of reach
152
00:06:55.390 --> 00:06:57.500
for the foreseeable future.
153
00:06:57.500 --> 00:06:59.980
Yeah, Will, one other place I thought you might
154
00:06:59.980 --> 00:07:01.870
go is building on what you were saying earlier
155
00:07:01.870 --> 00:07:06.650
about the analog between gaming and business,
156
00:07:06.650 --> 00:07:08.230
all around problem definition.
157
00:07:08.230 --> 00:07:10.610
Now, it's important to get the problem definition right.
158
00:07:10.610 --> 00:07:12.370
And what resonated with me when you were saying
159
00:07:12.370 --> 00:07:14.870
that was probably a lot of companies just don't know
160
00:07:14.870 --> 00:07:18.120
how to make that connection, and don't know where to get
161
00:07:18.120 --> 00:07:21.180
started, which is: what is the actual problem
162
00:07:21.180 --> 00:07:23.630
that we're trying to solve with AI?
163
00:07:23.630 --> 00:07:27.000
And many are focusing on what are all the cool things AI
164
00:07:27.000 --> 00:07:29.470
can do and what's all the data and technology we need,
165
00:07:29.470 --> 00:07:32.250
rather than actually starting with the problem definition
166
00:07:32.250 --> 00:07:36.100
and working their way backwards from the problem definition
167
00:07:36.100 --> 00:07:39.560
to the data, and then how can AI help them solve
168
00:07:39.560 --> 00:07:40.750
that problem.
169
00:07:40.750 --> 00:07:42.830
It's really a mindset.
170
00:07:42.830 --> 00:07:45.220
I'll share a little inside scoop.
171
00:07:45.220 --> 00:07:48.960
At Google, we have an internal document that our engineers
172
00:07:48.960 --> 00:07:51.720
have written to help each other out with getting started
173
00:07:51.720 --> 00:07:52.553
on machine learning.
174
00:07:52.553 --> 00:07:55.880
And there's a list of 72 factors,
175
00:07:55.880 --> 00:07:59.230
things you need to do to be successful in machine learning.
176
00:07:59.230 --> 00:08:01.823
And number one is, you don't need machine learning.
177
00:08:03.637 --> 00:08:06.230
And the reason why it's stated so strongly is actually
178
00:08:06.230 --> 00:08:09.320
to get the mindset of uncovering the richness
179
00:08:09.320 --> 00:08:10.850
of the problem.
180
00:08:10.850 --> 00:08:13.610
And the nuances of that problem actually create
181
00:08:13.610 --> 00:08:16.250
all of the downstream, to your point, all of the downstream
182
00:08:16.250 --> 00:08:17.940
implementation decisions.
183
00:08:17.940 --> 00:08:22.940
So if you want to reduce friction in online checkout,
184
00:08:23.310 --> 00:08:26.870
that is a different problem than trying to optimize
185
00:08:26.870 --> 00:08:30.500
really great recommendations within someone's
186
00:08:30.500 --> 00:08:32.610
e-commerce experience online for retail.
187
00:08:32.610 --> 00:08:33.870
Those are two very different problems,
188
00:08:33.870 --> 00:08:35.360
and you might approach them very differently.
189
00:08:35.360 --> 00:08:37.930
They might have completely different data sets,
190
00:08:37.930 --> 00:08:40.800
they might have completely different outcomes
191
00:08:40.800 --> 00:08:42.160
on your business.
192
00:08:42.160 --> 00:08:44.370
And so one of the things that we've done here at Google
193
00:08:44.370 --> 00:08:48.040
over time is we tried to take our internal shorthand
194
00:08:48.040 --> 00:08:51.290
for innovation, our approach to innovation and creativity,
195
00:08:51.290 --> 00:08:54.030
and we've tried to codify it so that we can be consistent
196
00:08:54.030 --> 00:08:56.700
in how we execute projects, especially the ones that venture
197
00:08:56.700 --> 00:08:59.230
into the murkiness of the future.
198
00:08:59.230 --> 00:09:04.230
And this framework, it really has three principles.
199
00:09:04.320 --> 00:09:07.420
And the first one, as you might expect, is to focus
200
00:09:07.420 --> 00:09:09.760
on the user, which is really a way of saying
201
00:09:09.760 --> 00:09:11.960
let's get after the problem, the pain that they care
202
00:09:11.960 --> 00:09:13.090
the most about.
203
00:09:13.090 --> 00:09:15.400
The second step is to think 10x.
204
00:09:15.400 --> 00:09:18.530
Because we know it is going to be worth the investment
205
00:09:18.530 --> 00:09:22.210
of all of these cross-functional teams' time to create
206
00:09:22.210 --> 00:09:24.870
the data pipelines and to curate them and to test
207
00:09:24.870 --> 00:09:28.390
for potential bias within these pipelines
208
00:09:28.390 --> 00:09:31.540
or within data sets, to build models,
209
00:09:31.540 --> 00:09:32.930
and to test those models.
210
00:09:32.930 --> 00:09:35.430
That's a significant investment of time and expertise
211
00:09:35.430 --> 00:09:36.610
and attention.
212
00:09:36.610 --> 00:09:39.200
And so we want to make sure we're solving for a problem
213
00:09:39.200 --> 00:09:42.010
that also has the scale that will be worth it,
214
00:09:42.010 --> 00:09:44.050
and really advances whatever we're trying to do
215
00:09:44.050 --> 00:09:46.690
not in a small way, but in a really big way.
216
00:09:46.690 --> 00:09:48.900
And then the third one is rapid prototyping.
217
00:09:48.900 --> 00:09:50.420
And you can't get to the rapid prototyping
218
00:09:50.420 --> 00:09:51.540
unless you've thought through the problem.
219
00:09:51.540 --> 00:09:54.930
You've constructed your environment so that you can conduct
220
00:09:54.930 --> 00:09:59.030
these experiments rapidly, and sometimes we'll proxy
221
00:09:59.030 --> 00:10:02.590
outcomes just to see if we care about them at all,
222
00:10:02.590 --> 00:10:04.400
without running them at full production.
223
00:10:04.400 --> 00:10:07.830
So that framework, that focusing on the user, thinking 10x,
224
00:10:07.830 --> 00:10:11.000
and then rapid prototyping, is an approach that we use
225
00:10:11.000 --> 00:10:14.393
across Google, regardless of product domain.
226
00:10:15.370 --> 00:10:19.290
That's really insightful, especially that
227
00:10:19.290 --> 00:10:21.300
think 10x piece, which I think is really, really helpful.
228
00:10:21.300 --> 00:10:22.810
I really like that.
229
00:10:22.810 --> 00:10:26.940
You're lobbying, I think, for what we'd call a very
230
00:10:26.940 --> 00:10:29.730
strong exploration mindset in your approach
231
00:10:29.730 --> 00:10:33.742
to artificial intelligence, versus more of an incremental,
232
00:10:33.742 --> 00:10:36.750
or a let's-do-what-we-have-better approach.
233
00:10:36.750 --> 00:10:38.070
Is that right for everybody?
234
00:10:38.070 --> 00:10:39.280
Do you think that that's ...
235
00:10:39.280 --> 00:10:41.517
Is that idiosyncratic to Google?
236
00:10:41.517 --> 00:10:43.680
And I guess, almost everyone listening today
237
00:10:43.680 --> 00:10:47.040
is not going to be working at Google. Is that something
238
00:10:47.040 --> 00:10:49.290
that you think works in all kinds of places?
239
00:10:49.290 --> 00:10:52.350
That may be beyond what you can speak to,
240
00:10:52.350 --> 00:10:55.080
but how well do you think that that works across
241
00:10:55.080 --> 00:10:56.680
all organizations?
242
00:10:56.680 --> 00:10:57.746
Well, I think there's a difference
243
00:10:57.746 --> 00:10:58.837
between a mindset
244
00:10:58.837 --> 00:11:03.837
and then the way that these principles manifest themselves.
245
00:11:04.580 --> 00:11:08.603
Machine learning just in its nature is exploration.
246
00:11:09.490 --> 00:11:11.420
It's approximations.
247
00:11:11.420 --> 00:11:14.100
And you're looking through the math, and you're looking
248
00:11:14.100 --> 00:11:17.270
for the places where you're pretty confident
249
00:11:17.270 --> 00:11:19.310
that things have changed significantly for the better
250
00:11:19.310 --> 00:11:21.530
or for the worse, so that you can do your feature
251
00:11:21.530 --> 00:11:25.160
engineering, and you can understand the impact of choices
252
00:11:25.160 --> 00:11:26.550
that you're making.
253
00:11:26.550 --> 00:11:30.760
And in a lot of ways, the mathematical exploration
254
00:11:30.760 --> 00:11:33.233
is an analog to the human exploration.
255
00:11:33.233 --> 00:11:35.700
That's what we try to encourage in people.
256
00:11:35.700 --> 00:11:38.210
By the way, just because we have a great idea
257
00:11:38.210 --> 00:11:39.520
doesn't mean it gets funded at Google.
258
00:11:39.520 --> 00:11:40.970
Yes, we are a very large company.
259
00:11:40.970 --> 00:11:44.190
Yes, we're doing pretty well.
260
00:11:44.190 --> 00:11:47.720
But most of our big breakthroughs have not come
261
00:11:47.720 --> 00:11:51.380
from some top-down-mandated, gigantic project that everybody
262
00:11:51.380 --> 00:11:53.239
said was going to be successful.
263
00:11:53.239 --> 00:11:55.950
Gmail was built by people who were told very early
264
00:11:55.950 --> 00:11:57.910
on they would never succeed.
265
00:11:57.910 --> 00:11:59.660
And we find that this is a very common path.
266
00:11:59.660 --> 00:12:02.110
And before Google, I've been an entrepreneur
267
00:12:02.110 --> 00:12:04.240
a couple of times in my own company and somebody else's.
268
00:12:04.240 --> 00:12:05.910
And I've worked in other large companies
269
00:12:05.910 --> 00:12:08.540
that had world-class engineering teams as well.
270
00:12:08.540 --> 00:12:11.800
And I can tell you, this is a pattern, which is giving
271
00:12:11.800 --> 00:12:16.800
people just enough freedom to think about what that future
272
00:12:17.340 --> 00:12:18.340
could look like.
273
00:12:18.340 --> 00:12:20.920
We have a way of describing 10x
274
00:12:20.920 --> 00:12:24.463
at Google that you may have heard of, called moonshots.
275
00:12:24.463 --> 00:12:26.220
Well, our internal engineering team has also coined
276
00:12:26.220 --> 00:12:30.050
the term roofshots, because the moonshots are often
277
00:12:30.050 --> 00:12:32.780
accomplished by a series of these roofshots.
278
00:12:32.780 --> 00:12:36.780
And if people don't believe in the end state,
279
00:12:36.780 --> 00:12:40.550
the big transformation, they're usually much less likely
280
00:12:40.550 --> 00:12:43.790
to journey across those roofshots, and to keep going
281
00:12:43.790 --> 00:12:45.170
when things get hard.
282
00:12:45.170 --> 00:12:49.980
And we don't flood people with resources and help
283
00:12:49.980 --> 00:12:52.190
at the beginning, because...
284
00:12:53.370 --> 00:12:54.300
This is hard for me to say
285
00:12:54.300 --> 00:12:57.360
as a senior executive leading technology innovation,
286
00:12:57.360 --> 00:13:00.880
but quite often, I don't have perfect knowledge
287
00:13:01.970 --> 00:13:05.480
of what will be the most impactful project that teams
288
00:13:05.480 --> 00:13:06.730
are working on.
289
00:13:06.730 --> 00:13:10.230
My job is to create an environment where people feel
290
00:13:10.230 --> 00:13:15.190
empowered, encouraged, and excited to try, and to try
291
00:13:15.190 --> 00:13:17.020
to demotivate them as little as possible.
292
00:13:17.020 --> 00:13:19.020
Because they'll find their way.
293
00:13:19.020 --> 00:13:20.490
They'll find their way to the roofshot,
294
00:13:20.490 --> 00:13:22.190
and then the next one, and then the next one.
295
00:13:22.190 --> 00:13:25.340
And then pretty soon, you're three years in,
296
00:13:25.340 --> 00:13:28.100
and I couldn't stop a project if I wanted to.
297
00:13:28.100 --> 00:13:30.060
It's going to happen because of that spirit,
298
00:13:30.060 --> 00:13:31.243
that voyager spirit.
299
00:13:32.670 --> 00:13:34.695
Tell us a bit more about your role at
300
00:13:34.695 --> 00:13:35.528
Google Cloud.
301
00:13:36.398 --> 00:13:37.780
I think I have the best job in the industry,
302
00:13:37.780 --> 00:13:42.780
which is, I get to lead a collective of CTOs who have come
303
00:13:43.700 --> 00:13:47.550
from every industry, and every geography,
304
00:13:47.550 --> 00:13:50.200
and every kind of place in the stack
305
00:13:50.200 --> 00:13:54.320
from hardware engineering, all the way up to SaaS,
306
00:13:54.320 --> 00:13:55.420
quantum security.
307
00:13:55.420 --> 00:13:57.240
And I get to lead this incredible team.
308
00:13:57.240 --> 00:14:01.280
And our mission is to create this bridge
309
00:14:01.280 --> 00:14:04.720
between our customers, our top customers, our top partners
310
00:14:04.720 --> 00:14:06.690
of Google who are trying to do incredible things
311
00:14:06.690 --> 00:14:09.000
with technology, and the people who are building
312
00:14:09.000 --> 00:14:10.600
these foundational platforms at Google
313
00:14:10.600 --> 00:14:11.720
and to try to harmonize them.
314
00:14:11.720 --> 00:14:15.570
Because with the evolution of Google now,
315
00:14:15.570 --> 00:14:19.850
especially with our cloud business, we have become a partner
316
00:14:19.850 --> 00:14:23.200
to many of the world's top organizations.
317
00:14:23.200 --> 00:14:26.620
And so for example, if Major League Baseball wants to create
318
00:14:26.620 --> 00:14:30.200
a new, immersive experience for you at home
319
00:14:30.200 --> 00:14:33.530
through a digital device, or eventually when we get back
320
00:14:33.530 --> 00:14:36.193
to it, more into the stadiums,
321
00:14:37.110 --> 00:14:39.940
it's not just us creating technology, surfacing it to them,
322
00:14:39.940 --> 00:14:42.230
them telling us what they like about it, and then sending
323
00:14:42.230 --> 00:14:44.330
it back and then we spin it.
324
00:14:44.330 --> 00:14:46.770
It's actually collaborative innovation.
325
00:14:46.770 --> 00:14:49.660
So we have these approaches to machine learning
326
00:14:49.660 --> 00:14:51.740
that we think could be pretty interesting.
327
00:14:51.740 --> 00:14:53.570
We have technologies in AR, VR,
328
00:14:53.570 --> 00:14:56.550
we have content delivery networks,
329
00:14:56.550 --> 00:14:58.460
we have all of these different platforms
330
00:14:58.460 --> 00:15:01.590
that we have at Google and in this exploratory mode,
331
00:15:01.590 --> 00:15:02.930
where we get together with these large customers,
332
00:15:02.930 --> 00:15:07.330
and they help guide not only the features, but they help us
333
00:15:07.330 --> 00:15:08.870
think about what we're going to build next.
334
00:15:08.870 --> 00:15:11.280
And then they end up layering on top
335
00:15:11.280 --> 00:15:13.320
of these foundational platforms, the experience
336
00:15:13.320 --> 00:15:16.160
that they want as Major League Baseball, to us
337
00:15:16.160 --> 00:15:18.290
as baseball fans.
338
00:15:18.290 --> 00:15:21.890
And that intertwined collaborative technology development
339
00:15:21.890 --> 00:15:23.670
is at the heart, and that collaborative innovation,
340
00:15:23.670 --> 00:15:27.400
that's at the heart of what we do here in the CTO group.
341
00:15:27.400 --> 00:15:29.430
That's a great example. Can you say a bit more
342
00:15:29.430 --> 00:15:33.140
about how you set the strategy for projects like that?
343
00:15:33.140 --> 00:15:37.800
I'm very, very bullish about having the CTO and the
344
00:15:37.800 --> 00:15:40.810
CIO at the top table in an organization.
345
00:15:40.810 --> 00:15:45.190
Because the CIO often is involved in the technology
346
00:15:45.190 --> 00:15:49.050
that a company uses for itself, for its own innovation.
347
00:15:49.050 --> 00:15:52.030
And I've often found that the tools and the collaboration,
348
00:15:52.030 --> 00:15:54.340
the culture that you have internally manifests itself
349
00:15:54.340 --> 00:15:56.630
in the technology that you build for others.
350
00:15:56.630 --> 00:16:00.290
So the CIO's perspective on how to collaborate, the tools,
351
00:16:00.290 --> 00:16:02.380
how people are working together, how they could be working
352
00:16:02.380 --> 00:16:05.250
together is just as important as the CTO's view
353
00:16:05.250 --> 00:16:09.080
into what technology could be most impactful,
354
00:16:09.080 --> 00:16:11.640
most disruptive, kind of coming from the outside in.
355
00:16:11.640 --> 00:16:14.240
But you also want them sitting next to the CMO.
356
00:16:14.240 --> 00:16:16.620
You want them sitting next to the chief revenue officer,
357
00:16:16.620 --> 00:16:19.580
you want them with the CEO and the CFO.
358
00:16:19.580 --> 00:16:22.540
And the reason is because it creates a tension.
359
00:16:22.540 --> 00:16:27.350
I would never advocate that all of my ideas are great.
360
00:16:27.350 --> 00:16:28.900
Some of them are.
361
00:16:28.900 --> 00:16:31.760
But some of them have panned out.
362
00:16:31.760 --> 00:16:35.000
And it's really important that that unfiltered tension
363
00:16:35.000 --> 00:16:37.150
is created to the point at which corporate strategy
364
00:16:37.150 --> 00:16:37.983
is delivered.
365
00:16:37.983 --> 00:16:39.860
In fact, this is one of the things I learned a lot
366
00:16:39.860 --> 00:16:43.670
from working for a couple CEOs, both outside of Google
367
00:16:43.670 --> 00:16:46.820
and here, is that it's a shared responsibility,
368
00:16:46.820 --> 00:16:49.590
the responsibility of the CTO to put themselves in the room
369
00:16:49.590 --> 00:16:50.870
to add that value.
370
00:16:50.870 --> 00:16:53.220
And it's the responsibility of the CEO to pull it
371
00:16:53.220 --> 00:16:55.990
through the organization when the mode of operation
372
00:16:55.990 --> 00:16:58.123
may not be that way today.
373
00:16:59.010 --> 00:17:00.270
Yeah, that's very true.
374
00:17:00.270 --> 00:17:02.960
And it corroborates our work, Sam, to a large extent
375
00:17:02.960 --> 00:17:06.280
that it's not just about building the layers of tech,
376
00:17:06.280 --> 00:17:07.570
it's about process change.
377
00:17:07.570 --> 00:17:09.580
It's about strategy alignment.
378
00:17:09.580 --> 00:17:13.240
And also, it's about ultimately what humans have to do
379
00:17:13.240 --> 00:17:14.410
differently,
380
00:17:14.410 --> 00:17:18.430
to work with AI, and to work with AI collaboratively.
381
00:17:18.430 --> 00:17:21.680
It's also about how managers and middle managers
382
00:17:21.680 --> 00:17:24.030
and the folks are using AI to be more productive,
383
00:17:24.030 --> 00:17:27.020
to be more precise, to be more innovative, more imaginative
384
00:17:27.020 --> 00:17:29.060
in their day-to-day work.
385
00:17:29.060 --> 00:17:30.730
Can you comment a bit on that in terms
386
00:17:30.730 --> 00:17:32.800
of how it could change the roles of individual
387
00:17:32.800 --> 00:17:33.750
employees?
388
00:17:33.750 --> 00:17:36.870
Let's say in different roles, whether it's in marketing,
389
00:17:36.870 --> 00:17:40.610
or in pricing, or customer service, any thoughts or ideas
390
00:17:40.610 --> 00:17:42.030
on that?
391
00:17:42.030 --> 00:17:45.560
We had an exercise like this with a large
392
00:17:45.560 --> 00:17:47.930
retail customer, with their kind of physical security
393
00:17:47.930 --> 00:17:51.180
and kind of monitoring organization.
394
00:17:51.180 --> 00:17:53.710
And it turns out
395
00:17:53.710 --> 00:17:57.220
that one of the most disruptive and interesting
396
00:17:57.220 --> 00:18:02.220
and impactful framings of that problem came from someone
397
00:18:02.440 --> 00:18:06.950
who was in a product team totally unrelated to this area
398
00:18:06.950 --> 00:18:09.260
that just got invited to this workshop
399
00:18:09.260 --> 00:18:12.233
as a representative of their org.
400
00:18:13.600 --> 00:18:17.390
So, we can't have everybody in every brainstorming session,
401
00:18:17.390 --> 00:18:19.990
even though the technology allows us to put a lot of people
402
00:18:19.990 --> 00:18:21.810
in one place at one time.
403
00:18:21.810 --> 00:18:26.270
But choosing who is in those moments is absolutely critical.
404
00:18:26.270 --> 00:18:28.680
And just going to default roles or going to default
405
00:18:28.680 --> 00:18:31.390
responsibilities is one way to just keep the same
406
00:18:31.390 --> 00:18:34.560
information coming back again and again and again.
407
00:18:34.560 --> 00:18:35.720
That's certainly something we're thinking
408
00:18:35.720 --> 00:18:38.760
about at a humanities-based university, that blend
409
00:18:38.760 --> 00:18:40.270
and that role of people.
410
00:18:40.270 --> 00:18:42.430
It's interesting to me that in all your examples,
411
00:18:42.430 --> 00:18:44.700
you've talked about joining people and people from cross-
412
00:18:44.700 --> 00:18:47.600
functional teams. You've never mentioned a machine
413
00:18:47.600 --> 00:18:50.070
as one of these roles, or a player.
414
00:18:50.070 --> 00:18:52.130
Is that too far-fetched?
415
00:18:52.130 --> 00:18:56.020
How are these combinations of humans going to add
416
00:18:56.020 --> 00:18:58.940
the machine into the mix here?
417
00:18:58.940 --> 00:19:00.820
We've got a lot of learning from machines.
418
00:19:00.820 --> 00:19:04.340
And I think, certainly at the task level, at
419
00:19:04.340 --> 00:19:08.330
what point does it get elevated to more of a strategic level?
420
00:19:08.330 --> 00:19:09.440
Is that too far away?
421
00:19:09.440 --> 00:19:11.000
No, I don't think so.
422
00:19:11.000 --> 00:19:13.380
But it's certainly in its early days, and one of the ways
423
00:19:13.380 --> 00:19:16.750
you can see this manifest is natural language processing,
424
00:19:16.750 --> 00:19:18.000
for example.
425
00:19:18.000 --> 00:19:22.319
I remember one project we had, we were training a chatbot.
426
00:19:22.319 --> 00:19:26.440
And it turned out we used raw logs, all privacy assured
427
00:19:26.440 --> 00:19:28.900
and everything, but we used these logs the customer
428
00:19:28.900 --> 00:19:31.510
had provided because they wanted to see if we could build
429
00:19:31.510 --> 00:19:32.830
a better model.
430
00:19:32.830 --> 00:19:35.930
And it turns out that the chat agent wasn't exactly
431
00:19:35.930 --> 00:19:38.290
speaking the way we'd want another human being to speak
432
00:19:38.290 --> 00:19:39.150
to us.
433
00:19:39.150 --> 00:19:39.983
And why?
434
00:19:39.983 --> 00:19:43.440
Because people get pretty upset when they're talking
435
00:19:43.440 --> 00:19:45.770
to customer support.
436
00:19:45.770 --> 00:19:48.260
And the language that they use isn't necessarily language
437
00:19:48.260 --> 00:19:52.240
I think we would use with each other on this podcast.
438
00:19:52.240 --> 00:19:57.240
And so, we do think that machines will be able to offer
439
00:19:57.260 --> 00:20:01.240
some interesting kind of response, inputs,
440
00:20:01.240 --> 00:20:03.200
generalized inputs at some point.
441
00:20:03.200 --> 00:20:06.110
But I can tell you right now, you want to be really careful
442
00:20:06.110 --> 00:20:09.540
about letting loose a natural language-enabled partner
443
00:20:09.540 --> 00:20:12.220
that is a machine inside of your creativity and innovation
444
00:20:12.220 --> 00:20:15.260
session, because you may not hear things that you like.
445
00:20:15.260 --> 00:20:16.800
Well, it seems like there's a role here too,
446
00:20:16.800 --> 00:20:20.100
that I don't know, these machines, there's going to be bias
447
00:20:20.100 --> 00:20:20.933
in these things.
448
00:20:20.933 --> 00:20:21.860
This is inevitable.
449
00:20:21.860 --> 00:20:25.290
And in some sense, I'm often happy to see biased decisions
450
00:20:25.290 --> 00:20:28.230
coming out of these AI and ML systems,
451
00:20:28.230 --> 00:20:30.500
because then it's at least surfaced.
452
00:20:30.500 --> 00:20:33.720
We've got a lot of that unconsciously going on in our world
453
00:20:33.720 --> 00:20:34.553
right now.
454
00:20:34.553 --> 00:20:36.220
And if one of the things that we're learning
455
00:20:36.220 --> 00:20:41.070
is that the machines are pointing out how ugly we're talking
456
00:20:41.070 --> 00:20:45.000
to chatbots, or how poorly we're making other decisions,
457
00:20:45.000 --> 00:20:48.720
then that may be step one to improving overall.
458
00:20:48.720 --> 00:20:52.790
Yeah, the responsible AI push, it's never over.
459
00:20:52.790 --> 00:20:55.830
It's one of those things where ensuring responsible
460
00:20:55.830 --> 00:20:59.460
and ethical practices requires a focus across
461
00:20:59.460 --> 00:21:01.023
the entire activity chain.
462
00:21:01.870 --> 00:21:05.540
And one area that we've seen as really impactful is
463
00:21:05.540 --> 00:21:08.770
when you can focus on principles as an organization.
464
00:21:08.770 --> 00:21:12.540
So, what are the principles through which you will take
465
00:21:12.540 --> 00:21:15.853
your projects, and shine the light on them,
466
00:21:16.940 --> 00:21:20.850
and examine them, and think about the ramifications?
467
00:21:20.850 --> 00:21:24.580
Because you can't a priori define all of the potential
468
00:21:24.580 --> 00:21:28.640
outputs that machine learning and AI may generate.
469
00:21:28.640 --> 00:21:31.210
That's why I referred to it as a journey.
470
00:21:31.210 --> 00:21:34.910
And I'm not sure that there is a final destination.
471
00:21:34.910 --> 00:21:37.420
I think it's one that is a constant.
472
00:21:37.420 --> 00:21:39.970
And kind of in the theme of a lot of what we talked
473
00:21:39.970 --> 00:21:41.750
about today is, it's iterative.
474
00:21:41.750 --> 00:21:43.960
You think about how you want to approach it,
475
00:21:43.960 --> 00:21:45.880
you have principles, you have governance,
476
00:21:45.880 --> 00:21:48.150
and then you see what happens.
477
00:21:48.150 --> 00:21:50.480
And then you make the adjustments along the way.
478
00:21:50.480 --> 00:21:52.810
But not having that foundation means you're dealing
479
00:21:52.810 --> 00:21:57.070
with every single instance as its own unique instance.
480
00:21:57.070 --> 00:21:59.140
And that becomes untenable at scale.
481
00:21:59.140 --> 00:22:01.730
Even smaller, this isn't just a Google-scale thing.
482
00:22:01.730 --> 00:22:04.960
Any company that wants to distinguish itself
483
00:22:04.960 --> 00:22:07.970
with AI, at any type of scale,
484
00:22:07.970 --> 00:22:09.780
is going to bump into that.
485
00:22:09.780 --> 00:22:11.690
Well, we really appreciate you taking the time
486
00:22:11.690 --> 00:22:12.930
to talk with us today.
487
00:22:12.930 --> 00:22:13.840
It's been fabulous.
488
00:22:13.840 --> 00:22:14.830
We've learned so much.
489
00:22:14.830 --> 00:22:15.663
Really, really an insightful and
490
00:22:15.663 --> 00:22:16.496
candid conversation.
491
00:22:16.496 --> 00:22:18.810
We really appreciate it.
492
00:22:18.810 --> 00:22:20.200
Oh, absolutely my pleasure.
493
00:22:20.200 --> 00:22:21.473
Thanks for having me.
494
00:22:21.473 --> 00:22:24.056
(upbeat music)
495
00:22:26.650 --> 00:22:27.483
Sam, I thought that was a really good
496
00:22:27.483 --> 00:22:28.316
conversation.
497
00:22:29.160 --> 00:22:30.920
We've been talking with Will Grannis,
498
00:22:30.920 --> 00:22:33.900
Founder and Leader of the Office of the CTO
499
00:22:33.900 --> 00:22:34.900
at Google Cloud.
500
00:22:35.910 --> 00:22:37.860
Well, we may have lost some listeners with him saying
501
00:22:37.860 --> 00:22:41.150
that you don't need ML as item one on his checklist,
502
00:22:41.150 --> 00:22:44.730
but I think he had 71 other items on his checklist
503
00:22:44.730 --> 00:22:47.230
that do involve machine learning.
504
00:22:47.230 --> 00:22:50.040
But I thought he was making a really important
505
00:22:50.040 --> 00:22:53.970
point: don't get hung up on the technology
506
00:22:53.970 --> 00:22:58.090
and the feature functionality, and think about
507
00:22:58.090 --> 00:23:00.343
the business problem,
508
00:23:02.574 --> 00:23:03.407
and the impact, and shoot
509
00:23:03.407 --> 00:23:06.960
really, really big for the impact.
510
00:23:06.960 --> 00:23:10.920
And then also, don't think that you have to achieve
511
00:23:10.920 --> 00:23:14.570
the moonshot in one jump; you can get
512
00:23:14.570 --> 00:23:17.040
there in progressive jumps.
513
00:23:17.040 --> 00:23:18.980
But you'll always have to keep your eye on the moon,
514
00:23:18.980 --> 00:23:21.163
which I think is really, really insightful.
515
00:23:22.000 --> 00:23:24.320
That's a great way of putting it because I do think
516
00:23:24.320 --> 00:23:29.140
we got focused on thinking about the 10x, and we maybe
517
00:23:29.140 --> 00:23:31.660
paid less attention to his number one, which was the
518
00:23:31.660 --> 00:23:32.493
focus on the user and the problem.
519
00:23:32.493 --> 00:23:34.640
The other thing I thought was an important
520
00:23:34.640 --> 00:23:37.190
point is the point about collaboration.
521
00:23:37.190 --> 00:23:38.740
I think it's really an overused term,
522
00:23:38.740 --> 00:23:42.210
because in every organization, every team would say yes,
523
00:23:42.210 --> 00:23:44.310
yes, we're completely collaborative,
524
00:23:44.310 --> 00:23:45.950
everybody's collaborating, they're keeping
525
00:23:45.950 --> 00:23:47.090
each other informed.
526
00:23:47.090 --> 00:23:49.690
But I think the true meaning of what Will was talking
527
00:23:49.690 --> 00:23:51.136
about is beyond that.
528
00:23:51.136 --> 00:23:54.290
There are multiple meanings to collaboration; you could say,
529
00:23:54.290 --> 00:23:56.430
as long as I'm keeping people informed or sending them
530
00:23:56.430 --> 00:23:58.160
documents, that I'm collaborating.
531
00:23:58.160 --> 00:24:01.770
But what he said is, there's not a single person on my team
532
00:24:01.770 --> 00:24:04.260
that can succeed on his or her own.
533
00:24:04.260 --> 00:24:06.350
And that's a different kind of collaboration; it
534
00:24:06.350 --> 00:24:09.420
actually means you're so interlinked with the rest
535
00:24:09.420 --> 00:24:12.470
of your team that your own outcome and output
536
00:24:12.470 --> 00:24:15.710
depends on everybody else's work, so you can't succeed
537
00:24:15.710 --> 00:24:18.100
without them, and they can't succeed without you.
538
00:24:18.100 --> 00:24:19.680
It's really beyond collaboration.
539
00:24:19.680 --> 00:24:23.520
It's like the team is an amalgam of all the people
540
00:24:23.520 --> 00:24:25.410
and they're all embedded in each other
541
00:24:25.410 --> 00:24:26.980
as just one substance.
542
00:24:26.980 --> 00:24:28.253
What's the chemical term for that?
543
00:24:28.253 --> 00:24:29.086
(laughing) Yes, I knew you were going to make a
544
00:24:29.086 --> 00:24:30.570
chemical reference there.
545
00:24:30.570 --> 00:24:31.403
There we go, amalgam.
546
00:24:31.403 --> 00:24:32.830
"Ah-mal-gum" or "am-all-gam"?
547
00:24:32.830 --> 00:24:34.610
I should know this as a chemical reference.
548
00:24:34.610 --> 00:24:36.400
Exactly, we're not going to be tested on it.
549
00:24:36.400 --> 00:24:37.233
I hope my Caltech colleagues aren't
550
00:24:37.233 --> 00:24:38.066
listening to this.
551
00:24:39.272 --> 00:24:41.930
(laughing)
552
00:24:41.930 --> 00:24:43.530
Yeah, actually, the collaboration thing.
553
00:24:43.530 --> 00:24:45.740
It's easy to espouse collaboration.
554
00:24:45.740 --> 00:24:48.150
If you think about it, nobody we interview is going to say,
555
00:24:48.150 --> 00:24:50.470
all right, I really think people should not collaborate.
556
00:24:50.470 --> 00:24:52.140
I mean, just no one's going to take that.
557
00:24:52.140 --> 00:24:54.740
But what's different about what he said is they had process
558
00:24:54.740 --> 00:24:55.640
around it.
559
00:24:55.640 --> 00:24:58.910
And they had what sounded like infrastructure and incentives
560
00:24:58.910 --> 00:25:03.233
so that people were incentivized to align well.
561
00:25:04.410 --> 00:25:06.460
I like the gaming analog.
562
00:25:06.460 --> 00:25:09.870
The objective function in the game,
563
00:25:09.870 --> 00:25:14.870
whether it's adversarial, or you're trying to beat or coerce
564
00:25:15.020 --> 00:25:19.600
or unlock some hidden prize somewhere,
565
00:25:19.600 --> 00:25:24.600
there is some kind of an optimization or simulation
566
00:25:25.280 --> 00:25:30.080
or approximation or correlation going on in these games.
567
00:25:30.080 --> 00:25:34.320
And so the analog of that to a business problem
568
00:25:34.320 --> 00:25:37.750
rests so heavily on the very definition
569
00:25:37.750 --> 00:25:39.630
of the objective function.
570
00:25:39.630 --> 00:25:42.320
Yeah, I thought the twist that he put on games
571
00:25:42.320 --> 00:25:45.330
was important because he did pull out immediately
572
00:25:45.330 --> 00:25:47.740
that we can think about these as games.
573
00:25:47.740 --> 00:25:48.810
What do we learn from games?
574
00:25:48.810 --> 00:25:51.680
We've learned from games that we need an objective,
575
00:25:51.680 --> 00:25:54.400
we need the structure, we need to define the problem.
576
00:25:54.400 --> 00:25:57.490
And he tied that really well into the transition
577
00:25:57.490 --> 00:26:00.900
from what we think of as super well-defined games
578
00:26:00.900 --> 00:26:04.070
of perfect information to unstructured problems.
579
00:26:04.070 --> 00:26:06.370
It still needs that problem definition.
580
00:26:06.370 --> 00:26:07.530
I thought that was a good switch.
581
00:26:07.530 --> 00:26:08.620
That's right.
582
00:26:08.620 --> 00:26:09.950
(upbeat music)
583
00:26:09.950 --> 00:26:11.920
Will brought out the importance of having good data
584
00:26:11.920 --> 00:26:13.580
for ML to work.
585
00:26:13.580 --> 00:26:15.760
He also highlighted how Google Cloud collaborates,
586
00:26:15.760 --> 00:26:18.053
both internally and with external customers.
587
00:26:18.980 --> 00:26:20.860
Next time, we'll talk with Amit Shah,
588
00:26:20.860 --> 00:26:23.700
President of 1-800-Flowers, about the unique collaboration
589
00:26:23.700 --> 00:26:27.250
challenges that he uses AI to address on his platform.
590
00:26:27.250 --> 00:26:28.500
Please join us next time.
591
00:26:29.960 --> 00:26:32.710
Thanks for listening to Me, Myself, and AI.
592
00:26:32.710 --> 00:26:35.360
If you're enjoying the show, take a minute to write us
593
00:26:35.360 --> 00:26:36.460
a review.
594
00:26:36.460 --> 00:26:39.310
If you send us a screenshot, we'll send you a collection
595
00:26:39.310 --> 00:26:43.060
of MIT SMR's best articles on artificial intelligence
596
00:26:43.060 --> 00:26:44.930
free for a limited time.
597
00:26:44.930 --> 00:26:49.930
Send your review screenshot to smrfeedback@mit.edu.
598
00:26:49.975 --> 00:26:52.558
(upbeat music)