WEBVTT
1
00:00:02.150 --> 00:00:04.110
What role did artificial intelligence have
2
00:00:04.110 --> 00:00:07.180
in helping combat the coronavirus pandemic?
3
00:00:07.180 --> 00:00:09.083
Find out today when we talk with an innovative company
4
00:00:09.083 --> 00:00:11.060
that used artificial intelligence
5
00:00:11.060 --> 00:00:12.870
to help solve the critical problems
6
00:00:12.870 --> 00:00:14.623
society faced in the last year.
7
00:00:15.970 --> 00:00:18.040
Welcome to "Me, Myself, and AI,"
8
00:00:18.040 --> 00:00:21.220
a podcast on artificial intelligence in business.
9
00:00:21.220 --> 00:00:22.180
Each episode,
10
00:00:22.180 --> 00:00:25.080
we introduce you to someone innovating with AI.
11
00:00:25.080 --> 00:00:28.010
I'm Sam Ransbotham, Professor of Information Systems
12
00:00:28.010 --> 00:00:29.630
at Boston College.
13
00:00:29.630 --> 00:00:31.050
I'm also the guest editor
14
00:00:31.050 --> 00:00:33.910
for the AI and Business Strategy Big Idea program
15
00:00:33.910 --> 00:00:36.710
at MIT Sloan Management Review.
16
00:00:36.710 --> 00:00:38.830
And I'm Shervin Khodabandeh,
17
00:00:38.830 --> 00:00:40.600
Senior Partner with BCG,
18
00:00:40.600 --> 00:00:44.400
and I co-lead BCG's AI practice in North America.
19
00:00:44.400 --> 00:00:48.480
And together MIT SMR and BCG have been researching AI
20
00:00:48.480 --> 00:00:49.920
for five years,
21
00:00:49.920 --> 00:00:52.080
interviewing hundreds of practitioners
22
00:00:52.080 --> 00:00:54.110
and surveying thousands of companies
23
00:00:54.110 --> 00:00:56.030
on what it takes to build
24
00:00:56.030 --> 00:00:58.740
and to deploy and scale AI capabilities
25
00:00:58.740 --> 00:01:00.100
across the organization,
26
00:01:00.100 --> 00:01:03.347
and really transform the way organizations operate.
27
00:01:03.347 --> 00:01:05.870
(upbeat music)
28
00:01:05.870 --> 00:01:07.620
Today we're talking with Dave Johnson,
29
00:01:07.620 --> 00:01:11.390
Chief Data and Artificial Intelligence Officer at Moderna.
30
00:01:11.390 --> 00:01:12.350
Dave, thanks for joining us.
31
00:01:12.350 --> 00:01:13.210
Welcome.
32
00:01:13.210 --> 00:01:15.010
Thanks guys for having me.
33
00:01:15.010 --> 00:01:18.150
Can you describe your current role at Moderna?
34
00:01:18.150 --> 00:01:20.920
I'm Chief Data and AI Officer at Moderna.
35
00:01:20.920 --> 00:01:21.753
In my role,
36
00:01:21.753 --> 00:01:24.190
I'm responsible for all of our enterprise data functions
37
00:01:24.190 --> 00:01:27.150
from data engineering to data science integration.
38
00:01:27.150 --> 00:01:29.660
And I also manage a software engineering team building,
39
00:01:29.660 --> 00:01:30.493
you know,
40
00:01:30.493 --> 00:01:31.420
unique custom applications
41
00:01:31.420 --> 00:01:33.680
to curate and create new data sets,
42
00:01:33.680 --> 00:01:36.290
but also to then take those AI models that are created
43
00:01:36.290 --> 00:01:38.370
and build them into processes.
44
00:01:38.370 --> 00:01:39.840
So it's kind of end-to-end
45
00:01:39.840 --> 00:01:41.970
everything it takes
46
00:01:41.970 --> 00:01:45.150
to build, deploy, and put an AI model into production.
47
00:01:45.150 --> 00:01:46.170
How did you end up in that role?
48
00:01:46.170 --> 00:01:48.355
I know you have physics in your background.
49
00:01:48.355 --> 00:01:50.870
That's not... I didn't hear any physics
50
00:01:50.870 --> 00:01:52.500
in what you just said.
51
00:01:52.500 --> 00:01:53.890
Yeah, no, it's a good point.
52
00:01:53.890 --> 00:01:57.140
So I have my PhD in what's called Information Physics,
53
00:01:57.140 --> 00:01:59.652
which is a field closely related to data science actually.
54
00:01:59.652 --> 00:02:03.310
It's about the foundations of Bayesian statistics
55
00:02:03.310 --> 00:02:04.700
and information theory,
56
00:02:04.700 --> 00:02:07.460
a lot of what is involved in data science.
57
00:02:07.460 --> 00:02:10.160
My particular research was in applying that
58
00:02:10.160 --> 00:02:13.050
to a framework that derives quantum mechanics
59
00:02:13.050 --> 00:02:15.370
from the rules of information theory.
60
00:02:15.370 --> 00:02:17.670
So that part, you're right, is not particularly relevant
61
00:02:17.670 --> 00:02:19.110
to my day-to-day job,
62
00:02:19.110 --> 00:02:21.650
but the information theory part and the Bayesian stats are
63
00:02:21.650 --> 00:02:24.110
completely on target for what I do.
64
00:02:24.110 --> 00:02:24.943
In addition to that,
65
00:02:24.943 --> 00:02:27.230
I spent many years doing independent consulting
66
00:02:27.230 --> 00:02:30.210
in kind of a software engineering and data science capacity.
67
00:02:30.210 --> 00:02:31.830
And when I finished my PhD, you know,
68
00:02:31.830 --> 00:02:34.120
I realized academia wasn't really for me,
69
00:02:34.120 --> 00:02:36.250
I wanted to do applications.
70
00:02:36.250 --> 00:02:37.770
And I ended up with a consulting firm
71
00:02:37.770 --> 00:02:40.090
doing work for large pharmaceutical companies.
72
00:02:40.090 --> 00:02:42.350
So I spent a number of years doing that
73
00:02:42.350 --> 00:02:43.900
and it turned out to be a really great marriage
74
00:02:43.900 --> 00:02:45.040
of my skill sets, you know,
75
00:02:45.040 --> 00:02:46.980
understanding of science, understanding of data,
76
00:02:46.980 --> 00:02:49.400
understanding of, you know, software engineering.
77
00:02:49.400 --> 00:02:52.210
And so I did one project in particular for a number of years
78
00:02:52.210 --> 00:02:54.970
in research at a pharmaceutical company
79
00:02:54.970 --> 00:02:57.110
around capturing data, you know,
80
00:02:57.110 --> 00:03:00.660
in a structured, useful way in the preclinical space
81
00:03:00.660 --> 00:03:03.010
in order to feed into advanced data
82
00:03:03.010 --> 00:03:04.460
and advanced models.
83
00:03:04.460 --> 00:03:06.670
So very much what I'm doing today.
84
00:03:06.670 --> 00:03:09.640
And about seven years ago, I moved over to Moderna.
85
00:03:09.640 --> 00:03:10.830
And at that time we were, you know,
86
00:03:10.830 --> 00:03:12.280
a preclinical stage company.
87
00:03:12.280 --> 00:03:14.727
And the big challenge we had was producing
88
00:03:14.727 --> 00:03:18.420
enough small-scale mRNA to run our experiments.
89
00:03:18.420 --> 00:03:19.970
And what we're really trying to do
90
00:03:19.970 --> 00:03:22.000
is accelerate the pace of research,
91
00:03:22.000 --> 00:03:24.100
so that we can get as many drugs in the clinic
92
00:03:24.100 --> 00:03:25.850
as quickly as possible.
93
00:03:25.850 --> 00:03:27.833
And one of the big bottlenecks was having
94
00:03:27.833 --> 00:03:30.940
this mRNA for the scientists to run tests with.
95
00:03:30.940 --> 00:03:33.050
And so what we did is we put in place a ton
96
00:03:33.050 --> 00:03:34.780
of robotic automation,
97
00:03:34.780 --> 00:03:36.530
put in place a lot of digital systems
98
00:03:36.530 --> 00:03:39.780
and process automation and AI algorithms as well.
99
00:03:39.780 --> 00:03:42.370
And we went from maybe, you know,
100
00:03:42.370 --> 00:03:45.640
about 30 mRNAs manually produced in a given month,
101
00:03:45.640 --> 00:03:48.430
to a capacity of about a thousand a month.
102
00:03:48.430 --> 00:03:49.320
So without
103
00:03:49.320 --> 00:03:50.153
you know,
104
00:03:50.153 --> 00:03:51.560
significantly more resources
105
00:03:51.560 --> 00:03:54.400
and with much better consistency in quality and so on.
106
00:03:54.400 --> 00:03:56.470
And so from there I just kind of grew with the company
107
00:03:56.470 --> 00:03:58.320
and grew into this role that I have now
108
00:03:58.320 --> 00:04:00.250
where I'm applying those same ideas
109
00:04:00.250 --> 00:04:02.310
to the broader enterprise.
110
00:04:02.310 --> 00:04:03.560
That's great Dave,
111
00:04:03.560 --> 00:04:08.070
can you comment a bit on the spectrum of use cases
112
00:04:08.070 --> 00:04:10.610
that AI is being applied to here
113
00:04:10.610 --> 00:04:13.240
and is it really making a difference?
114
00:04:13.240 --> 00:04:14.620
For us, what we've seen
115
00:04:14.620 --> 00:04:16.360
a lot of is in the research space
116
00:04:16.360 --> 00:04:17.790
and particularly at Moderna,
117
00:04:17.790 --> 00:04:20.700
that's because that's where we digitized early.
118
00:04:20.700 --> 00:04:23.950
We see that putting in digital systems and processes
119
00:04:23.950 --> 00:04:26.990
to actually capture, you know, good, homogeneous data
120
00:04:26.990 --> 00:04:28.500
that can feed into that
121
00:04:28.500 --> 00:04:30.690
is obviously a really important first step,
122
00:04:30.690 --> 00:04:33.070
but it also lays the foundation of processes
123
00:04:33.070 --> 00:04:35.120
that are then amenable to these greater degrees
124
00:04:35.120 --> 00:04:36.310
of automation.
125
00:04:36.310 --> 00:04:39.490
So that's where we've seen a lot of the value, you know:
126
00:04:39.490 --> 00:04:40.750
in this preclinical production,
127
00:04:40.750 --> 00:04:41.950
we have kind of high throughput.
128
00:04:41.950 --> 00:04:42.980
We have lots of data.
129
00:04:42.980 --> 00:04:45.250
We're able to start automating those steps
130
00:04:45.250 --> 00:04:48.180
and judgments that were previously done by humans.
131
00:04:48.180 --> 00:04:51.800
So one example is our mRNA sequence design.
132
00:04:51.800 --> 00:04:53.090
We're coding for some protein,
133
00:04:53.090 --> 00:04:54.970
which is an amino acid sequence,
134
00:04:54.970 --> 00:04:56.440
but there's a huge degeneracy
135
00:04:56.440 --> 00:04:59.500
of potential nucleotide sequences that could code for that.
136
00:04:59.500 --> 00:05:01.510
And so starting from an amino acid sequence,
137
00:05:01.510 --> 00:05:04.230
you have to figure out what's the ideal way to get there,
138
00:05:04.230 --> 00:05:05.170
right?
139
00:05:05.170 --> 00:05:06.470
And so what we have is algorithms
140
00:05:06.470 --> 00:05:07.970
that can do that translation
141
00:05:07.970 --> 00:05:09.350
in an optimal way.
142
00:05:09.350 --> 00:05:10.910
And then we have algorithms that can take one
143
00:05:10.910 --> 00:05:12.370
and then optimize it even further
144
00:05:12.370 --> 00:05:13.850
to make it better for production
145
00:05:13.850 --> 00:05:15.390
or to avoid things that we know
146
00:05:15.390 --> 00:05:19.450
are bad for this mRNA in production or for expression.
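NOTE
What Dave describes here is essentially codon optimization: many synonymous nucleotide sequences encode the same protein, and an algorithm picks one that is better for production or expression. Below is a minimal Python sketch of the idea, assuming a toy codon-usage table and a made-up bad-motif list; it is illustrative only, not Moderna's actual algorithm.
# Toy codon table: amino acid -> candidate codons with relative usage
# weights. The weights and the "bad" motif below are invented examples.
CODONS = {
    "M": {"ATG": 1.0},
    "K": {"AAG": 0.58, "AAA": 0.42},
    "L": {"CTG": 0.41, "CTC": 0.20, "CTT": 0.13,
          "TTG": 0.13, "CTA": 0.07, "TTA": 0.07},
}
BAD_MOTIFS = ("TTTT",)  # hypothetical motif to avoid in the final mRNA
def design_sequence(protein: str) -> str:
    """Greedily pick the highest-usage codon that avoids bad motifs."""
    seq = ""
    for aa in protein:
        ranked = sorted(CODONS[aa].items(), key=lambda kv: -kv[1])
        for codon, _weight in ranked:
            if not any(m in seq + codon for m in BAD_MOTIFS):
                seq += codon
                break
        else:  # every option creates a bad motif; take the most common
            seq += ranked[0][0]
    return seq
print(design_sequence("MKL"))  # -> ATGAAGCTG with this toy table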
147
00:05:19.450 --> 00:05:21.880
And so we can integrate those into these live systems
148
00:05:21.880 --> 00:05:22.713
that we have
149
00:05:22.713 --> 00:05:24.130
so that scientists just press a button
150
00:05:24.130 --> 00:05:25.420
and the work is done for them,
151
00:05:25.420 --> 00:05:27.080
and they don't know what's going on behind the scenes,
152
00:05:27.080 --> 00:05:27.913
but then poof,
153
00:05:27.913 --> 00:05:30.471
the outcome is this better sequence for them.
154
00:05:30.471 --> 00:05:33.130
And then we've seen it with quality control steps as well.
155
00:05:33.130 --> 00:05:34.920
We're also doing some work right now
156
00:05:34.920 --> 00:05:38.180
with our clinical partners in the clinical operations space,
157
00:05:38.180 --> 00:05:40.810
in terms of like optimal trial planning.
158
00:05:40.810 --> 00:05:42.700
We're doing some work right now
159
00:05:42.700 --> 00:05:44.840
around our call center planning
160
00:05:44.840 --> 00:05:46.840
now that we're rolling our vaccine out
161
00:05:46.840 --> 00:05:48.250
across the whole world.
162
00:05:48.250 --> 00:05:49.660
More and more phone calls are coming in.
163
00:05:49.660 --> 00:05:51.300
And as we look to launching in new countries,
164
00:05:51.300 --> 00:05:53.410
we have to start planning our resources for that.
165
00:05:53.410 --> 00:05:54.950
So we're looking at machine learning models
166
00:05:54.950 --> 00:05:58.040
to help forecast these calls,
167
00:05:58.040 --> 00:06:00.110
so we can then staff up appropriately.
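NOTE
The forecasting he mentions can be sketched as a small autoregressive model: train a regressor on lagged daily call counts, then roll it forward over a staffing horizon. The synthetic data, seven-day lag window, and two-week horizon below are assumptions for illustration, not Moderna's actual model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
rng = np.random.default_rng(0)
# Fake history: a rising trend plus noise, standing in for daily call counts.
calls = 200 + 0.8 * np.arange(365) + rng.normal(0, 15, 365)
LAGS = 7  # predict today's volume from the previous seven days
X = np.array([calls[i - LAGS:i] for i in range(LAGS, len(calls))])
y = calls[LAGS:]
model = GradientBoostingRegressor(random_state=0).fit(X, y)
window = list(calls[-LAGS:])
forecast = []
for _ in range(14):  # two-week horizon for staffing decisions
    nxt = float(model.predict([window])[0])
    forecast.append(nxt)
    window = window[1:] + [nxt]  # roll the window forward
print([round(v) for v in forecast])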
168
00:06:00.110 --> 00:06:01.720
So we just see it across, you know,
169
00:06:01.720 --> 00:06:03.670
a variety of different areas.
170
00:06:03.670 --> 00:06:05.100
You mentioned pressing the button,
171
00:06:05.100 --> 00:06:08.140
your scientists press the button and some trials happen.
172
00:06:08.140 --> 00:06:09.670
So what are these scientists thinking?
173
00:06:09.670 --> 00:06:12.240
I mean, you've suddenly taken away something that used to be
174
00:06:12.240 --> 00:06:13.520
something that they did
175
00:06:13.520 --> 00:06:16.270
and you're having AI do it.
176
00:06:16.270 --> 00:06:18.680
What's the reaction? Are they thrilled?
177
00:06:18.680 --> 00:06:22.490
Are they despondent? You know, or somewhere in between?
178
00:06:22.490 --> 00:06:25.130
I would say closer to the thrilled side,
179
00:06:25.130 --> 00:06:26.200
is usually how it works.
180
00:06:26.200 --> 00:06:27.180
You know, we're a company
181
00:06:27.180 --> 00:06:30.320
that believes in giving people a lot of responsibility
182
00:06:30.320 --> 00:06:32.360
and, you know, people work really hard
183
00:06:32.360 --> 00:06:33.210
and what that leads to
184
00:06:33.210 --> 00:06:35.140
is people doing a lot of work, right?
185
00:06:35.140 --> 00:06:38.280
And so what often happens is folks will come to us and say,
186
00:06:38.280 --> 00:06:40.240
"Look, I'm doing this activity over and over.
187
00:06:40.240 --> 00:06:43.840
I would really love some help to automate this process."
188
00:06:43.840 --> 00:06:45.400
And so in that case they're thrilled, right?
189
00:06:45.400 --> 00:06:46.290
They don't want to be,
190
00:06:46.290 --> 00:06:48.360
you know, looking at some screen of data
191
00:06:48.360 --> 00:06:50.050
over and over and over again.
192
00:06:50.050 --> 00:06:53.140
They want to be doing something insightful and creative.
193
00:06:53.140 --> 00:06:54.900
And so that's where we really partner with them
194
00:06:54.900 --> 00:06:57.100
and take off that component of what they do.
195
00:06:58.050 --> 00:06:59.550
Dave, I want to build on that,
196
00:06:59.550 --> 00:07:01.150
because I think you're putting your finger
197
00:07:01.150 --> 00:07:02.360
on something quite interesting.
198
00:07:02.360 --> 00:07:05.332
And in addition to the financial impact
199
00:07:05.332 --> 00:07:07.820
that many get from AI,
200
00:07:07.820 --> 00:07:10.520
productivity, efficiency, and all of that,
201
00:07:10.520 --> 00:07:12.550
you talked about some of those, Dave.
202
00:07:12.550 --> 00:07:16.080
There is an impact in overall organizational culture
203
00:07:16.080 --> 00:07:19.970
and, you know, teams being more collaborative,
204
00:07:19.970 --> 00:07:22.050
higher morale, happier,
205
00:07:22.050 --> 00:07:24.480
more confident, et cetera.
206
00:07:24.480 --> 00:07:25.910
Are those some of the things
207
00:07:25.910 --> 00:07:27.650
that you guys are seeing as well?
208
00:07:27.650 --> 00:07:29.190
Yeah, for sure.
209
00:07:29.190 --> 00:07:30.460
I think one of the surest signs of that
210
00:07:30.460 --> 00:07:32.650
is we get a lot of repeat customers.
211
00:07:32.650 --> 00:07:35.100
You know, if we do some particular algorithm for somebody,
212
00:07:35.100 --> 00:07:37.400
that person comes back with the next one or,
213
00:07:37.400 --> 00:07:40.120
you know, their team comes back time and time again.
214
00:07:40.120 --> 00:07:43.610
We don't think about AI in the context of replacing humans.
215
00:07:43.610 --> 00:07:45.850
Like we always think about it in terms of
216
00:07:45.850 --> 00:07:48.190
this human-machine collaboration,
217
00:07:48.190 --> 00:07:49.700
because they're good at different things.
218
00:07:49.700 --> 00:07:50.533
You know,
219
00:07:50.533 --> 00:07:52.810
humans are really good at creativity and flexibility
220
00:07:52.810 --> 00:07:53.900
and insight,
221
00:07:53.900 --> 00:07:55.860
whereas machines are really good at, you know,
222
00:07:55.860 --> 00:07:58.223
precision and giving you the exact same result
223
00:07:58.223 --> 00:08:01.400
every single time and doing it at scale and speed.
224
00:08:01.400 --> 00:08:03.350
What we find is that the most successful projects
225
00:08:03.350 --> 00:08:05.440
are where we kind of put the two together,
226
00:08:05.440 --> 00:08:07.990
have the machine do the parts of the job that it's good at.
227
00:08:07.990 --> 00:08:10.520
Let the humans take over for the rest of that.
228
00:08:10.520 --> 00:08:13.300
So with this freedom, what have people done?
229
00:08:13.300 --> 00:08:15.275
You know, you've opened up this time,
230
00:08:15.275 --> 00:08:16.108
(Shervin chuckles)
231
00:08:16.108 --> 00:08:18.524
What kind of new, you know...
232
00:08:18.524 --> 00:08:20.149
I got two shots of that.
233
00:08:20.149 --> 00:08:21.212
(all laugh)
234
00:08:21.212 --> 00:08:22.045
What people have done...
235
00:08:22.045 --> 00:08:23.200
Yeah, actually, there's at least one product
236
00:08:23.200 --> 00:08:24.524
that's on the market now, isn't there?
237
00:08:24.524 --> 00:08:25.630
I think I've heard something
238
00:08:25.630 --> 00:08:26.463
in the news.
239
00:08:26.463 --> 00:08:28.610
There's just the one, yeah. You know,
240
00:08:28.610 --> 00:08:30.810
I always like to joke that work is like a gas
241
00:08:30.810 --> 00:08:33.140
that always expands to fill the container.
242
00:08:33.140 --> 00:08:34.850
So if you take something off somebody's plate,
243
00:08:34.850 --> 00:08:36.360
there's all this mountain of work
244
00:08:36.360 --> 00:08:38.700
that they didn't even realize just wasn't being done.
245
00:08:38.700 --> 00:08:41.850
And so people are always relieved to then go on and find,
246
00:08:41.850 --> 00:08:43.100
you know, the next mountain to climb
247
00:08:43.100 --> 00:08:44.400
and the next thing to do.
248
00:08:44.400 --> 00:08:45.630
But what are these kinds of things
249
00:08:45.630 --> 00:08:49.040
like, you know, how are people choosing to, you know,
250
00:08:49.040 --> 00:08:51.110
expand to fill that space?
251
00:08:51.110 --> 00:08:52.970
Well, if you think of the examples, you know,
252
00:08:52.970 --> 00:08:55.080
like the preclinical quality control steps
253
00:08:55.080 --> 00:08:56.290
that we've automated,
254
00:08:56.290 --> 00:08:59.170
the reality is, you know,
255
00:08:59.170 --> 00:09:01.690
with one operator stretched over a huge amount of work,
256
00:09:01.690 --> 00:09:02.540
it's really hard for them
257
00:09:02.540 --> 00:09:05.620
to do a really in-depth inspection of these samples.
258
00:09:05.620 --> 00:09:07.100
And so by taking off, you know,
259
00:09:07.100 --> 00:09:07.933
the bulk of that work,
260
00:09:07.933 --> 00:09:10.250
80% to 90%, for the algorithm to do,
261
00:09:10.250 --> 00:09:12.940
we're able to do a better, more thorough job
262
00:09:12.940 --> 00:09:15.410
of inspecting the samples that are left.
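NOTE
The pattern here is confidence-based triage: the model auto-decides the samples it is sure about, which can be the bulk of the volume, and routes the uncertain rest to a human. A minimal sketch, with made-up probabilities and thresholds rather than Moderna's actual QC system:
import numpy as np
def triage(p_defect: np.ndarray, lo: float = 0.05, hi: float = 0.95):
    """Auto-decide confident samples; send the uncertain ones to a human."""
    auto_pass = p_defect <= lo  # confidently fine
    auto_fail = p_defect >= hi  # confidently defective
    to_human = ~(auto_pass | auto_fail)
    return auto_pass, auto_fail, to_human
p = np.array([0.01, 0.50, 0.99, 0.03, 0.70])  # model's P(defect) per sample
ok, bad, review = triage(p)
print(review.nonzero()[0])  # indices a scientist still inspects: [1 4]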
263
00:09:15.410 --> 00:09:16.400
It also means that we're not hiring
264
00:09:16.400 --> 00:09:17.320
a whole bunch of other people
265
00:09:17.320 --> 00:09:19.870
just to go look at, you know, screens of data.
266
00:09:19.870 --> 00:09:22.300
It's a bit of an immediate gain for the people who are there
267
00:09:22.300 --> 00:09:23.870
and then kind of this longer-term gain
268
00:09:23.870 --> 00:09:25.590
on our headcount plans.
269
00:09:25.590 --> 00:09:27.320
Some folks, you know, talk about AI
270
00:09:27.320 --> 00:09:29.020
in the pharma space being like,
271
00:09:29.020 --> 00:09:31.360
"I just want an algorithm that can predict from
272
00:09:31.360 --> 00:09:33.320
you know, the structure of a small molecule,
273
00:09:33.320 --> 00:09:34.631
the efficacy in humans," right?
274
00:09:34.631 --> 00:09:37.600
Like that's the entire drug discovery process.
275
00:09:37.600 --> 00:09:38.740
That's just not going to happen.
276
00:09:38.740 --> 00:09:40.630
That's completely unrealistic.
277
00:09:40.630 --> 00:09:43.080
So, you know, we just think about the fact that, you know,
278
00:09:43.080 --> 00:09:44.350
there are countless processes; it's a
279
00:09:44.350 --> 00:09:47.220
very complicated process to bring something to market.
280
00:09:47.220 --> 00:09:50.300
And there are just numerous opportunities along the way.
281
00:09:50.300 --> 00:09:51.920
Even within a specific use case,
282
00:09:51.920 --> 00:09:54.160
you're rarely using one AI algorithm.
283
00:09:54.160 --> 00:09:55.980
And so often, you know, for this part of the problem,
284
00:09:55.980 --> 00:09:57.010
I need to use this algorithm,
285
00:09:57.010 --> 00:09:59.030
and for that part I need to use another.
286
00:09:59.030 --> 00:10:00.870
Dave, I want to ask you something about
287
00:10:00.870 --> 00:10:03.070
the talent base and people.
288
00:10:03.070 --> 00:10:06.570
You commented that Moderna is the kind of company
289
00:10:06.570 --> 00:10:08.920
that likes to give people a lot of freedom,
290
00:10:08.920 --> 00:10:11.270
you know, a highly motivated,
291
00:10:11.270 --> 00:10:16.200
smart, ambitious team working to do the best they can.
292
00:10:16.200 --> 00:10:18.930
How do you bring in and cultivate that talent?
293
00:10:18.930 --> 00:10:23.710
And what are you finding to be some of the lessons learned
294
00:10:23.710 --> 00:10:27.470
in terms of how to build a high performing team?
295
00:10:27.470 --> 00:10:28.540
It's a good question.
296
00:10:28.540 --> 00:10:29.810
I don't know that, you know,
297
00:10:29.810 --> 00:10:31.610
if we look across the company as a whole,
298
00:10:31.610 --> 00:10:34.780
there is one particular place where we hire people.
299
00:10:34.780 --> 00:10:37.820
We get people from biotechs of, you know,
300
00:10:37.820 --> 00:10:41.030
five people, to pharmas of a hundred thousand people,
301
00:10:41.030 --> 00:10:41.890
and everywhere in between
302
00:10:41.890 --> 00:10:44.150
inside the industry and outside the industry.
303
00:10:44.150 --> 00:10:45.750
I think for us it's always about finding
304
00:10:45.750 --> 00:10:47.110
the right person for the job
305
00:10:47.110 --> 00:10:50.320
regardless of where they come from and their background.
306
00:10:50.320 --> 00:10:51.620
I think the important thing for us
307
00:10:51.620 --> 00:10:54.920
is to make sure that we set expectations appropriately
308
00:10:54.920 --> 00:10:55.910
as we bring them in.
309
00:10:55.910 --> 00:10:58.320
And we say, look, this is a digital company.
310
00:10:58.320 --> 00:11:00.450
We're really bold. We're really ambitious.
311
00:11:00.450 --> 00:11:03.080
We have really high quality standards.
312
00:11:03.080 --> 00:11:05.280
And if we set those expectations really high,
313
00:11:05.280 --> 00:11:07.450
you know, it does start to self-select a lot of the people
314
00:11:07.450 --> 00:11:10.090
who want to come through that process.
315
00:11:10.090 --> 00:11:11.700
Do you want to flip over and talk about, you know,
316
00:11:11.700 --> 00:11:13.400
you mentioned some of
317
00:11:13.400 --> 00:11:16.680
the infrastructure, I would call it, that you put in place
318
00:11:16.680 --> 00:11:20.250
that suddenly the world benefited from a few months ago.
319
00:11:20.250 --> 00:11:22.250
How do people know to get those things set up
320
00:11:22.250 --> 00:11:23.083
in the first place?
321
00:11:23.083 --> 00:11:24.510
You mentioned being able to scale from,
322
00:11:24.510 --> 00:11:27.650
I think, you know, 30 to a thousand.
323
00:11:27.650 --> 00:11:30.010
How did you know that was the direction
324
00:11:30.010 --> 00:11:33.800
or that was the vision to get those things set up?
325
00:11:33.800 --> 00:11:35.140
Yeah, that's a great point.
326
00:11:35.140 --> 00:11:37.140
The whole COVID vaccine development,
327
00:11:37.140 --> 00:11:38.820
you know, we're immensely proud of the work
328
00:11:38.820 --> 00:11:39.680
that we've done there.
329
00:11:39.680 --> 00:11:41.840
And we're immensely proud of the superhuman effort
330
00:11:41.840 --> 00:11:43.240
that our people went through
331
00:11:43.240 --> 00:11:45.670
to bring it to market so quickly.
332
00:11:45.670 --> 00:11:47.700
But a lot of it was built on just what you said,
333
00:11:47.700 --> 00:11:50.120
this infrastructure that we had put in place
334
00:11:50.120 --> 00:11:53.130
where we didn't build algorithms specifically for COVID.
335
00:11:53.130 --> 00:11:55.420
We just put them through the same pipeline of activity
336
00:11:55.420 --> 00:11:56.253
that we've been doing.
337
00:11:56.253 --> 00:11:58.440
We just turned it around as fast as we could.
338
00:11:58.440 --> 00:12:01.410
When we think about everything we do at Moderna,
339
00:12:01.410 --> 00:12:04.170
we think about this platform capability.
340
00:12:04.170 --> 00:12:05.003
You know,
341
00:12:05.003 --> 00:12:06.090
we were never going to make one drug,
342
00:12:06.090 --> 00:12:07.140
that was never the plan.
343
00:12:07.140 --> 00:12:10.480
The plan was always to make a whole platform around mRNA.
344
00:12:10.480 --> 00:12:12.640
Because it's an information-based product,
345
00:12:12.640 --> 00:12:15.250
all you do is change the information encoded in the molecule
346
00:12:15.250 --> 00:12:17.410
and you have a completely different drug.
347
00:12:17.410 --> 00:12:19.000
We knew that if you get one in the market,
348
00:12:19.000 --> 00:12:21.025
you can get any number of them to the market.
349
00:12:21.025 --> 00:12:23.560
And so all the decisions we made around
350
00:12:23.560 --> 00:12:25.050
how we designed the company
351
00:12:25.050 --> 00:12:27.380
and how we designed the digital infrastructure
352
00:12:27.380 --> 00:12:29.440
was all around this platform notion
353
00:12:29.440 --> 00:12:31.380
that we're not going to build this for one thing,
354
00:12:31.380 --> 00:12:32.670
we're going to build a solution
355
00:12:32.670 --> 00:12:34.630
that services this whole platform.
356
00:12:34.630 --> 00:12:36.470
And so that's exactly why we built, you know,
357
00:12:36.470 --> 00:12:37.910
this early preclinical stuff
358
00:12:37.910 --> 00:12:39.620
where, you know, we can just crank
359
00:12:39.620 --> 00:12:40.560
through quite a few of these.
360
00:12:40.560 --> 00:12:43.340
That's why we built these algorithms to automate activities
361
00:12:43.340 --> 00:12:45.930
anytime we see something where we know that,
362
00:12:45.930 --> 00:12:46.763
you know,
363
00:12:46.763 --> 00:12:48.527
scale and parallelization were going to improve things,
364
00:12:48.527 --> 00:12:51.300
we put in place this process.
365
00:12:51.300 --> 00:12:52.840
The proof is certainly in the pudding.
366
00:12:52.840 --> 00:12:55.490
One thing that I'm kind of finding fascinating is how
367
00:12:56.420 --> 00:12:57.570
like normal this all...
368
00:12:57.570 --> 00:12:59.620
like I guess I'm just surprised
369
00:12:59.620 --> 00:13:01.720
at how much that seems to be part of your,
370
00:13:02.740 --> 00:13:04.775
can I use the word DNA here in this,
371
00:13:04.775 --> 00:13:06.540
(Sam chuckles)
372
00:13:06.540 --> 00:13:08.899
all right, RNA...
It's part of mRNA.
373
00:13:11.470 --> 00:13:12.303
Yeah, no, it's true.
374
00:13:12.303 --> 00:13:13.657
You know,
375
00:13:13.657 --> 00:13:15.010
we were founded as a digital biotech
376
00:13:15.010 --> 00:13:17.010
and like a lot of companies say things
377
00:13:17.010 --> 00:13:17.843
and put taglines on stuff,
378
00:13:17.843 --> 00:13:19.410
but we really meant it.
379
00:13:19.410 --> 00:13:21.190
And we have pushed on this for many years
380
00:13:21.190 --> 00:13:23.210
and we've built out this for many years.
381
00:13:23.210 --> 00:13:24.630
Yeah, it's the platform we built,
382
00:13:24.630 --> 00:13:26.280
and now, now it's running.
383
00:13:26.280 --> 00:13:27.280
Yeah.
384
00:13:27.280 --> 00:13:28.990
And it's the platform approach we take to our data science
385
00:13:28.990 --> 00:13:30.810
and AI projects as well.
386
00:13:30.810 --> 00:13:33.110
I hear a lot of struggles from folks around
387
00:13:33.110 --> 00:13:36.090
like, great, I built a model in a Jupyter Notebook;
388
00:13:36.090 --> 00:13:37.960
now what do I do with it? Right?
389
00:13:37.960 --> 00:13:41.020
Because there's all this data cleansing and data curation
390
00:13:41.020 --> 00:13:43.600
to even get it to be in a useful state.
391
00:13:43.600 --> 00:13:46.230
And then they don't know where to go from there to deploy it,
392
00:13:46.230 --> 00:13:47.063
right?
393
00:13:47.063 --> 00:13:49.370
And we took the same platform approach
394
00:13:49.370 --> 00:13:50.590
to our data science activities.
395
00:13:50.590 --> 00:13:53.270
We spent a lot of time on the data curation,
396
00:13:53.270 --> 00:13:54.840
the data ingestion to make sure the data
397
00:13:54.840 --> 00:13:56.990
is good to be used right away.
398
00:13:56.990 --> 00:13:59.540
And then we put a lot of tooling and infrastructure in place
399
00:13:59.540 --> 00:14:02.120
to get those models into production and integrated.
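NOTE
One common shape of that tooling is to persist an entire fitted pipeline, preprocessing included, so an application can load one artifact and predict behind a button press. A minimal scikit-learn/joblib sketch with stand-in data and a hypothetical file name; the real infrastructure is, of course, far more involved.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
X, y = make_classification(n_samples=500, random_state=0)  # stand-in data
# Bundle preprocessing and model so deployment ships a single artifact.
pipe = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
joblib.dump(pipe, "model.joblib")    # training side writes the artifact
model = joblib.load("model.joblib")  # application side loads it back
print(model.predict(X[:5]))          # e.g., scoring a new batch of samples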
400
00:14:02.120 --> 00:14:03.280
This platform mentality
401
00:14:03.280 --> 00:14:05.130
is just so ingrained in how we think.
402
00:14:06.130 --> 00:14:07.990
Take us back to, you know,
403
00:14:07.990 --> 00:14:11.138
early in the COVID race for a vaccine,
404
00:14:11.138 --> 00:14:11.971
you know,
405
00:14:11.971 --> 00:14:14.150
what was it like being part of that team
406
00:14:14.150 --> 00:14:15.260
and a part of that process?
407
00:14:15.260 --> 00:14:17.930
I mean, what were the emotions like when the algorithms,
408
00:14:17.930 --> 00:14:20.840
when the people find something that seems to work
409
00:14:20.840 --> 00:14:22.270
or seems promising?
410
00:14:22.270 --> 00:14:24.390
Does that lead to a massive appetite
411
00:14:24.390 --> 00:14:27.410
for more artificial intelligence and more algorithms?
412
00:14:27.410 --> 00:14:30.420
I guess, tell us a little bit about that story.
413
00:14:30.420 --> 00:14:32.870
I think if you look at how people felt
414
00:14:32.870 --> 00:14:33.930
in general at the time,
415
00:14:33.930 --> 00:14:37.707
I mean, it was a real sense of honor and pride, right?
416
00:14:37.707 --> 00:14:40.330
We felt very uniquely positioned.
417
00:14:40.330 --> 00:14:43.210
We'd spent a decade getting to this point
418
00:14:43.210 --> 00:14:45.210
and putting all of this infrastructure in place
419
00:14:45.210 --> 00:14:47.210
and putting things in the clinic before this,
420
00:14:47.210 --> 00:14:48.210
to get to this moment.
421
00:14:48.210 --> 00:14:51.510
And so we just felt truly honored to be in that position.
422
00:14:51.510 --> 00:14:53.500
And for those of us on the digital side,
423
00:14:53.500 --> 00:14:55.870
who have kind of contributed to this and built it,
424
00:14:55.870 --> 00:14:57.020
you know, this is why we did it.
425
00:14:57.020 --> 00:14:58.010
This is why we're here
426
00:14:58.010 --> 00:15:00.380
is to help bring medicines to as many patients
427
00:15:00.380 --> 00:15:02.930
as quickly and safely as possible
428
00:15:02.930 --> 00:15:04.100
around the world.
429
00:15:04.100 --> 00:15:05.250
But there was always the question of
430
00:15:05.250 --> 00:15:07.320
would this thing work in the real world?
431
00:15:07.320 --> 00:15:09.530
You know, and that's where the proof came
432
00:15:09.530 --> 00:15:10.560
in the clinical data
433
00:15:10.560 --> 00:15:12.090
and we were all anxiously waiting
434
00:15:12.090 --> 00:15:14.710
like everybody else to see that readout.
435
00:15:14.710 --> 00:15:19.640
Was AI always front and center at Moderna
436
00:15:19.640 --> 00:15:23.130
or has it become more critical
437
00:15:23.130 --> 00:15:26.313
as a pillar of growth and innovation over time?
438
00:15:27.240 --> 00:15:29.450
Yeah, I think it's always been there,
439
00:15:29.450 --> 00:15:31.960
though we probably didn't call it that in the early days;
440
00:15:31.960 --> 00:15:34.070
it's become obviously much more of, you know,
441
00:15:34.070 --> 00:15:36.380
a hot marketing term than it used to be.
442
00:15:36.380 --> 00:15:39.700
But the notion of algorithms taking over decision making,
443
00:15:39.700 --> 00:15:41.300
and you know, data science capability
444
00:15:41.300 --> 00:15:42.960
was absolutely always there.
445
00:15:42.960 --> 00:15:45.230
We were very thoughtful about how we
446
00:15:45.230 --> 00:15:46.610
built this digital landscape,
447
00:15:46.610 --> 00:15:48.720
such that we're collecting structured data
448
00:15:48.720 --> 00:15:50.190
across all these steps,
449
00:15:50.190 --> 00:15:51.630
knowing full well that what we want to do
450
00:15:51.630 --> 00:15:54.010
is then turn those into algorithms to do things.
451
00:15:54.010 --> 00:15:56.170
So it was very purposeful for that.
452
00:15:56.170 --> 00:15:58.990
But I do think it's also come into a greater focus, right?
453
00:15:58.990 --> 00:16:00.690
Because we've seen the power of it
454
00:16:00.690 --> 00:16:02.250
very recently; obviously, we've seen
455
00:16:02.250 --> 00:16:03.627
how this digital infrastructure
456
00:16:03.627 --> 00:16:04.780
and how these algorithms
457
00:16:04.780 --> 00:16:06.760
can really help push things forward.
458
00:16:06.760 --> 00:16:09.100
And so it's got that kind of a renewed focus
459
00:16:09.100 --> 00:16:10.870
and importance in the company.
460
00:16:10.870 --> 00:16:12.760
We tend not to be a company of half measures.
461
00:16:12.760 --> 00:16:14.210
So when we decide we're going to do something,
462
00:16:14.210 --> 00:16:15.051
we're going to do it.
463
00:16:15.051 --> 00:16:17.970
And so it's been a very strong message
464
00:16:17.970 --> 00:16:19.349
from our senior leadership
465
00:16:19.349 --> 00:16:21.730
that the future of the company
466
00:16:21.730 --> 00:16:24.500
is in injecting digital and AI into everything we do.
467
00:16:24.500 --> 00:16:25.880
And in no uncertain terms,
468
00:16:25.880 --> 00:16:28.580
this is happening to the point that, you know,
469
00:16:28.580 --> 00:16:29.610
as we think about the fact that
470
00:16:29.610 --> 00:16:31.047
we are growing really fast as a company,
471
00:16:31.047 --> 00:16:32.650
you know, we just doubled,
472
00:16:32.650 --> 00:16:34.410
we're probably going to double again.
473
00:16:34.410 --> 00:16:35.900
We're bringing in a lot of new folks
474
00:16:35.900 --> 00:16:37.670
from outside the company to grow,
475
00:16:37.670 --> 00:16:39.920
who are not necessarily familiar with this digital culture
476
00:16:39.920 --> 00:16:41.100
that we've had.
477
00:16:41.100 --> 00:16:43.060
And so what we're working on right now
478
00:16:43.060 --> 00:16:46.060
is actually developing what we're calling an AI academy,
479
00:16:46.060 --> 00:16:49.960
which we intend to be a very thorough in-depth training
480
00:16:49.960 --> 00:16:51.160
for our company.
481
00:16:51.160 --> 00:16:54.410
So from people who would use and interact with AI models
482
00:16:54.410 --> 00:16:55.890
on a daily basis
483
00:16:55.890 --> 00:16:57.500
to senior leaders who would be responsible for,
484
00:16:57.500 --> 00:17:00.760
kind of a portfolio of potential projects in their areas.
485
00:17:00.760 --> 00:17:02.490
And that just shows the level of serious commitment
486
00:17:02.490 --> 00:17:04.020
we have about this.
487
00:17:04.020 --> 00:17:07.860
We were built on this concept of having a smaller company
488
00:17:07.860 --> 00:17:10.300
that's very agile and can move fast.
489
00:17:10.300 --> 00:17:12.820
So we see digital as a key enabler for that
490
00:17:12.820 --> 00:17:14.750
and AI as a key enabler for that.
491
00:17:14.750 --> 00:17:16.760
And so the hope is that that helps us to compete
492
00:17:16.760 --> 00:17:18.700
in ways that other companies can't.
493
00:17:18.700 --> 00:17:21.130
And that is certainly the intention here.
494
00:17:21.130 --> 00:17:23.110
Well, Dave, thanks so much for talking with us today.
495
00:17:23.110 --> 00:17:23.943
We really enjoyed it.
496
00:17:23.943 --> 00:17:26.760
I mean, you mentioned Moderna hires smart people
497
00:17:26.760 --> 00:17:29.000
and we know, from a sample size of one,
498
00:17:29.000 --> 00:17:30.190
that's clearly true.
499
00:17:30.190 --> 00:17:32.500
Thanks for taking the time to talk with us today.
500
00:17:32.500 --> 00:17:33.460
Thank you so much.
501
00:17:33.460 --> 00:17:35.917
Absolutely guys, I really appreciate it.
502
00:17:35.917 --> 00:17:38.500
(upbeat music)
503
00:17:41.150 --> 00:17:43.300
Sam, that was an awesome conversation with Dave.
504
00:17:43.300 --> 00:17:44.470
What do you think?
505
00:17:44.470 --> 00:17:45.310
Yep. Impressive.
506
00:17:45.310 --> 00:17:46.170
I'm glad he's around.
507
00:17:46.170 --> 00:17:48.320
I'm glad Moderna is around.
508
00:17:48.320 --> 00:17:49.153
That's right.
509
00:17:49.153 --> 00:17:51.000
He's not paying
510
00:17:51.000 --> 00:17:54.580
lip service to buzzwords and this, that, and
511
00:17:54.580 --> 00:17:58.230
the other. He's just like, yeah, we started this way.
512
00:17:58.230 --> 00:17:59.520
That's why we're doing it.
513
00:17:59.520 --> 00:18:02.790
We would not have existed without digital
514
00:18:02.790 --> 00:18:04.900
and AI and data and analytics.
515
00:18:04.900 --> 00:18:05.733
Of course, it's real.
516
00:18:05.733 --> 00:18:06.670
Like, that's what we are.
517
00:18:06.670 --> 00:18:08.844
I mean, he said Moderna is a digital company,
518
00:18:08.844 --> 00:18:09.730
that's what he said.
519
00:18:09.730 --> 00:18:10.820
It's just part of their process.
520
00:18:10.820 --> 00:18:11.820
I mean, some of the questions,
521
00:18:11.820 --> 00:18:13.270
it didn't even occur to him
522
00:18:13.270 --> 00:18:14.560
that it was artificial intelligence
523
00:18:14.560 --> 00:18:16.880
and that's just the way they do things.
524
00:18:16.880 --> 00:18:20.640
I wonder if that's, like, the new normal: industry after industry,
525
00:18:20.640 --> 00:18:22.450
we're going to see the Moderna-type approach
526
00:18:22.450 --> 00:18:23.610
come into industries
527
00:18:23.610 --> 00:18:27.160
and just be the dominant one, you know, where
528
00:18:27.160 --> 00:18:29.060
the vestiges of the historical
529
00:18:29.060 --> 00:18:30.490
"we've been around for a hundred years"
530
00:18:30.490 --> 00:18:33.570
are almost a liability versus a plus.
531
00:18:33.570 --> 00:18:35.880
I think this sort of contrast, you know, Sam,
532
00:18:35.880 --> 00:18:39.270
that you were sort of trying to get at, which is like,
533
00:18:39.270 --> 00:18:41.050
how come it's so easy for you guys?
534
00:18:41.050 --> 00:18:44.390
And what about the pre and post of the transformation?
535
00:18:44.390 --> 00:18:46.830
I mean, these guys, you know, he's like, well,
536
00:18:46.830 --> 00:18:48.370
we actually started this way.
537
00:18:48.370 --> 00:18:49.573
You know, we said, we want to be...
538
00:18:49.573 --> 00:18:50.620
They started post.
539
00:18:50.620 --> 00:18:52.240
Yeah, we started post.
540
00:18:52.240 --> 00:18:53.886
We wanted to be agile.
541
00:18:53.886 --> 00:18:55.380
We wanted to be small.
542
00:18:55.380 --> 00:19:00.380
We wanted to do a lot more with everything that we had.
543
00:19:00.670 --> 00:19:03.840
And so that had to be platform-centric, data-centric,
544
00:19:03.840 --> 00:19:05.490
AI-centric, and that's how we built it.
545
00:19:05.490 --> 00:19:07.120
And so AI is everywhere.
546
00:19:07.120 --> 00:19:08.640
Why are you surprised, Sam?
547
00:19:08.640 --> 00:19:10.290
That AI is everywhere? Of course, it's everywhere.
548
00:19:10.290 --> 00:19:14.630
We do it for planning and trials and sequencing
549
00:19:14.630 --> 00:19:17.130
and it's quite energizing
550
00:19:17.130 --> 00:19:18.870
and intriguing how it's just like
551
00:19:18.870 --> 00:19:21.360
a very different mindset toward AI.
552
00:19:21.360 --> 00:19:22.760
Right, and I know that
553
00:19:22.760 --> 00:19:24.140
we don't want to make everything AI.
554
00:19:24.140 --> 00:19:25.480
There's a lot that is going on there
555
00:19:25.480 --> 00:19:27.130
that's not artificial intelligence, you know;
556
00:19:27.130 --> 00:19:29.030
I don't want to paint it as entirely AI,
557
00:19:29.030 --> 00:19:31.380
but that certainly was a big chunk of this,
558
00:19:31.380 --> 00:19:32.660
of the speed story here.
559
00:19:32.660 --> 00:19:34.844
And it's pretty fascinating.
560
00:19:34.844 --> 00:19:36.180
(upbeat music)
561
00:19:36.180 --> 00:19:37.210
Thanks for joining us
562
00:19:37.210 --> 00:19:40.030
for this bonus episode of "Me, Myself, and AI."
563
00:19:40.030 --> 00:19:40.960
We'll be back in the fall
564
00:19:40.960 --> 00:19:43.290
with new episodes for season three.
565
00:19:43.290 --> 00:19:45.595
In the meantime, stay in touch with us on LinkedIn.
566
00:19:45.595 --> 00:19:48.480
We've created a group called AI for Leaders,
567
00:19:48.480 --> 00:19:50.840
specifically for audience members like you.
568
00:19:50.840 --> 00:19:53.140
You can catch up on back episodes of the show,
569
00:19:53.140 --> 00:19:54.900
meet show creators and hosts,
570
00:19:54.900 --> 00:19:57.020
tell us what you want to hear about in season three,
571
00:19:57.020 --> 00:19:59.360
and discuss key issues about AI implementation
572
00:19:59.360 --> 00:20:00.675
with other like-minded people.
573
00:20:00.675 --> 00:20:05.675
Find the group at www.mitsmr.com/AIforLeaders
574
00:20:06.330 --> 00:20:07.810
which will redirect you to LinkedIn,
575
00:20:07.810 --> 00:20:09.590
where you can request to join.
576
00:20:09.590 --> 00:20:11.167
We'll put that link in the show notes as well,
577
00:20:11.167 --> 00:20:13.006
and we hope to see you there.
578
00:20:13.006 --> 00:20:15.589
(upbeat music)