WEBVTT
1
00:00:00.910 --> 00:00:02.200
There can be sizable gaps
2
00:00:02.200 --> 00:00:03.540
between people working directly
3
00:00:03.540 --> 00:00:06.930
with technology and people working in product teams.
4
00:00:06.930 --> 00:00:08.840
Today's episode with Slawek Kierner,
5
00:00:08.840 --> 00:00:10.780
Senior Vice President, Digital Health and Analytics
6
00:00:10.780 --> 00:00:11.650
at Humana,
7
00:00:11.650 --> 00:00:13.930
illustrates how diverse experience can come together
8
00:00:13.930 --> 00:00:16.653
in novel ways to build value with AI.
9
00:00:18.080 --> 00:00:20.440
Hello, and welcome to "Me, Myself and AI",
10
00:00:20.440 --> 00:00:22.890
a podcast on artificial intelligence in business.
11
00:00:23.930 --> 00:00:26.300
Each episode, we introduce you to someone innovating
12
00:00:26.300 --> 00:00:27.690
with AI.
13
00:00:27.690 --> 00:00:30.420
I'm Sam Ransbotham, Professor of Information Systems
14
00:00:30.420 --> 00:00:31.373
at Boston College.
15
00:00:32.390 --> 00:00:35.170
I'm also the guest editor for the AI and Business Strategy
16
00:00:35.170 --> 00:00:39.020
Big Idea Program at MIT Sloan Management Review.
17
00:00:39.020 --> 00:00:41.400
And I'm Shervin Khodabandeh, Senior Partner
18
00:00:41.400 --> 00:00:46.380
with BCG, and I co-lead BCG's AI practice in North America,
19
00:00:46.380 --> 00:00:51.380
and together BCG and MIT SMR have been researching AI
20
00:00:51.380 --> 00:00:55.010
for four years, interviewing hundreds of practitioners
21
00:00:55.010 --> 00:00:58.520
and surveying thousands of companies on what it takes
22
00:00:58.520 --> 00:01:03.010
to build and deploy and scale AI capabilities
23
00:01:03.010 --> 00:01:05.913
and really transform the way organizations operate.
24
00:01:07.060 --> 00:01:09.180
We had a great discussion with Prakhar Mehrotra,
25
00:01:09.180 --> 00:01:10.920
from Walmart last time.
26
00:01:10.920 --> 00:01:13.120
Prakhar himself has a fascinating background,
27
00:01:13.120 --> 00:01:15.480
from organizations like Twitter and Uber
28
00:01:15.480 --> 00:01:17.520
to his current job at Walmart.
29
00:01:17.520 --> 00:01:20.220
There are quite some differences between those organizations.
30
00:01:20.220 --> 00:01:22.150
It's also interesting to learn some aspects
31
00:01:22.150 --> 00:01:23.610
that are similar.
32
00:01:23.610 --> 00:01:26.293
Check out our last episode for the fun details.
33
00:01:27.270 --> 00:01:29.700
So for today, let's go on to something new.
34
00:01:29.700 --> 00:01:32.400
Shervin, I'm really looking forward to today's episode.
35
00:01:33.540 --> 00:01:36.070
Slawek, thanks for taking the time to talk with us today.
36
00:01:36.070 --> 00:01:37.840
Part of our focus is on you, really.
37
00:01:37.840 --> 00:01:39.920
Can you introduce yourself,
38
00:01:39.920 --> 00:01:44.360
tell us what your current role is, and we'll go from there?
39
00:01:44.360 --> 00:01:46.710
My name is Slawek Kierner,
40
00:01:46.710 --> 00:01:49.530
and I'm a Senior Vice President in charge
41
00:01:49.530 --> 00:01:52.616
of Data and Analytics at Humana.
42
00:01:52.616 --> 00:01:55.104
Humana is a Fortune 60 company,
43
00:01:55.104 --> 00:01:59.290
really focused on healthcare and helping our members live
44
00:01:59.290 --> 00:02:00.860
longer and healthier lives.
45
00:02:00.860 --> 00:02:05.510
I have about 25 years of experience in data and analytics
46
00:02:05.510 --> 00:02:10.510
across consumer technology and healthcare industries.
47
00:02:10.860 --> 00:02:13.840
And I've always been interested in data.
48
00:02:13.840 --> 00:02:18.507
My first money I spent on a PC and got MATLAB
49
00:02:18.507 --> 00:02:19.620
and Simulink loaded on it,
50
00:02:19.620 --> 00:02:22.420
and I would create all kinds of simulations,
51
00:02:22.420 --> 00:02:25.980
let them run overnight and see what came out in
52
00:02:25.980 --> 00:02:30.700
the morning, and use all kinds of ways to visualize data.
53
00:02:30.700 --> 00:02:33.230
At that time I was fascinated with process simulations
54
00:02:33.230 --> 00:02:37.810
and autonomous control systems and adaptive control.
55
00:02:37.810 --> 00:02:40.990
And this fascination got me a job with P&G,
56
00:02:40.990 --> 00:02:42.520
Procter & Gamble,
57
00:02:42.520 --> 00:02:45.630
and the good thing is I could continue to use my passion,
58
00:02:45.630 --> 00:02:49.570
so there was a lot of permission space to innovate
59
00:02:49.570 --> 00:02:52.790
and bring advanced algorithms to these spaces,
60
00:02:52.790 --> 00:02:56.730
and that's how I started, and it was also a fascinating time
61
00:02:56.730 --> 00:02:58.220
because you could start to have feedback
62
00:02:58.220 --> 00:03:03.220
from your algorithms, from your early AI-based systems.
63
00:03:03.610 --> 00:03:07.270
And at that moment I got hired by Nokia.
64
00:03:07.270 --> 00:03:09.693
It was the moment when the iPhone was launched, and
65
00:03:09.693 --> 00:03:12.670
that was another fascinating transformation,
66
00:03:12.670 --> 00:03:14.440
and as you probably know the story
67
00:03:14.440 --> 00:03:17.350
that part of the business was acquired by Microsoft,
68
00:03:17.350 --> 00:03:21.420
and I moved all of my team, I was running data
69
00:03:21.420 --> 00:03:23.178
and analytics for Nokia at that time,
70
00:03:23.178 --> 00:03:27.690
to Microsoft, looked around for a while, helped run some
71
00:03:27.690 --> 00:03:30.346
of the same operations across Microsoft Retail and
72
00:03:30.346 --> 00:03:32.650
the Devices Business, and then moved
73
00:03:32.650 --> 00:03:34.550
to the Cloud and AI unit,
74
00:03:34.550 --> 00:03:35.750
and that got me to Humana.
75
00:03:35.750 --> 00:03:37.200
So roughly one and a half years ago,
76
00:03:37.200 --> 00:03:38.910
I got to this moment when I thought,
77
00:03:38.910 --> 00:03:42.860
that I'd built quite a bit of experience in data and analytics
78
00:03:42.860 --> 00:03:44.000
and can make money with it,
79
00:03:44.000 --> 00:03:46.820
but now let's try to use it for some good purpose.
80
00:03:46.820 --> 00:03:49.740
And that's something that we can do in healthcare where, next
81
00:03:49.740 --> 00:03:54.740
to having a lot of data, we also have a very noble purpose.
82
00:03:54.750 --> 00:03:56.200
I think you've almost run the table
83
00:03:56.200 --> 00:03:57.690
in almost every sort of industry
84
00:03:57.690 --> 00:03:59.400
and every sort of application.
85
00:03:59.400 --> 00:04:00.630
It sounds like you're just right ahead
86
00:04:00.630 --> 00:04:03.228
of whatever's happening next.
87
00:04:03.228 --> 00:04:05.760
That's what I hope, I'm just looking
88
00:04:05.760 --> 00:04:06.910
for these challenges.
89
00:04:06.910 --> 00:04:08.630
This is pointless, but I have to follow up.
90
00:04:08.630 --> 00:04:12.850
Were you a chemical engineer back in the P&G days?
91
00:04:12.850 --> 00:04:15.260
I worked very closely with chemical engineers
92
00:04:15.260 --> 00:04:17.340
by the way, so part of my projects were
93
00:04:17.340 --> 00:04:20.323
to actually rewire large chemical factories
94
00:04:20.323 --> 00:04:23.690
and think about really, really huge chemical operations
95
00:04:23.690 --> 00:04:26.650
that P&G has in major cities.
96
00:04:26.650 --> 00:04:29.440
But to your point, I actually did two majors,
97
00:04:29.440 --> 00:04:31.750
one was in Mechatronics,
98
00:04:31.750 --> 00:04:34.440
but essentially at that time in a fascination
99
00:04:34.440 --> 00:04:36.740
with AI in my part of the world
100
00:04:36.740 --> 00:04:40.220
and the other one was in Business Management.
101
00:04:40.220 --> 00:04:41.333
I just mentioned that 'cause both Shervin
102
00:04:41.333 --> 00:04:44.200
and I were chemical engineers back in the dark ages
103
00:04:44.200 --> 00:04:47.190
and thought we'd found a compatriot in the whole thing,
104
00:04:47.190 --> 00:04:49.620
'cause I actually got interested in, you mentioned simulation
105
00:04:49.620 --> 00:04:50.827
as your beginning as well,
106
00:04:50.827 --> 00:04:53.380
and that's where I started too, was you didn't have
107
00:04:53.380 --> 00:04:55.470
to build a plant, you could just simulate it
108
00:04:55.470 --> 00:04:58.130
and just really show some of the opportunities from data.
109
00:04:58.130 --> 00:05:01.070
And it sounds like you saw some of those same things.
110
00:05:01.070 --> 00:05:04.660
Exactly, and you mentioned simulations of factories,
111
00:05:04.660 --> 00:05:05.900
that's exactly what I was doing.
112
00:05:05.900 --> 00:05:07.380
So it's interesting because that fascination
113
00:05:07.380 --> 00:05:09.630
that I had early on, at my beginnings,
114
00:05:09.630 --> 00:05:11.990
with simulation, I could bring to P&G.
115
00:05:11.990 --> 00:05:14.590
So a factory is very different than where you are now at
116
00:05:14.590 --> 00:05:17.300
Humana, that's a completely different...
117
00:05:18.190 --> 00:05:20.570
Chemicals don't mind if they sit for a while in a vat,
118
00:05:20.570 --> 00:05:21.710
but patients do.
119
00:05:21.710 --> 00:05:24.150
How did the things you've learned from
120
00:05:24.150 --> 00:05:27.690
that past experience influence your current role?
121
00:05:27.690 --> 00:05:29.820
I think there were a number of things
122
00:05:29.820 --> 00:05:32.880
that you can actually learn in supply chain
123
00:05:32.880 --> 00:05:35.670
and chemical processes that actually do apply
124
00:05:35.670 --> 00:05:36.503
to healthcare.
125
00:05:36.503 --> 00:05:39.040
It's fascinating, but let me just list a few,
126
00:05:39.040 --> 00:05:42.040
so first of all, the whole basic setup of process
127
00:05:42.040 --> 00:05:46.650
and process control, I think, applies to a chemical factory,
128
00:05:46.650 --> 00:05:50.840
and when you think about marketing, it's very similar:
129
00:05:50.840 --> 00:05:52.791
you can apply only so much marketing, and
130
00:05:52.791 --> 00:05:55.530
when you do too much, of course you're wasting money,
131
00:05:55.530 --> 00:05:56.647
not enough isn't even going to give you
132
00:05:56.647 --> 00:06:00.260
the results, so you kind of meet a similar problem,
133
00:06:00.260 --> 00:06:02.675
and I think the troubles that you have
134
00:06:02.675 --> 00:06:07.110
with keeping chemical processes in control,
135
00:06:07.110 --> 00:06:11.490
we also actually see in healthcare. So for example,
136
00:06:11.490 --> 00:06:15.310
in chemical control there is a lot of lag, meaning
137
00:06:15.310 --> 00:06:19.150
the time from when you apply a certain force or temperature
138
00:06:19.150 --> 00:06:21.830
to the time when you start seeing the result of it on
139
00:06:21.830 --> 00:06:24.580
the output of a process, and the longer
140
00:06:24.580 --> 00:06:27.000
this lag is, the more difficult it is to keep
141
00:06:27.000 --> 00:06:28.570
the system in control.
142
00:06:28.570 --> 00:06:30.448
Classic thermostat problem.
143
00:06:30.448 --> 00:06:31.281
That's what it is.
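NOTE
A minimal sketch (not from the episode) of the lag problem described above: a simple proportional controller that only feels the effect of its actions several ticks later. The model and all numbers are illustrative assumptions.
from collections import deque
def simulate(lag_steps, gain=0.8, setpoint=21.0, steps=30):
    """Thermostat loop where applied heat reaches the sensor lag_steps ticks late."""
    temp = 15.0
    in_transit = deque([0.0] * lag_steps)  # heat applied but not yet felt
    trace = []
    for _ in range(steps):
        in_transit.append(gain * (setpoint - temp))  # act on the current reading
        temp += in_transit.popleft()                 # a past action lands now
        trace.append(round(temp, 1))
    return trace
# With no lag the same gain settles near the setpoint; with a 3-tick lag
# it overshoots and oscillates, which is exactly why lag makes control hard.
print(simulate(lag_steps=0))
print(simulate(lag_steps=3))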
144
00:06:31.281 --> 00:06:32.480
And when you think about healthcare,
145
00:06:32.480 --> 00:06:34.120
we have exactly the same.
146
00:06:34.120 --> 00:06:37.470
So we are trying all kinds of treatments and programs
147
00:06:37.470 --> 00:06:40.085
for our members at the onset.
148
00:06:40.085 --> 00:06:43.666
So think about diabetes, or the issue of, let's say,
149
00:06:43.666 --> 00:06:47.870
having a heart disease and then needing to adhere
150
00:06:47.870 --> 00:06:51.490
to medications, we try to convince you to do it,
151
00:06:51.490 --> 00:06:55.140
but very often we need to wait very long until we see
152
00:06:55.140 --> 00:06:57.720
that actually your health has improved.
153
00:06:57.720 --> 00:07:02.080
And again, here we have these long lags, and these lags
154
00:07:02.080 --> 00:07:05.810
are both inherent to the process itself and the clinical domain,
155
00:07:05.810 --> 00:07:07.300
but quite often they also result
156
00:07:07.300 --> 00:07:10.680
from just poor data interoperability.
157
00:07:10.680 --> 00:07:13.950
I think you're making a super interesting point
158
00:07:13.950 --> 00:07:18.530
that there are archetypal sorts of problems
159
00:07:18.530 --> 00:07:21.710
in different disciplines, different industries,
160
00:07:21.710 --> 00:07:23.880
different fields, all together,
161
00:07:23.880 --> 00:07:27.580
but similar approaches, once customized to
162
00:07:27.580 --> 00:07:31.370
that particular industry or company would give a lot
163
00:07:31.370 --> 00:07:32.320
of good results.
164
00:07:32.320 --> 00:07:34.850
And I resonate a lot with that.
165
00:07:34.850 --> 00:07:37.020
You listed some very good examples.
166
00:07:37.020 --> 00:07:42.020
Do you feel that diversity across having seen different
167
00:07:44.210 --> 00:07:46.990
archetypal problems in different industries
168
00:07:46.990 --> 00:07:47.900
and disciplines...
169
00:07:47.900 --> 00:07:52.610
Do you feel like that is an important attribute
170
00:07:52.610 --> 00:07:53.870
of someone like yourself,
171
00:07:53.870 --> 00:07:57.630
who's leading an AI organization for a company just coming
172
00:07:57.630 --> 00:07:59.723
from sort of different backgrounds,
173
00:08:00.560 --> 00:08:04.071
having seen it across very different disciplines?
174
00:08:04.071 --> 00:08:07.270
I think it is useful for me, of course,
175
00:08:07.270 --> 00:08:10.610
and I have a lot of respect for leaders
176
00:08:10.610 --> 00:08:12.437
that emerged from the healthcare industry,
177
00:08:12.437 --> 00:08:14.300
and of course they have that background
178
00:08:14.300 --> 00:08:17.440
from which I am learning how
179
00:08:17.440 --> 00:08:20.500
to really operate in a healthcare context.
180
00:08:20.500 --> 00:08:23.033
But to your point, I think specifically in healthcare,
181
00:08:23.033 --> 00:08:26.260
there is a need for people that will come
182
00:08:26.260 --> 00:08:29.200
from other industries and bring knowledge
183
00:08:30.210 --> 00:08:34.140
because it feels that healthcare is totally behind,
184
00:08:34.140 --> 00:08:38.930
certainly from data transformation, availability of data,
185
00:08:38.930 --> 00:08:42.190
certainly from usage of AI and also
186
00:08:42.190 --> 00:08:44.670
from a platform point of view.
187
00:08:44.670 --> 00:08:46.950
So you mentioned several different examples
188
00:08:46.950 --> 00:08:49.060
there, can you give us some more detail about
189
00:08:49.060 --> 00:08:51.540
some particular success that you've had at Humana
190
00:08:51.540 --> 00:08:54.430
with, particularly around artificial intelligence,
191
00:08:54.430 --> 00:08:55.560
which is obviously what we're interested in.
192
00:08:55.560 --> 00:08:58.700
Is there some story of success that you're proud of?
193
00:08:58.700 --> 00:09:01.600
There are a few.
194
00:09:01.600 --> 00:09:05.610
So we certainly are testing and learning a lot.
195
00:09:05.610 --> 00:09:09.040
I think the key progress that I'm really proud of is
196
00:09:09.040 --> 00:09:12.800
the fact that we have created our own internal machine
197
00:09:12.800 --> 00:09:17.800
learning platform, which helps our data scientists
198
00:09:17.930 --> 00:09:21.100
have access to modern technologies,
199
00:09:21.100 --> 00:09:24.410
to all of the open source capabilities,
200
00:09:24.410 --> 00:09:28.240
have cloud accessibility, such that
201
00:09:28.240 --> 00:09:30.170
computing power is no longer limited
202
00:09:30.170 --> 00:09:33.030
by what you have in your data center,
203
00:09:33.030 --> 00:09:34.832
but essentially you can tap in
204
00:09:34.832 --> 00:09:38.697
and run any kind of algorithm out there.
205
00:09:38.697 --> 00:09:40.190
And we are starting to see
206
00:09:40.190 --> 00:09:42.590
the benefits coming through: way better,
207
00:09:42.590 --> 00:09:44.880
way more accurate models
208
00:09:44.880 --> 00:09:48.500
that predict retention in our business,
209
00:09:48.500 --> 00:09:51.350
and help us predict inpatient admission situations
210
00:09:51.350 --> 00:09:55.130
and look forward, act hopefully way ahead
211
00:09:55.130 --> 00:09:59.210
of the time when our member needs to visit the ER,
212
00:09:59.210 --> 00:10:03.660
or get into the hospital, and hopefully be there early enough so
213
00:10:03.660 --> 00:10:08.050
that we can help this person stay in better health,
214
00:10:08.050 --> 00:10:11.603
and avoid needing inpatient treatment.
215
00:10:12.580 --> 00:10:15.750
There's also a lot of progress in terms of usage of AI,
216
00:10:15.750 --> 00:10:19.250
algorithms and the sophistication of these.
217
00:10:19.250 --> 00:10:24.250
We had to overcome a lack of proper security in the cloud
218
00:10:25.340 --> 00:10:27.710
to handle PII and PHI data.
219
00:10:27.710 --> 00:10:30.070
So as we're building those capabilities,
220
00:10:30.070 --> 00:10:33.490
and helping also build those with our vendors,
221
00:10:33.490 --> 00:10:36.740
we had to generate high quality testing data,
222
00:10:36.740 --> 00:10:38.980
that would be differentially private.
223
00:10:38.980 --> 00:10:42.020
We are able to create a new model, an AI model
224
00:10:42.020 --> 00:10:44.809
based on synthetic data, which is of similar accuracy
225
00:10:44.809 --> 00:10:48.340
to the one that is created on real data.
226
00:10:48.340 --> 00:10:53.340
Using generative AI, we created high-fidelity synthetic
227
00:10:53.570 --> 00:10:55.750
profiles and populations of our members
228
00:10:55.750 --> 00:11:00.510
and could use those to ingest into our platform,
229
00:11:00.510 --> 00:11:02.810
and start to learn how to use it.
230
00:11:02.810 --> 00:11:04.560
We train our data scientists.
231
00:11:04.560 --> 00:11:09.560
We have more than 200 PhD grade data scientists at Humana,
232
00:11:10.000 --> 00:11:11.580
and we could already get access
233
00:11:11.580 --> 00:11:16.580
and start using systems ahead of our readiness
234
00:11:16.840 --> 00:11:18.720
for PHI and PII data handling,
235
00:11:18.720 --> 00:11:20.350
which happened in the meantime
236
00:11:20.350 --> 00:11:22.710
but the fact that we have a synthetic data creation
237
00:11:22.710 --> 00:11:24.140
capability actually is helping us
238
00:11:24.140 --> 00:11:26.270
on many other fronts as well.
239
00:11:26.270 --> 00:11:28.360
So, to make sure I understand, you're using this tool
240
00:11:28.360 --> 00:11:31.860
to help your organization learn how to handle the real data?
241
00:11:31.860 --> 00:11:35.030
So you use AI to generate synthetic data
242
00:11:35.030 --> 00:11:37.470
that then lets everybody practice
243
00:11:37.470 --> 00:11:40.710
and learn on before it involves a real patient.
244
00:11:40.710 --> 00:11:42.230
That's correct.
245
00:11:42.230 --> 00:11:43.370
Actually, I really like that example.
246
00:11:43.370 --> 00:11:46.201
Could you describe why, I might be,
247
00:11:46.201 --> 00:11:47.820
I think I'm trivializing it,
248
00:11:47.820 --> 00:11:50.253
but why is that an AI problem and not
249
00:11:52.417 --> 00:11:53.530
a statistical sampling problem,
250
00:11:53.530 --> 00:11:55.343
like what makes AI fit there well?
251
00:11:56.392 --> 00:11:57.950
It's a good question
252
00:11:59.060 --> 00:12:03.020
and I think it became an AI problem when AI became better.
253
00:12:03.020 --> 00:12:07.010
So I don't know, you've probably seen some
254
00:12:07.010 --> 00:12:08.660
of the work of Nvidia that creates
255
00:12:08.660 --> 00:12:11.370
the synthetic faces of people.
256
00:12:11.370 --> 00:12:13.720
So essentially you use deep learning
257
00:12:13.720 --> 00:12:17.290
to train a neural network to essentially learn how
258
00:12:17.290 --> 00:12:19.110
a face of a person looks,
259
00:12:19.110 --> 00:12:21.722
and then you ask it to recreate, taking away
260
00:12:21.722 --> 00:12:23.480
the original data and controlling
261
00:12:23.480 --> 00:12:27.050
for overfitting in such a way that it can ensure
262
00:12:27.050 --> 00:12:31.870
that none of the training pictures is recreated exactly.
263
00:12:31.870 --> 00:12:34.610
So essentially all of the faces
264
00:12:34.610 --> 00:12:37.850
that the synthetic generator generates
265
00:12:37.850 --> 00:12:39.970
are unreal and never existed.
266
00:12:39.970 --> 00:12:43.820
And that field started many years ago, but initially
267
00:12:43.820 --> 00:12:46.430
these faces kind of always had two eyes,
268
00:12:46.430 --> 00:12:48.710
but an eye could be in the middle of the forehead
269
00:12:48.710 --> 00:12:52.070
and so you could see that it's not real,
270
00:12:52.070 --> 00:12:54.810
but over the last two years we have improved so much,
271
00:12:54.810 --> 00:12:59.810
that when you look at a synthetic face right now,
272
00:12:59.940 --> 00:13:01.010
it's hard to recognize
273
00:13:01.010 --> 00:13:05.210
that it's not a human and we could be easily tricked.
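NOTE
A minimal sketch (not Humana's actual pipeline) of the same idea on tabular records: fit a stand-in generator to real data, sample synthetic rows, and check that no training record is recreated exactly. The Gaussian generator, the toy columns, and the distance threshold are all assumptions; a production system would use a deep generative model with explicit overfitting controls, as described above.
import numpy as np
rng = np.random.default_rng(0)
# Toy "member records": two columns standing in for age and systolic BP.
real = rng.normal(loc=[70.0, 130.0], scale=[10.0, 15.0], size=(1000, 2))
# Stand-in generator: fit a multivariate Gaussian to the real rows and sample.
mean, cov = real.mean(axis=0), np.cov(real, rowvar=False)
synthetic = rng.multivariate_normal(mean, cov, size=1000)
# Memorization check: every synthetic row should sit a healthy distance
# from its nearest real row, i.e. no training record is reproduced exactly.
gaps = np.linalg.norm(synthetic[:, None, :] - real[None, :, :], axis=2).min(axis=1)
print(f"closest synthetic-to-real distance: {gaps.min():.4f}")
assert gaps.min() > 1e-6, "generator may have memorized a training record"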
274
00:13:05.210 --> 00:13:07.580
So the parallel there is that
275
00:13:07.580 --> 00:13:08.840
you're performing the same trick,
276
00:13:08.840 --> 00:13:11.130
but with data rather than with images.
277
00:13:11.130 --> 00:13:14.600
Exactly, we look at a human's record, at
278
00:13:14.600 --> 00:13:15.970
the history, you know, health history,
279
00:13:15.970 --> 00:13:18.150
but also actually a complete history of the demographic
280
00:13:18.150 --> 00:13:19.340
and health data.
281
00:13:19.340 --> 00:13:24.340
And we recreate the same population through an approach like
282
00:13:24.440 --> 00:13:27.530
this one, so fully differentially private,
283
00:13:27.530 --> 00:13:28.590
very high quality,
284
00:13:28.590 --> 00:13:32.380
when you take a physician looking at the history
285
00:13:32.380 --> 00:13:36.560
of the synthetic individual, the physician cannot tell
286
00:13:36.560 --> 00:13:40.090
that it's unreal; it looks real.
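NOTE
A minimal sketch (not from the episode) of the differential-privacy guarantee being invoked here, using the classic Laplace mechanism on a single statistic: noise is calibrated to the largest influence any one member can have, so the released number is almost unchanged whether or not any individual is in the data. The epsilon value and the age bounds are assumptions.
import numpy as np
def dp_mean(values, lower, upper, epsilon, rng):
    """Release the mean of bounded values under epsilon-differential privacy."""
    clipped = np.clip(values, lower, upper)       # bound each person's influence
    sensitivity = (upper - lower) / len(clipped)  # max shift from one record
    return clipped.mean() + rng.laplace(scale=sensitivity / epsilon)
rng = np.random.default_rng(1)
ages = rng.integers(18, 90, size=5000)  # toy member ages
print(f"private mean age: {dp_mean(ages, lower=18, upper=90, epsilon=0.5, rng=rng):.2f}")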
287
00:13:40.090 --> 00:13:41.220
Who had the idea to do that?
288
00:13:41.220 --> 00:13:42.053
Where did you...
289
00:13:42.053 --> 00:13:43.670
I would not have thought of that.
290
00:13:43.670 --> 00:13:47.270
So it's always a mix of people
291
00:13:47.270 --> 00:13:50.080
that are sitting around the table trying to solve
292
00:13:50.080 --> 00:13:52.430
a tough issue that we have.
293
00:13:52.430 --> 00:13:54.410
That's part of it, and quite often
294
00:13:54.410 --> 00:13:58.140
we invite our partners, so in particular,
295
00:13:58.140 --> 00:14:02.060
that technology came from a collaboration with a partner
296
00:14:02.060 --> 00:14:05.250
from Europe who, interestingly enough,
297
00:14:05.250 --> 00:14:10.010
also worked with me at Nokia, so a very talented individual,
298
00:14:10.010 --> 00:14:12.380
that created synthetic data set capabilities
299
00:14:12.380 --> 00:14:15.380
and synthetic data creation capabilities
300
00:14:15.380 --> 00:14:18.040
and got a lot of success with this in Europe,
301
00:14:18.040 --> 00:14:20.470
which, as you probably know,
302
00:14:20.470 --> 00:14:24.350
is of course much more concerned with personal privacy.
303
00:14:24.350 --> 00:14:26.870
And then another set of partners we are working very closely
304
00:14:26.870 --> 00:14:30.220
with right now is Microsoft, on innovation and advancement
305
00:14:30.220 --> 00:14:34.330
of differential privacy in this space.
306
00:14:34.330 --> 00:14:37.070
And then finally, we quite often connect with academia
307
00:14:37.070 --> 00:14:39.490
and we have those connections as well.
308
00:14:39.490 --> 00:14:43.450
This is a great example of how
309
00:14:43.450 --> 00:14:47.250
a real technical topic from a different discipline,
310
00:14:47.250 --> 00:14:52.250
with deep learning and image recognition, makes a tangible
311
00:14:53.310 --> 00:14:56.480
difference in a completely different industry,
312
00:14:56.480 --> 00:15:00.620
and I could imagine maybe 20 years ago, 15 years ago,
313
00:15:00.620 --> 00:15:05.620
clinicians and business folks running a company like Humana
314
00:15:05.770 --> 00:15:08.180
would say, well, that's not the same.
315
00:15:08.180 --> 00:15:09.120
There's no real patient.
316
00:15:09.120 --> 00:15:09.953
It's all synthetic.
317
00:15:09.953 --> 00:15:12.520
We can't trust it, et cetera, et cetera.
318
00:15:12.520 --> 00:15:17.270
My question is: what level of education and sort of
319
00:15:17.270 --> 00:15:22.270
knowledge sharing do you feel you've had to go through both
320
00:15:22.583 --> 00:15:27.560
at Humana and in your prior careers to sort of bridge
321
00:15:27.560 --> 00:15:31.330
that gap between the art of the possible on the technical
322
00:15:31.330 --> 00:15:34.750
side and the business where
323
00:15:34.750 --> 00:15:36.680
the understanding is not the same,
324
00:15:36.680 --> 00:15:39.750
and do you feel like there is still a gap
325
00:15:39.750 --> 00:15:42.453
and how do you bridge that and narrow it?
326
00:15:43.478 --> 00:15:45.379
Shervin, yes, there is a gap
327
00:15:45.379 --> 00:15:49.430
and I think there's a gap between technology
328
00:15:49.430 --> 00:15:50.580
and business understanding,
329
00:15:50.580 --> 00:15:54.430
and there is a gap between technology and ourselves.
330
00:15:54.430 --> 00:15:59.430
We all in this particular field need to continue to learn.
331
00:15:59.910 --> 00:16:03.660
Every few years things change
332
00:16:03.660 --> 00:16:05.720
and sometimes they change totally on us.
333
00:16:05.720 --> 00:16:08.950
So part of it, and I think the first skill is
334
00:16:08.950 --> 00:16:10.490
how do you continue to learn?
335
00:16:10.490 --> 00:16:13.850
It's continuous learning, but it's necessary
336
00:16:13.850 --> 00:16:18.850
for all of us, and as leaders we need
337
00:16:19.173 --> 00:16:22.210
to inspire our teams to do the same,
338
00:16:22.210 --> 00:16:25.240
because even if you hire PhDs in data science
339
00:16:25.240 --> 00:16:28.020
who have been recently trained,
340
00:16:28.020 --> 00:16:30.180
they need to continue to learn,
341
00:16:30.180 --> 00:16:34.230
they need to have a workbench where they can tweak data and
342
00:16:34.230 --> 00:16:35.647
they can learn with others.
343
00:16:36.550 --> 00:16:38.183
And then the other part of it,
344
00:16:38.183 --> 00:16:43.183
which I spearheaded at Humana, is a much tighter link
345
00:16:43.370 --> 00:16:46.900
with product teams that exist in the large technology companies
346
00:16:46.900 --> 00:16:48.834
that we collaborate with.
347
00:16:48.834 --> 00:16:50.550
So what we are trying to do is to make sure
348
00:16:50.550 --> 00:16:53.250
that we are closely in touch with those product teams,
349
00:16:53.250 --> 00:16:54.850
we follow what they're doing,
350
00:16:54.850 --> 00:16:59.230
we participate in customer advisory boards
351
00:16:59.230 --> 00:17:02.110
and through this both help them shape
352
00:17:02.110 --> 00:17:04.940
their products and, for us, get excited
353
00:17:04.940 --> 00:17:08.930
and hopefully drive accelerated adoption
354
00:17:08.930 --> 00:17:12.290
of these new features and functionalities ourselves.
355
00:17:12.290 --> 00:17:13.320
So that's one part of your question.
356
00:17:13.320 --> 00:17:15.420
So how do we stay ahead?
357
00:17:15.420 --> 00:17:17.463
And then of course we have a huge role
358
00:17:17.463 --> 00:17:20.570
of helping our business teams
359
00:17:20.570 --> 00:17:25.060
and our partners in enterprises, where we happen to work,
360
00:17:25.060 --> 00:17:28.330
to also understand the art of the possible
361
00:17:28.330 --> 00:17:32.180
and help us turn this technology knowledge into reality
362
00:17:32.180 --> 00:17:35.300
that actually advances our experience both internally
363
00:17:35.300 --> 00:17:37.850
and also for our members and customers.
364
00:17:37.850 --> 00:17:42.320
My takeaway from what you're saying is hiring
365
00:17:42.320 --> 00:17:47.120
the team and keeping the team at the forefront of the art of
366
00:17:47.120 --> 00:17:51.380
the possible and inspiring them is one part of it.
367
00:17:51.380 --> 00:17:56.380
But also organizations have to take steps to actually bridge
368
00:17:56.770 --> 00:17:59.350
these gaps through all these things that you're talking about,
369
00:17:59.350 --> 00:18:03.680
so that there is more collaboration and sort of cross-
370
00:18:03.680 --> 00:18:08.100
functional teaming and much closer sort of product
371
00:18:08.100 --> 00:18:13.100
management with analytics, with AI, with voice of customer,
372
00:18:13.510 --> 00:18:17.800
all of that so that these ideas are allowed to even incubate
373
00:18:17.800 --> 00:18:19.676
and go somewhere. Is that right?
374
00:18:19.676 --> 00:18:20.823
100% agree.
375
00:18:21.810 --> 00:18:23.300
Well, if we've gotten a hundred percent agreement
376
00:18:23.300 --> 00:18:25.270
from Slawek, I think that's a great time to end.
377
00:18:25.270 --> 00:18:28.141
Thank you for taking the time to talk with us, Slawek.
378
00:18:28.141 --> 00:18:30.724
(mellow music)
379
00:18:32.600 --> 00:18:35.070
Shervin, let's do a quick recap.
380
00:18:35.070 --> 00:18:39.070
Sounds good, Sam. Slawek made some very interesting points.
381
00:18:39.070 --> 00:18:41.600
One of the things that he mentioned a lot was,
382
00:18:41.600 --> 00:18:44.730
how past experiences led to his current role
383
00:18:44.730 --> 00:18:47.510
and he had so many different past experiences
384
00:18:47.510 --> 00:18:50.470
and yet he found ways to apply them.
385
00:18:50.470 --> 00:18:52.330
I thought that was a pretty fascinating point.
386
00:18:52.330 --> 00:18:54.550
I really resonated with that.
387
00:18:54.550 --> 00:18:58.930
He talked about some archetypal problems, like the chemical
388
00:18:58.930 --> 00:19:02.690
engineering problem of system controls
389
00:19:02.690 --> 00:19:04.560
and the lag in the system
390
00:19:04.560 --> 00:19:06.080
and how that has to be managed,
391
00:19:06.080 --> 00:19:11.047
and he likened this to a problem of marketing,
392
00:19:11.047 --> 00:19:14.880
and then more importantly to the problem of managing
393
00:19:14.880 --> 00:19:18.230
the care of millions of patients,
394
00:19:18.230 --> 00:19:21.060
because as they propose different treatments
395
00:19:21.060 --> 00:19:25.010
for different members, there will be a lag between
396
00:19:25.010 --> 00:19:26.830
what's working and what's not working.
397
00:19:26.830 --> 00:19:29.180
And the ability to understand what's working
398
00:19:29.180 --> 00:19:32.220
and what's not working and what that lag time is,
399
00:19:32.220 --> 00:19:36.130
that's almost a standardized chemical engineering
400
00:19:36.130 --> 00:19:39.640
or control systems electrical engineering problem
401
00:19:39.640 --> 00:19:42.100
and his ability to see these archetypes
402
00:19:42.100 --> 00:19:45.410
and sort of transcend discipline and domain
403
00:19:45.410 --> 00:19:50.120
and industry to health insurance is really,
404
00:19:50.120 --> 00:19:51.290
really important.
405
00:19:51.290 --> 00:19:52.130
Gave me a little hope
406
00:19:52.130 --> 00:19:54.950
that humans will still be around for a bit.
407
00:19:54.950 --> 00:19:58.510
It's interesting as well, that in a lot of these examples,
408
00:19:58.510 --> 00:20:01.270
the specific industry details were obviously different
409
00:20:01.270 --> 00:20:04.100
and you can't just blindly apply them
410
00:20:04.100 --> 00:20:06.050
from one industry to another.
411
00:20:06.050 --> 00:20:09.480
And I think there's a role too, for creativity
412
00:20:09.480 --> 00:20:12.750
and being smart about what fits and what
413
00:20:12.750 --> 00:20:13.720
doesn't fit.
414
00:20:13.720 --> 00:20:16.130
That again seems very human.
415
00:20:16.130 --> 00:20:17.010
Yeah, completely.
416
00:20:17.010 --> 00:20:21.220
And the other thing, building on that, is the importance
417
00:20:21.220 --> 00:20:26.220
of, again, building trust, building the trust of the humans in
418
00:20:28.960 --> 00:20:30.480
the AI solutions.
419
00:20:30.480 --> 00:20:35.310
And he talked about synthetic data to do synthetic tests
420
00:20:35.310 --> 00:20:37.980
of different treatments.
421
00:20:37.980 --> 00:20:41.300
And then he talked about the process of showing
422
00:20:41.300 --> 00:20:46.210
the clinicians how that synthetic data actually mimics
423
00:20:46.210 --> 00:20:48.920
the real data, and so I think
424
00:20:48.920 --> 00:20:50.590
that's also very important,
425
00:20:50.590 --> 00:20:55.120
again, building trust in areas where
426
00:20:55.120 --> 00:20:57.920
human judgment really, really matters.
427
00:20:57.920 --> 00:21:00.370
I liked how pragmatic it was...
428
00:21:00.370 --> 00:21:02.600
You know the computers are gonna have problems,
429
00:21:02.600 --> 00:21:05.020
and so you'd much rather find those problems out
430
00:21:05.020 --> 00:21:06.410
with generated data. And I thought
431
00:21:06.410 --> 00:21:10.440
what was creative about his solution was using AI
432
00:21:10.440 --> 00:21:14.470
to generate that data that smelled just like real data,
433
00:21:14.470 --> 00:21:16.190
but that they could afford to screw up with.
434
00:21:16.190 --> 00:21:17.030
I like things like that,
435
00:21:17.030 --> 00:21:20.300
where it makes complete sense once he says it,
436
00:21:20.300 --> 00:21:22.200
but I would never have thought of it myself.
437
00:21:22.200 --> 00:21:23.363
That's a great point.
438
00:21:24.720 --> 00:21:26.570
That's all the time we have today,
439
00:21:26.570 --> 00:21:30.473
but next time, join us as we talk to Gina Chung from DHL.
440
00:21:32.810 --> 00:21:35.590
Thanks for listening to "Me, Myself and AI".
441
00:21:35.590 --> 00:21:37.200
If you're enjoying the show,
442
00:21:37.200 --> 00:21:39.310
take a minute to write us a review.
443
00:21:39.310 --> 00:21:40.920
If you send us a screenshot,
444
00:21:40.920 --> 00:21:44.360
we'll send you a collection of MIT SMR's best articles on
445
00:21:44.360 --> 00:21:47.750
artificial intelligence, free for a limited time.
446
00:21:47.750 --> 00:21:52.726
Send your review screenshot to smrfeedback@mit.edu.
447
00:21:52.726 --> 00:21:55.309
(mellow music)