SHERVIN KHODABANDEH: An all-you-can-eat sushi buffet and artificial intelligence -- how are they related? Find out today when we talk with Sarah Karthigan of ExxonMobil.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Idea program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities across the organization and really transform the way organizations operate.

SAM RANSBOTHAM: Today we're talking with Sarah Karthigan. She's the artificial intelligence for IT operations manager at ExxonMobil. Sarah, thanks for joining us. Welcome.

SARAH KARTHIGAN: Thanks for having me.

SAM RANSBOTHAM: ExxonMobil is one of the world's largest oil and gas companies, and it has existed since the 1870s, long before artificial intelligence. Sarah, can you tell us about your current role at ExxonMobil?

SARAH KARTHIGAN: I am currently responsible for leading the design and execution of self-healing strategies for IT operations, using artificial intelligence. Self-healing, at its core, is proactively monitoring, detecting, and remediating issues without human intervention.

SAM RANSBOTHAM: How did you end up in that role?

SARAH KARTHIGAN: My background is in electrical engineering, and I started at ExxonMobil as a technical lead. I then went down the management career path, but one of my jobs took me up to Clinton, New Jersey, to support the corporate strategic research function.
It was there that I got exposed to data science, artificial intelligence, and machine learning. I was a part of several pilots where we were assessing artificial intelligence capabilities. This inspired me to go back to school, and I pursued my graduate certification in data science at Harvard. One thing led to another, and I came back to Houston to take on my data science manager role.

SAM RANSBOTHAM: Maybe let's start with an example of a project -- maybe self-healing, maybe one of these projects -- [that is] a concrete example of a way you and your team have used artificial intelligence that you couldn't have done before artificial intelligence.

SARAH KARTHIGAN: There are plenty of opportunities for artificial intelligence in the energy sector, but before I give you some examples, I think it's worthwhile to understand the scale of operations in the energy sector. Starting with the basics here: Energy is constantly evolving, and when you think about it, it underpins every area of modern life, right? When you think about mobility or economic prosperity or social progress, access to energy underpins all of that. And at its core, what we do here at ExxonMobil is ensure that we are able to offer reliable, affordable energy to the masses. So the scale of energy itself is quite unimaginable, and the data that we work with is also massive. Big data is not new to the energy sector, so we deal with huge volumes of data. Without artificial intelligence, without data science, without machine learning, you can imagine the amount of effort that goes into just processing and analyzing that data, so artificial intelligence is a big, big advantage. The potential that AI carries for improving overall efficiency and cost effectiveness is huge. We also use artificial intelligence in areas where we are able to automate manual tasks, thereby improving safety and productivity.
And if we are able to get people [out] of harm's way, that's a huge application for artificial intelligence in the energy sector. Additionally, ExxonMobil is an energy company, but at its core, again, we are a technology company, and so we can use AI to help our scientists and engineers in their decision-making process. We are able to augment their decision-making, connect the dots, and help discover insights of value [for] them at a much faster pace, so there are plenty of applications.

My team and I have worked on several use cases. When you think about big data, clearly you can think of potential applications of deep learning when it comes to image processing -- whether that's [at] the front end of the value chain, starting with seismic image processing, or even leak and flare detection -- so we can use artificial intelligence for, again, plenty of use cases. That's one side of things. You can also use artificial intelligence -- and we have used it -- for demand sensing, for dynamic pricing, for dynamic revenue management. We have also used it for trading. So there [are] just so many different applications that my team has been involved in.

SHERVIN KHODABANDEH: Sarah, tell us a bit about self-healing. I think you mentioned building AI systems that can preempt issues or problems or errors or faults -- I don't want to put words in your mouth -- without human intervention. Could you give us some examples of those?

SARAH KARTHIGAN: It all starts with monitoring, right? How well can we monitor our systems, capture the right type of data, and then integrate data that is probably sitting across silos today? It all begins with that: capturing the data, bringing it all together, and integrating it so you're able to have visibility across the different silos that we have in place. It starts with observability. And then, once you have the data in place, now we are talking about: How can we utilize the data? How can we analyze it? How can we teach a machine? How can we train a machine to extract insights out of that data, to look at patterns, to see what typically happens before an incident occurs? It is able to look for those patterns. It's able to understand the history and detect anomalies, and thereby it is able to prompt an end user -- or you can just go ahead and close the loop with automation altogether -- and kick off the necessary automations that need to occur, so we are able to remediate the issue even before it becomes an issue. That is kind of the life cycle of self-healing.
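The loop Sarah describes -- monitor a signal, detect an anomaly against its own history, then trigger remediation automatically -- can be sketched in a few lines. This is a minimal illustration of the pattern, not ExxonMobil's actual tooling: the response-time metric, the z-score threshold, and the restart_app_service hook are hypothetical stand-ins.

```python
from collections import deque
from statistics import mean, stdev

WINDOW = 60          # number of recent samples that define "normal"
THRESHOLD = 3.0      # z-score beyond which a reading counts as an anomaly

def restart_app_service(metric_name: str) -> None:
    """Hypothetical remediation hook -- in practice this would call an
    automation platform (restart a service, scale a pool, open a ticket)."""
    print(f"[self-healing] remediation triggered for {metric_name}")

def self_heal(samples, metric_name="app_response_time_ms"):
    """Monitor a stream of readings, detect anomalies against recent
    history, and kick off remediation without human intervention."""
    history = deque(maxlen=WINDOW)
    for value in samples:
        if len(history) >= WINDOW:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > THRESHOLD:
                restart_app_service(metric_name)   # close the loop automatically
                history.clear()                    # start re-learning "normal"
                continue
        history.append(value)

# Example: steady response times followed by a spike that triggers remediation.
readings = [100 + (i % 5) for i in range(60)] + [250.0]
self_heal(readings)
```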
SHERVIN KHODABANDEH: Yeah, that's very helpful. And tell us a bit about the number of use cases, if you will. How big is this group's span of impact and work?

SARAH KARTHIGAN: There are multiple groups within ExxonMobil because, as you were saying, given the scale of the company, it's not possible to centralize all of the data science capability in just one group. So we do have data scientists and AI engineers -- machine learning engineers -- embedded in the different business functions so they are able to work very closely with the business. And the opportunities -- there are many. We are working on a myriad of those use cases, and they only continue to grow.

SAM RANSBOTHAM: Who initiates these projects? Are these things that your group comes up with, or [do] the business units bring them to you? What's the working relationship there?

SARAH KARTHIGAN: The nature of an AI project, as well as who initiates it, typically comes down to where a business line is in its AI adoption and utilization journey. If they are in the early stages, what you will see is that they are typically looking at a few potential use cases. They are exploring a few enterprise-scale opportunities. That's where it kind of starts. But as they continue down that maturity curve, you will notice that now we're talking about systemic introduction of AI capabilities into core businesses. We're talking about true enterprise-scale opportunities, so we are able to drive data-driven decisions. So, depending on where the business line is in its journey, that dictates the nature of the project as well as who initiates it. The more mature a business line is, the more the business lines initiate the projects themselves.

SAM RANSBOTHAM: What's an example of one that someone has initiated? Can you give us a very specific "Before AI, they were doing X, and then they came along and we said, 'Hey, let's use artificial intelligence, and then we can do Y'"? What's the difference? And can you give us some concretes around one of those?

SARAH KARTHIGAN: I'll start with a simple example. I touched on this earlier: ExxonMobil is a very data-rich company, so big data is not new to us. There's data that is locked up in salt mines, so we have huge volumes of data. In the past, some of our geoscientists and geophysicists had to process a lot of unstructured data, pretty much manually. And they were the ones who were connecting the dots. These were the subject matter experts, so they were ingesting all of this unstructured data, connecting the dots, and identifying the right place for us to go pursue. But now, with the introduction of artificial intelligence, we were able to build an intelligent system that, using natural language processing, is able to ingest huge volumes of data. And we're able to train that system to look for the right type of patterns and to help augment the decisions that a geoscientist or a geophysicist would make. So that is one example of how we use machine learning insights.
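A rough way to picture the kind of system Sarah describes -- software that ingests large volumes of unstructured text and surfaces relevant material for a geoscientist -- is a simple retrieval step over documents. The sketch below uses TF-IDF similarity from scikit-learn purely as an illustration; the toy "reports" and the query are invented, and the production system she references would involve far more than this.

```python
# A minimal sketch of surfacing relevant unstructured documents for an expert query.
# Requires scikit-learn; the sample "reports" below are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = [
    "Well log notes: high porosity sandstone interval, minor gas shows at depth.",
    "Seismic survey summary: salt dome flank shows amplitude anomaly to the east.",
    "Maintenance memo: compressor vibration readings within normal limits.",
]

query = "where have we seen amplitude anomalies near salt structures?"

# Vectorize the corpus and the query into TF-IDF space.
vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(reports)
query_vec = vectorizer.transform([query])

# Rank documents by cosine similarity to the query and show the best matches first.
scores = cosine_similarity(query_vec, doc_matrix).ravel()
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {reports[idx]}")
```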
SHERVIN KHODABANDEH: I was going to ask you, Sarah -- it seems like there is a large amount of human-AI collaboration that has to happen in this example you gave, because we've got to imagine that a series of decisions that used to be performed by human experts -- geologists and engineers -- and that is, over time, being augmented and maybe even entirely performed by AI must have gone through a pretty robust journey to get to a level where those experts are comfortable and actually seek out the machine rather than rely on their judgment. So comment a bit about how that process happens and how you bring the experts and the geologists and the engineers and others from the old-school way to the new-school way. What does that feel like?

SARAH KARTHIGAN: It is a journey, and it starts with, No. 1, understanding what the appetite is for new, emerging solutions within the end-user base, because this is not just a technology challenge; this is very much a cultural challenge. And then, of course, we make sure that we have advocates in the business before we start on any of these AI pilots or AI solutions, because ultimately, the end users need to be bought in. They shouldn't be fighting the solution. They should very much be the ones who are adopting those solutions and who are helping propagate the changes that this would produce. We have seen that having a very robust management-of-change process is crucial for the adoption of an AI solution, for it to become a success. And what we have also learned is that giving the end users an under-the-hood experience of what the tech actually does -- what it brings -- is extremely helpful. They are able to see that this is going to augment what they're doing [and is] not going to replace them.

SAM RANSBOTHAM: What is their reaction? When you give them this solution that does a lot of what they have been used to [doing] before, what is their reaction? How do they feel? What do they say?
SARAH KARTHIGAN: They actually love it when they realize that the machine is actually helping them. And sometimes it is able to even lead them to areas that they may not have checked themselves. I have seen that the partnership goes really, really well once they understand the value that the new solution is able to bring to the table.

SHERVIN KHODABANDEH: You actually led, in your response to this question, with several nontechnical factors first, right? So, "What's your appetite, what's the openness to change, and how badly do you want it?" Which is really quite insightful, because over the last 10 years, it's just been indexed so much toward the technical side of things, and then the change management becomes an afterthought. I was really energized that you actually led with the change management: "Before I do anything, before I write a single line of code, how badly do you want it?" I want to follow on the appetite question. The first time I was offered sushi, my appetite for it was zero. But when somebody effectively forced me to try it, it sort of became my [go-to] food. So how do you balance that act of not forcing the end user, but also helping them understand that what they think their appetite is before they try it is going to be different from what their appetite will be after they try it?

SARAH KARTHIGAN: When I first founded the group, when I had my first set of data scientists, we were actually met with quite a lot of skepticism, to your point -- a lot of people thinking, "All of this is just hype. Why are we doing this? We know what we are doing. We have done what we do very successfully. So why do we have to change it?" So when we started out, it really came down to demonstrating the art of the possible. We were knocking [on] a lot of doors and asking people, "Hey, just give us your data, and you don't even have to engage with us," because folks were, at that time, a little bit skeptical about the amount of time it would require on their part, and they were not necessarily ready to offer that at the get-go. So we started out with, "Just give us your data, and let us come back to you with what we can discover on our own, and see if that is of interest to you or not."

SHERVIN KHODABANDEH: And now you have many sushi restaurants?

SARAH KARTHIGAN: Very much so.

SAM RANSBOTHAM: An all-you-can-eat buffet. So, let's say that you've got these people somewhat convinced and interested, and then you start to put things into production. How do you keep them going? How do you keep them improving? How do you keep them continuously getting better? Do you have processes around that, and, if so, how is that organized?

SARAH KARTHIGAN: I'll tell you this much: It's been an interesting learning experience, because it's one thing to build out a model. It's one thing to go ahead and create a prototype and have everything working. But it's another thing altogether when you're trying to operationalize it. After you operationalize AI solutions, what we have learned is, No. 1, that to make sure they are fully integrated into the business processes, there are several things you have to be aware of and keep tabs on. We ensure, after a solution has been operationalized, that it is being monitored. That is extremely important. Now, we learned very quickly [that] you cannot monitor all the features of the model, so there are some features that you have to home in on -- the ones that have the potential to disrupt; not necessarily break the model, but those with the greatest potential to impact the predictions. We want to home in on those types of features, monitor them, and see if concept drift is setting in, because once a model moves into production, it starts degrading. That's the reality.
So we need to ensure that we are keeping our eyes on the model to make sure that the predictions are still accurate, that they are still useful. We also make sure that our models are being retrained with the latest and the greatest data. We are looking into adopting a weighting mechanism so that more recent data is weighted [more] heavily in retraining a model than older data. And we're also looking into continuous improvement, continuous training, and continuous learning methodologies for our models. So these are some things that we do once a solution has been productized.
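Two of the practices described here -- watching a model's most influential features for drift, and weighting recent data more heavily when retraining -- can also be sketched briefly. The mean-shift drift check and the exponential-decay weights below are generic illustrations under assumed names, not ExxonMobil's monitoring stack; scikit-learn's sample_weight argument simply stands in for whatever retraining pipeline is actually in place.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def feature_drift(reference: np.ndarray, recent: np.ndarray, z_limit: float = 3.0) -> bool:
    """Flag drift when the recent mean of a key feature moves far from the
    mean seen at training time (a crude stand-in for fuller drift tests)."""
    ref_mu, ref_sigma = reference.mean(), reference.std()
    z = abs(recent.mean() - ref_mu) / max(ref_sigma, 1e-9)
    return z > z_limit

def recency_weights(n_samples: int, half_life: int = 500) -> np.ndarray:
    """Exponential-decay weights: the newest rows count the most in retraining."""
    age = np.arange(n_samples)[::-1]           # 0 for the newest sample
    return 0.5 ** (age / half_life)

# Toy data standing in for a monitored feature and a retraining set.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(2000, 5))
y_train = (X_train[:, 0] + rng.normal(scale=0.5, size=2000) > 0).astype(int)
recent_feature = rng.normal(loc=4.0, size=300)   # shifted well away -> should flag drift

if feature_drift(X_train[:, 0], recent_feature):
    weights = recency_weights(len(X_train))
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train, sample_weight=weights)   # recent data weighted more
    print("drift detected: model retrained with recency weighting")
```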
SAM RANSBOTHAM: Within your organization -- and that's about how the models get better -- how do you help the end users get better? You mentioned initially working with them to make sure that there's not too much resistance to even consider trying a model -- that's Shervin even trying sushi in the first place -- but how do you get them to appreciate the finer culinary aspects? I mean, maybe for all we know, Shervin's stuck on the same piece of sushi that he started with years and years ago, but there [are] lots of other types out there. How are you growing that understanding in the user base?

SARAH KARTHIGAN: We have several efforts in progress within the company where we are looking at upskilling our employees -- making sure that we are able to train them on the latest and the greatest emerging technologies so they have enough of an understanding of what AI offers and what potential use cases we can consider. So there's a lot of training work that is happening.

SAM RANSBOTHAM: What are you excited about? I mean, what's new, and what are we going to read about tomorrow that ExxonMobil is doing with artificial intelligence? What's something you're excited about, either a technology or a project?

SARAH KARTHIGAN: What I'm really excited about, and what I hope you get to read about soon, is this self-healing pilot that we are gearing up to do. The self-healing pilot is looking at taking an application that is end-user facing and seeing how many of these self-healing wins we can realize. We have been investing our time in building out the foundation -- the fabric that is important to really bring this whole solution together -- so now we are very much excited about testing that out and putting the strategy into action.

SHERVIN KHODABANDEH: Sarah, as you think about your own team -- building and cultivating and expanding that team -- two questions: What are you looking for in the candidates that you're bringing in? What are some technical and nontechnical capabilities you're looking for? That's my first question. And No. 2 is, how do you keep them interested and excited in data science and AI, with everything that's going on and all the other options out there for them?

SARAH KARTHIGAN: Let me start with answering your second question: How do we keep them interested? We keep them interested by exposing them to diverse use cases. You don't have to leave the company to work on a finance problem; there are opportunities here within the company. And so just the myriad of use cases that the data scientists get to work on and solve is what I have found keeps them excited and makes them want to continue their careers here within the company. That is our secret to retaining talent internally.

As far as what I look for in a candidate: I am quite keen on diversity. I don't want a team that is an echo chamber. I specifically go seek out skills that are in adjacent areas. I have had data scientists on my team whose background is biostatistics. I have even had people with English and political [science] majors. Of course, now I am looking for people with data science skills too -- so, for example, they had an undergrad degree in one of those areas, but then they also studied data science.
I go seek out those types of candidates because it's extremely important to have very diverse viewpoints at the table when you are trying to solve a problem. I am looking for someone who's curious, who is very much interested in problem-solving. And, again, what excites them is challenging problems, and we're talking about a scale that is truly unimaginable.

SHERVIN KHODABANDEH: Sarah, you've been named a leader in tech. You've been named one of [the] 25 most influential women in energy, in tech. What do you think companies could do more of to ensure a fairer gender balance in data science roles, and what do you think data scientists out there -- female data scientists who are just starting their careers -- could be doing more of?

SARAH KARTHIGAN: I would say it all starts with providing equal opportunities. I am here because I got the opportunity to demonstrate what I can do, what I'm capable of doing. Making sure that window of opportunity is truly open for both women and men is crucial, so that's where it all starts. For an aspiring data scientist -- for girls in middle school or high school who are even considering pursuing a STEM career -- my encouragement would be: Yes, absolutely, we need you. Women bring a perspective that is so different and that is extremely needed in the work environment. And especially -- we talked about responsible AI. It is important to have that type of diverse perspective right from the get-go -- from building a strategy all the way to execution. It should not be an afterthought. You shouldn't try to slap on, "Hey, let me go ahead and make sure I address diversity and inclusion at the end." No; that's not how it works. You start with that, and that is crucial. And women play a key role in making that happen.
SHERVIN KHODABANDEH: And what do you think women in data science who are either just starting their careers or are in their academic training could be doing to seek out the right opportunities for themselves? What's your advice for them?

SARAH KARTHIGAN: I would say: Ensure you have really good examples of either a capstone project or experiences with internships or co-op opportunities -- whatever you want to call those experiences -- with companies where you have dealt with real data. I think that absolutely augments your resume. And then, on top of that, once you have found that entry point into a company, feel free to speak up and bring your solutions very vocally to the table. That's what I would say.

SAM RANSBOTHAM: Today we learned a lot about starting with the organizational aspects of an artificial intelligence change versus the technical aspects. [We] learned about leading with the idea of showing people what's possible and what the potential can be from artificial intelligence. We learned about the many steps in the data process that are fraught with peril but that organizations can overcome. I really appreciate you taking the time to talk with us today, Sarah. Thanks for joining us.

SHERVIN KHODABANDEH: Thank you, Sarah.

SARAH KARTHIGAN: It's been my pleasure. Thank you.

SAM RANSBOTHAM: In our next episode, we'll talk with Doug Hamilton about how Nasdaq uses AI to mitigate high-risk situations. Please join us.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn, specifically for leaders like you.
It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.