We hear a lot about the bias AI can exacerbate, but AI can help organizations reduce bias too. Find out how when we talk with JoAnn Stonier, chief data officer at Mastercard.

Welcome to "Me, Myself, and AI," a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the "AI and Business Strategy" Big Idea program at MIT Sloan Management Review.

And I'm Shervin Khodabandeh, senior partner with BCG, and I co-lead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities across the organization, and really transform the way organizations operate.

Today we're talking with JoAnn Stonier. JoAnn is chief data officer at Mastercard. JoAnn, thanks for taking the time to talk with us today. Welcome.

Thank you, happy to be here.

JoAnn, let's start with your current role at Mastercard. Can you give us a quick overview of what you do?

Sure. Currently, as you said, I'm the chief data officer for the firm, and my team and I are responsible for ensuring that Mastercard's information assets are available for innovation while navigating current and future data risk. So my team has a very broad mandate. We work on helping the firm develop our data strategy, and then we work on all the different aspects of data management, including data governance and data quality, as well as enabling things like artificial intelligence and machine learning. We also help design and operate some of our enterprise data platforms. It's a very broad-based role. We work on data compliance as well, and on how you embed compliance and responsible data practices right into product design.
We start at the very beginning of data sourcing, all the way through to product creation and enablement. It's a lot of fun.

Because our show is about artificial intelligence, let me pick up on that aspect. Can you give us examples of something you're excited about that Mastercard is doing with artificial intelligence?

Oh my gosh, I've had so many conversations just this week about artificial intelligence. Most of them centered on minimization of bias, as well as how do we build an inclusive future? The conversations that really excite me are how the whole firm is really getting behind this idea and notion. I've had conversations with our product development team about how we develop a broad-based playbook, so that everybody in the organization really understands how you begin to think about design at its very inception, so that you're really thinking about inclusive concepts. We've had conversations with our people and capabilities team, or what's more commonly known as human resources, about the skill sets of the future: How are we going to not only have a more inclusive workforce at Mastercard, but what do we need to do to provide education opportunities, both inside the firm and outside the firm, so that we can create the right kind of profile of individuals, so that they have the skill sets that we need? But also, how do we upskill? How do we really begin to create the opportunities to have the right kinds of conversations? We're also working really hard on our ethical AI process. So there are so many different aspects of what we're doing around artificial intelligence, not just in our products and solutions in fraud and cyber, but in general analytics and in biometric solutions around digital identity. It's really an interesting time to do this work, and to do it in a way that I think needs to last for the generations ahead.

JoAnn, that's really interesting.
I'm particularly excited that one of the things you talked about is the use of AI to prevent bias. There have been a lot of conversations around the unintended bias of AI and how to manage it, but I heard you also refer to it as a tool that can actually help uncover biases. Can you comment more about that?

Yeah, but it's hard, right? This is a lot of hard work. I have a lot of conversations, both with academic institutions and with other civil society organizations. We're in the early days of AI and machine learning; I think we're probably at generation maybe 1.5, heading into generation 2. But I think the events of this past year have taught us that we really need to pay attention to how we are designing products and solutions for society, and that our data sets are really important: what we are feeding into the machines, how we design the algorithmic processes that we're also feeding into the intelligence, and what it is going to learn from us. So I try to explain to people that in our first generation of data analytics, we were creating algorithms or analytic questions that really asked: If this condition exists in the data set, then do something, right? We were looking for the condition to exist in the data set, and then we were acting.

Rules-based.

Right, rules-based, exactly. Artificial intelligence and machine learning actually flip that a little bit on its head. Instead, we take the data set and say: What does the data tell us, and what does the machine then learn from that? Well, the challenge of that, though, is if you don't understand what's in the data, and the condition of that data as you bring it into the inquiry, and you don't understand how those two things fit, then you wind up with a biased perspective. Now, it's unintended, it's inherent, right, to your point.
But nevertheless, it's something where we have to back up and readjust our lenses as we begin to look at our processes around artificial intelligence. So we start with the data sets, and we understand that data sets are going to have all sorts of bias in them, and that's okay. First of all, bias always gets kind of a very gripping reaction, right? I use the example all the time: If you go back to, say, the 1910 voter rolls in the United States, that's a valid data set. You may use that for whatever purpose you may have for evaluating something that happened in 1910 or 1911. But you need to know that, inherently, that data set is going to be missing women. It's going to be missing people of color. It's going to be missing parts of society. As long as you know that, then you can design an inquiry that is fit for purpose. The problem is if you don't remember that, or you're not mindful of that, then you have an inquiry that's going to learn off of a data set that is missing characteristics that are going to be important to whatever that inquiry is. Those are some of the ways that I think we can actually begin to design a better future. But it means being very mindful of what's inherent in the data set: what's there, what's missing, and also what can be imputed. We can talk about that: what kinds of variables can be created by the machine, right? Those can be imputed as well. And all of those things are things that we're looking at at Mastercard. So we have a very specific framework that we're using around ethical AI, but then we're really getting into the nitty-gritty around different kinds of contexts.
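To make that kind of data set review concrete, here is a minimal sketch in Python of the audit JoAnn describes; the column names, reference shares, and tolerance are hypothetical illustrations, not Mastercard's framework. It compares how groups are represented in a data set against an expected population and flags anything absent or underrepresented before the data feeds a model.

```python
import pandas as pd

def audit_representation(df: pd.DataFrame, column: str,
                         reference_shares: dict, tolerance: float = 0.10) -> list:
    """Flag groups that are absent or underrepresented in df[column]
    relative to expected population shares (illustrative thresholds only)."""
    observed = df[column].value_counts(normalize=True, dropna=False)
    findings = []
    for group, expected in reference_shares.items():
        actual = float(observed.get(group, 0.0))
        if actual == 0.0:
            findings.append(f"{group}: absent from the data set (expected ~{expected:.0%})")
        elif actual < expected * (1 - tolerance):
            findings.append(f"{group}: underrepresented ({actual:.1%} vs. ~{expected:.0%} expected)")
    return findings

# A 1910-voter-roll-style data set would flag its gaps immediately.
voters = pd.DataFrame({"gender": ["male"] * 98 + ["female"] * 2})
print(audit_representation(voters, "gender", {"male": 0.49, "female": 0.51}))
```

The data set itself isn't "wrong"; the point, as she says, is that an inquiry built on it is only fit for purpose if those gaps are known up front.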
So I want to build on that, because I think it's a very critical topic for so many of our listeners. You talked about how, as long as you know a bias is there, you can look for it. But the things you don't know, that you only find out in hindsight: How do you prepare for those?

Well, I think you have to be aware, again, of the purpose: To what end? And then let's go look at the data sets. This is where you've got to back up the data food chain to understand your data lineage, and understand the quality and the consistency of the data sets that are going into the analysis. And this is not easy. I want everybody to hear me clearly: Nothing I'm saying is easy, right? Everything requires a lot of scrutiny and time. And that's easiest if you're doing it yourself. If a vendor is doing this and presenting it to you as a combined solution, you need to be even more curious about what is going into that recipe. Every one of these elements becomes super important. Why? Because if we do it when we're building it, those of us who are the data scientists and the data experts, the data designers, we understand it, because when we look at the models and the outputs as they come out in the first generations, we're going to be looking for the model drift. We're going to be looking to see: Is this an accurate output? Is this truly the result, or is it because of some inequity in the input? If it's accurate, that's okay, because the machine is then going to start learning and we're going to put in more data and more information. But if it's incorrect and we haven't caught it, it could get amplified to the incorrect result. And this is what we also really care about when we get to the output analysis: It could amplify really bad outcomes. And the outcomes can be significant or insignificant, depending upon whether it's being designed for an individual or for groups, or if it's just something... I talk about how I suspect that when the airlines were first modeling the size of the overhead bins, it was on very average heights, okay? But the average height did not include enough women.
How many ladies climb on seats to get their luggage at the very back of the bin, or need to rely on one of you nice gentlemen to grab our overhead suitcases down to assist us, right? Averages sometimes are the worst result, because they're not fit for purpose.
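The model-drift and output checks she describes can be sketched in a few lines. This is a rough illustration with made-up scores, group labels, and thresholds, not anything Mastercard runs: it compares a model's current score distribution against its launch baseline and looks at outcome rates by group, the kind of signal that prompts the "is this accurate, or an inequity in the input?" question before a result gets amplified.

```python
import numpy as np

def population_stability_index(baseline, current, bins: int = 10) -> float:
    """Population stability index between two score distributions;
    values above roughly 0.2 are often treated as meaningful drift."""
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_counts, _ = np.histogram(baseline, bins=edges)
    curr_counts, _ = np.histogram(current, bins=edges)
    base_pct = np.clip(base_counts / base_counts.sum(), 1e-6, None)
    curr_pct = np.clip(curr_counts / curr_counts.sum(), 1e-6, None)
    return float(np.sum((curr_pct - base_pct) * np.log(curr_pct / base_pct)))

def outcome_rate_by_group(scores, groups, threshold: float = 0.5) -> dict:
    """Share of records scored above the threshold, per group,
    to spot outcomes that skewed inputs may be amplifying."""
    scores, groups = np.asarray(scores), np.asarray(groups)
    return {g: float((scores[groups == g] >= threshold).mean()) for g in np.unique(groups)}

# Hypothetical monitoring step for a deployed scoring model.
rng = np.random.default_rng(0)
baseline_scores = rng.beta(2, 5, 10_000)   # score distribution at launch
current_scores = rng.beta(2.6, 5, 10_000)  # score distribution this week
print("PSI:", round(population_stability_index(baseline_scores, current_scores), 3))
print("Rates by group:", outcome_rate_by_group(current_scores, rng.choice(["A", "B"], 10_000)))
```

A drifted distribution or a lopsided rate isn't automatically wrong, as she notes; the check is whether it reflects reality or an inequity in the input, caught before the machine keeps learning from it.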
I think you're talking about injecting an entire new layer into the data analysis or data usage process, right?

Maybe. (chuckles)

Because, was it not the case, Sam, that 20 years ago people thought of data as neutral? Like, data is neutral; you use what you want. But data can be potentially harmful. And I think what you're talking about is that you're trying to create a mindset shift that says it's not just data comes in, answers go out; you have to be mindful of why you're using it and how you're using it. And this mindset shift permeates not just your organization but also, you're saying, your vendors and other parts of the ecosystem. It's a huge undertaking.

It is a huge undertaking.

So those are some challenging aspects. What's rewarding? What kind of exciting things happen? These seem a little bit like, "Oh gosh, we have to worry about this, we have to worry about that!" Get us excited about the fun part.

Oh, I think all of this is fun, I do. But I'm a data geek, right? So what's fun is actually seeing my firm come to life around this. These are actually very exciting conversations inside of the firm; I don't mean to make them sound like just risk conversations at all. It's intellectually challenging work. But we all agree that we think it's going to bring us to better places, and it already is bringing us to better places in product design.

You talked about the workforce aspect of it, and that technology is necessary but far from sufficient. We heard from one of our other speakers, the president of 1-800-Flowers, about learning quotient, the desire to want to learn. How do you bring all of that together in terms of the future of the workforce that you're looking at, and the kind of talent and skill sets and attributes that are going to be successful in different roles? Are you taking actions in that direction?

We really look for people who can also design controls and processes and make sure that we can navigate those. We live in a world of connected ecosystems, and so we're only as good as our partners. But it means those handshakes, right? Those connections are super important to understand. So how do you make those inquiries of other organizations so that you can create those connected ecosystems that are only going to grow in size and scale in the future? How do we help design those? How do we be the leaders in designing some of those? That's really, really important for us. So it really is this meshing. I've talked about it being like a three-sided triangle of data skills...

Is there a different kind? (all laughing)

But we sit in the middle, right? We sit in the middle of the business. We sit with technology skills. And there's also an awful lot of policy that we often need to understand. It's the other design constraint: where laws are evolving, where the restrictions are, the other types of restrictions that we need to know. Does the data have to stay on soil? Can we not use the data because it's contractually restricted? Or is it restricted because it's a certain type of data, say financial data, that can only be used for one specific type of purpose? Or it can only be used at an aggregated level, for example, or at a segment level. Those types of restrictions at the compliance level also need to be understood. So sitting in the middle of the middle is sometimes really hard. That's why you have to be able to admit when you don't know one aspect of that triangle.
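Restrictions like these are often easier to reason about once they are written down as explicit rules. Here is a small, hypothetical sketch in Python of encoding the kinds of constraints she lists (purpose limits, on-soil residency, aggregation level); the field names and example policy are illustrative, not Mastercard's compliance model.

```python
from dataclasses import dataclass, field

AGG_LEVELS = {"individual": 0, "segment": 1, "aggregate": 2}

@dataclass
class DataUsePolicy:
    """Hypothetical encoding of data-use restrictions on a data set."""
    allowed_purposes: set = field(default_factory=set)  # e.g., {"fraud_scoring"}
    residency: str = ""                                  # e.g., "EU" if the data must stay on soil
    min_aggregation: str = "individual"                  # least-aggregated level permitted

def check_use(policy: DataUsePolicy, purpose: str, region: str, aggregation: str) -> list:
    """Return the reasons a proposed use violates the policy (empty list means allowed)."""
    violations = []
    if policy.allowed_purposes and purpose not in policy.allowed_purposes:
        violations.append(f"purpose '{purpose}' is not permitted")
    if policy.residency and region != policy.residency:
        violations.append(f"data must stay on soil in {policy.residency}")
    if AGG_LEVELS[aggregation] < AGG_LEVELS[policy.min_aggregation]:
        violations.append(f"use must be at {policy.min_aggregation} level or higher")
    return violations

financial_data = DataUsePolicy(allowed_purposes={"fraud_scoring"},
                               residency="EU", min_aggregation="aggregate")
print(check_use(financial_data, purpose="marketing", region="US", aggregation="individual"))
```

Making the constraints explicit up front is what turns them into design parameters rather than late surprises.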
So you mentioned the word design a couple of times, at least. (JoAnn laughs)

Here it comes. (Sam laughs)

Can you tell us a little about how you got to your role? Because I think the design thinking is coming from your background. Can you share with us how you got to where you are?

I think it's probably easier if I just go quickly backwards. Prior to being the chief data officer, I was the chief privacy officer for the firm, which was a lot of fun; I enjoyed that role immensely. I helped the company become GDPR compliant, which really was, I think, the moment in time when many, many companies were coming to the place where they needed to operationalize a lot of data risk. And we had been on that journey. We had a whole process called privacy by design, and we still use that process. Lots of companies use that phrase because it's a regulatory phrase. But we had already been looking at our products and solutions to try to understand how we could embed privacy and security into their very design and into their fabric, and we still do that to this day. But as we were doing the compliance work and planning it out for the GDPR, the European privacy law, we realized we were going to need additional platforms and systems to be built. And as I was doing that, I kept saying to anyone who would listen, "Who's going to own all the data? Because I really need somebody to speak with." And they just kept patting me on the head, saying, "You just keep designing everything and it'll be fine." So here we are: I'm the first chief data officer. But it was kind of an expansion, and then a severing, of a lot of the work that I had been doing. And how did I come to be the chief privacy officer? I was previously the chief privacy officer for American Express, and I came to that role after a pretty good-sized career at American Express as well.
Working in a variety of roles, I understood how that firm works. So financial services is something I've been doing for a while. But the design piece comes in because of 9/11. I had the misfortune of being downtown that day, and the American Express building is right across the street from what was the World Trade Center. So I saw a lot and lost colleagues that day. And we were all relocated for several months while the building was repaired. And so around New Year's of the next year, it was probably actually January 4th of 2002, I was thinking, well, there are a lot of people who were not alive that New Year's. And I thought, well, what can I do? What haven't I done in my life that I really want to go accomplish? And so I decided to go back to design school. This is after law school, so my law transcripts were part of the process, which was fun: getting them to send my transcripts to a design school. But I got a design degree, and I'm really glad I did, because it's made me a better businessperson, and it's given me a whole different way of thinking about problem-solving. And then I've had the good fortune. So yes, I do interior design work, and yes, I can help you with your kitchen and your bathroom and all the other things you want to do during COVID. I've helped several colleagues pick out tile and other things.

All right, we'll talk after the podcast about that.

We can talk after the podcast about that. But I also then had the good fortune to meet the dean at Pratt Institute, and I teach in their design management master's program. I teach business strategy, and I've taught other courses in that program. And that's also helped me really evolve my design thinking as well. And so, all of these things: Yes, I have a law degree; yes, I'm a data person. But having the design thinking as well really makes you not think of things as problems.
They're just constraints around which you have to design things, and they will shift over time. So it's no different than: the electric is over here, the plumbing is over here, and you only have so much space, so how can you utilize this, right? It's the same thing for: I can only use the data in this way, and I want to achieve an outcome; how do I do that? And so it is the same kind of strategic toolkit. You just kind of flip it from "Okay, well, the powder room has to be only this big" (chuckles), so that's one challenge, to: The outcome is to design a look-alike model for fraud that only utilizes synthetic data, right? Or something along those lines. But it does kind of give you a little bit of a can-do spirit, because you figure something has to be possible, because there's raw material in the world. So that's kind of how I approach things.

Very well said.

That's a very weird triangle.

It seems the triangle is a very weird...

My triangle is probably very weird; I get that a lot.

Well, that's a bit of a theme, though. Other people we've talked to are bringing in these experiences from other places, and very often they're not super technical things. We're talking about artificial intelligence, and you might think that it would quickly go down an extremely technical path, but I think it's just as likely to go down a path of design.

Yeah, and what we find on my team, and generally, is that you find people who have kind of that combination of right brain and left brain. So I have lawyers and engineers on my team, some of them with both degrees. It's a really interesting match of being able to translate the business to the technical, the technical to the business, right? Translating the regulatory to the business, or the regulatory to the legal. It kind of works as a giant translator role. And if you think about it, that's kind of the moment that we're in right now.
It's the ability to be fluid in translating concepts from one domain into another. I think it works. But I do think that some of those competencies are going to be equally important or more important as we continue to evolve, as we begin to develop. There will still be the core skill sets that we've always had: leadership skills and problem-solving skills and analytics skills. But I do think that the ability to translate, and then also to be able to derive meaning from the output, whether it's from a dashboard or reports or whatever, is also going to be a really important business skill for any businessman or woman.

You said "right now"; you said that's the place we are right now. What do you think has caused that? What's making that so important right now?

I don't know that I was emphasizing that. But I think when I look over the past year... we're just at the one-year mark of COVID for us. I can remember when we went virtual. I remember so many of the customer and client calls that I was helping to field at the time. Companies were in a moment, right? So many merchants were going online; if they didn't have a huge presence, they needed to create one. Business models were really shaken. But everybody was looking for data to try to help make some decisions, right? And they needed guidance. If they didn't have the data, they were looking for data. If they had the data, they were looking for guidance on how to interpret it. If they knew how to interpret it, they were looking for look-alike information to make sure they were making the right decisions. So if you just look at that maturity curve, we have been thrust forward in time. I think it would have been happening anyway, maybe in a less compressed way. I think we've been in a time of compression where data and digital have mattered, and we can't undo that.
And so I think these skill sets matter more and more, just because of the compression we've been under as a society. And we're going to see that be part of this next normal, or next generation, because I don't really like the word normal. In this next generation, this is now part and parcel of how we interact with each other. We will never go back to just being in person. This kind of connecting digitally will always now be part of our mandate, right? And part of the toolbox. So I just think that there will be more of a need to be able to interpret and to utilize data than we had a year ago.

It seems like your superpower is taking some situation, whether it's 9/11, and turning it into design school, or taking a pandemic and turning it into a more fluid approach to things. That sounds like your superpower in many cases.

Thank you. (chuckles) I'll take that one.

JoAnn, thank you so much for talking with us. You've really picked up on a lot of points; we've covered a lot of ground. Thanks for taking the time to talk with us today.

Yeah, thank you so much.

You're welcome. Thanks for having me.

(gentle music)

Shervin, that was great. JoAnn covered a lot today. What struck you as particularly interesting?

A lot of things struck me in a positive way. I liked how she talked about design thinking. She talked about emotional quotient. And I actually think that's been part of her secret sauce and part of Mastercard's success with these kinds of initiatives. If you look at the industry, the CDO role is actually a somewhat precarious role, because there are very successful CDOs, and there are many CDOs who just don't have that vision or that ability or that autonomy, or sort of that multi-sided triangle she talked about. And so they don't succeed, and they only last for a couple of years. So I think there's something to be said for that.
What did you think, Sam, about her comments on the implementation challenges?

Yeah, exactly. I thought it was interesting that they've not changed what they're doing in terms of AI projects just because some of it is hard. I think they recognize...

They've doubled down on it.

They've doubled down on it. That in itself is a little bit of a validation that they believe in what they're doing: Even though it's hard, they're still continuing. I think they're picking projects based on their strategic focus rather than picking a project just to do AI. That says something that just keeps coming up. No one's out here saying, "Hey, let's do some AI today. It's Monday morning, let's do some AI." These people are trying to deliver on things that are in their organization's strategy, and it turns out that in many cases, AI is the tool. But if it's not, that's okay too. But it often is.

Strategy with AI, not strategy for AI. Where did I read that?

Yeah, I know, right? Didn't you write it?

(chuckles) That was all David.

The other thing I thought was quite insightful, and she went there very early on, is about trust and the ethical use of AI: reining in the AI solution to make sure it's ethical, but also using it to uncover hidden bias, so that you could be more inclusive, more aware of the workforce, of your customers, of the ecosystem.

What I thought was particularly insightful there was the need not to view all data as equally neutral, and that depending on where the data goes, very harmful things can come out of it. Which necessitates another layer. She hinted at it, right? There's a layer of awareness and governance, and a mix of tech artifacts as well as process protocols, to make sure that the outputs are ethical and unbiased. And she talked a lot about the need for a mindset change for that to work: that people have to ask, what am I using it for? What is it going to do?
What could go wrong? And constantly probing for things that can go wrong so that you can preempt them.

But she also saw it as a tool to prevent bias. We've seen so many stories out there about, "Oh, this AI system is biased. That AI system has caused bias." And I'm not denying those are true. I'm not a bias denier. (Shervin chuckles) There's an opportunity that she saw in that data to rectify that.

They are a fact.

No amount of retrofitting is going to take that away. But she also said we don't ignore the data, either. We just have to know what its provenance is.

I think that's a great point, Sam.

Thanks for joining us today. Next time, Shervin and I will talk with Chris Couch. Chris is the senior vice president and chief technology officer at Cooper Standard. Please join us.

Thanks for listening to "Me, Myself, and AI." If you're enjoying the show, take a minute to write us a review. If you send us a screenshot, we'll send you a collection of MIT SMR's best articles on artificial intelligence, free for a limited time. Send your review screenshot to smrfeedback@mit.edu.

(gentle music)