SAM RANSBOTHAM: When outcomes don't motivate artificial intelligence efforts, how can they be successful? Find out how one chief data officer thinks about AI on today's episode.

JACK BERKOWITZ: I'm Jack Berkowitz from ADP, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of analytics at Boston College. I'm also the AI and business strategy guest editor at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build and to deploy and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today, Shervin and I are excited to have Jack Berkowitz, chief data officer at ADP. Jack, thanks for joining us. Welcome.

JACK BERKOWITZ: Thank you. Glad to be here.

SAM RANSBOTHAM: Let's get started. You're the chief data officer at ADP. Can you tell us about what that role means?

JACK BERKOWITZ: ADP, known as Automatic Data Processing, is the world's largest provider of HR services, payroll, taxes, things like that. We operate in 140 countries. We have over 900,000 clients. Millions of people are getting paid from us every day. I sort of have a two-sided job. On the one hand, I'm responsible for all of the data that flows through our systems. We're a really big company. We have massive amounts of data, so [it involves] all the things that are classically associated with chief data officers -- things about data governance, data security, usage of analytics.
The other side of my job -- and it's probably even a bigger job -- is I build data products, and so my team builds people analytics, benchmarks, compensation information, all [those] types of products that our clients are using to take decisions about the world of work every day.

SAM RANSBOTHAM: I didn't hear the words artificial intelligence in there anywhere. How is that involved?

JACK BERKOWITZ: I also run that for the company as well, but we use machine learning throughout those processes -- whether we're cleaning the information, whether we're building embedded capabilities in our HR applications or our payroll applications, whether we're doing things like aligning job titles. People would say, "Well, how hard can that be?" You know, in any given month, we pay about 21 million people. We have about 14 million job titles, and we crunch that down to between 6,000 and 8,000 job titles -- so [there's] an awful lot of very sophisticated natural language processing and machine learning to make that happen.
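A rough, purely illustrative sketch of that kind of title consolidation -- not ADP's actual pipeline -- might match messy titles against a small canonical list using character n-gram similarity. Every title, threshold, and library choice below is an assumption made for the example.

```python
# Hypothetical sketch: map messy job titles onto a small canonical set using
# character n-gram TF-IDF similarity. Titles and the 0.35 threshold are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

canonical_titles = ["software engineer", "registered nurse", "payroll specialist"]
raw_titles = ["Sr. Softwre Engineer II", "RN - Med/Surg", "Payroll Spec."]

# Character n-grams are forgiving of abbreviations, typos, and word order.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
canonical_matrix = vectorizer.fit_transform(canonical_titles)
raw_matrix = vectorizer.transform(raw_titles)

similarities = cosine_similarity(raw_matrix, canonical_matrix)
for title, scores in zip(raw_titles, similarities):
    best = scores.argmax()
    # Low-confidence matches would go to a human-review queue in practice.
    match = canonical_titles[best] if scores[best] > 0.35 else "<needs review>"
    print(f"{title!r} -> {match!r} (score {scores[best]:.2f})")
```

At the scale Berkowitz describes, a production system would presumably rely on far richer models and review workflows; this only illustrates the shape of the matching problem.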
SHERVIN KHODABANDEH: It seems like there's three different roles that you mentioned that all come together. And I say this because at many companies, there are literally three different roles for what you mentioned -- for data governance, for data products, and for AI -- which creates maybe a bit of siloed-ness and a bit of maybe disconnectedness, because all these things have to work together. Comment a bit, please, on how it came about that it's one person leading all three. That's my first question. And then, my second question is, is the AI involvement only in the data products, or is it a broader role that you have that you're also supporting AI for the broader enterprise?

JACK BERKOWITZ: It's a really good question. The thing to know about ADP is, yes, we're a services company, in the sense that we provide, for example, payroll for about 1 in 6, or even more than that, people in the U.S. But we also are a SaaS [software as a service] product company, and because of that, there's a whole bunch of different development organizations working on building SaaS products, whether it's for the small businesses all the way up to the biggest companies in the world using our applications to do HR or recruiting or payroll or taxes, things like that. And because of that, this role emerged, really. It started as building data products, but to build data products and things like reporting, it grew the data platforms. And off the data platforms, it grew more and more capabilities in terms of doing machine learning, best practices. We got into the ethical use of data and the ethical use of machine learning and AI, and that allowed us to be additive in terms of capabilities.

The other thing about it, then, is, OK, well, where's the extent? Because we have all of those SaaS applications, my teams will sometimes build the embedded capabilities for other applications. But we also enable those other development organizations to use the frameworks that we build. We, for example, build a whole bunch of machine learning operations capabilities -- things about bias monitoring and data shape monitoring -- because that makes sense to be done once in a company and then allow other people to take advantage of it. We've seen a massive growth in people identifying themselves as data scientists over the past four years. We've been hiring people and everything else, but they don't all have to learn how to do model deployment into production.
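As a loose illustration of the shared machine learning operations checks Berkowitz mentions -- data-shape (drift) monitoring and bias monitoring built once and reused -- here is a minimal sketch. The thresholds, synthetic data, and group labels are assumptions, not ADP's framework.

```python
# Minimal sketch of reusable monitoring utilities: a data-shape (drift) check
# plus a coarse selection-rate bias check. All numbers here are synthetic.
import numpy as np
from scipy.stats import ks_2samp

def has_drifted(reference: np.ndarray, current: np.ndarray, p_threshold: float = 0.01) -> bool:
    """Flag a feature whose production distribution differs from training data."""
    _, p_value = ks_2samp(reference, current)
    return p_value < p_threshold

def selection_rate_gap(scores: np.ndarray, groups: np.ndarray, cutoff: float = 0.5) -> float:
    """Largest gap in positive-prediction rate across demographic groups."""
    rates = [(scores[groups == g] >= cutoff).mean() for g in np.unique(groups)]
    return max(rates) - min(rates)

rng = np.random.default_rng(0)
train_feature = rng.normal(0.0, 1.0, 5_000)
prod_feature = rng.normal(0.4, 1.0, 5_000)  # intentionally shifted for the demo
print("drift detected:", has_drifted(train_feature, prod_feature))

scores = rng.uniform(0.0, 1.0, 1_000)
groups = rng.choice(["A", "B"], size=1_000)
print("selection-rate gap:", round(selection_rate_gap(scores, groups), 3))
```

The point of centralizing checks like these, as he notes, is that every product team can reuse them rather than reinvent deployment and monitoring on its own.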
SHERVIN KHODABANDEH: Very interesting. Is it fair to say that introduction and maybe scaling of AI more broadly outside of the data products that you do for your customers was sort of the data products themselves? The incubation of these data products opened the eyes of the organization.

JACK BERKOWITZ: Yeah, exactly. It was exactly that. It was this incubation. And then, off the incubation, we started to see areas of opportunity and areas of excitement. It wasn't really a top-down push. It was very much a bottom-up, where teams were seeing what we were achieving, and then other teams would come to us and say, "Hey, wait a second. We want to build a capability. Can you work with us?" And so it's really become an organic growth.

SHERVIN KHODABANDEH: I really love this story. Often, I get asked to talk to groups or [do] interviews with media around chief data officer roles, and there's a question around, "What's the right chief data officer role?" And I've always been saying that role has to be really, really linked to the use of data, not just to the governance of data and to building things with data. And I think you're a great example of the right setup of that role and success with that role.

JACK BERKOWITZ: It's interesting, because my career's all been about product development or outcomes. It's been about making sure that you have business outcomes. Are you building something, are people buying it, or are you building something and you're coming out with a better capability? We bring that product mindset to even our data governance. Yes, we need to do governance, but it's not for regulatory compliance, necessarily. It's really about making sure that we understand the information such that somebody can build a good data product on top or a good machine learning capability on top. Otherwise, why are we doing all this?

I graduated right at the time of a recession. Sound familiar? I spent about 10 years in engineering consulting, mostly for DARPA, which is the Defense Advanced Research Projects Agency, which gets you involved in interesting things. From there, I decided to start a company with a few friends and got into the startup world, and then I had a great opportunity to join Oracle, maybe 11 years ago now, and really enjoyed my time there. And then I was able to bring it to ADP four years ago.
ADP has really been the pinnacle of my career. I couldn't have asked for a better situation in terms of combining all those learnings of how you watch a user doing something, how you start a business in these little startup companies that were VC-backed, to the data and the technology. We have to run these systems 24/7. Companies depend on these systems to pay their employees, which is, one would argue, one of the most important things that exists in a company, particularly today, in today's environment.

SHERVIN KHODABANDEH: Can you share with us some uses of AI in the products that you build for your customers, as well as maybe those that are broadly used for the enterprise or for maybe core processes or more internally focused?

JACK BERKOWITZ: We're in the HR space, so [that] runs a wide range of capabilities. One of the things that we're doing right now [that] we're really excited about is, we've used that job title information along with a lot of other natural language processing to come up with a skills graph -- a 100% data-driven skills graph. A lot of other vendors do it with hand-cranked ontologies. Downstream from that, that skills graph shows up in a variety of places, whether it's in recruiting applications, the employee profile inside of a company so that people can find new roles, things like that.

We have a group that's doing recommendations for people in retirement programs. We have a big retirement program for people in small businesses. A lot of times, [with] small business, people aren't offered healthcare or retirement. ADP makes those services available to small-business people to offer to their employees. And there's a capability that recommends to people to say, "Hey, people like you will actually invest more or less in their retirement program." And so that's a machine learning-based capability.

But then we also, just like any other company, use machine learning across the board in other areas. We do a lot of things in our sales and marketing channels. But, more importantly, we do a lot of things in our service [channels]. So we're doing an awful lot right now to create a self-service environment for our clients. Our ability to create a better service environment for them creates a better experience, right? They get more accurate pay or they get a better experience for their employees. In turn, that's better business for us, so we're spending as much money and as much time on making that service experience great as we are [on] making the core product great.
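One simple way to picture the "100% data-driven" skills graph he describes earlier in that answer is co-occurrence: skills that repeatedly show up together in the same job records become linked, with no hand-built ontology. The toy records and the noise cutoff below are invented for illustration.

```python
# Toy sketch of a data-driven skills graph: skills that co-occur in the same
# job record become linked, weighted by how often they appear together.
from collections import Counter
from itertools import combinations

job_records = [
    {"python", "sql", "machine learning"},
    {"sql", "payroll", "excel"},
    {"python", "machine learning", "nlp"},
]

edge_counts = Counter()
for skills in job_records:
    for a, b in combinations(sorted(skills), 2):
        edge_counts[(a, b)] += 1

# Keep only edges seen more than once to suppress one-off noise (arbitrary cutoff).
skills_graph = {pair: count for pair, count in edge_counts.items() if count > 1}
print(skills_graph)  # {('machine learning', 'python'): 2}
```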
SAM RANSBOTHAM: I know you've advocated for the idea of an AI ethics board. I'll take the counter: Why is that important? What's the benefit of that? Why bother setting up AI ethics boards?

JACK BERKOWITZ: We started it because we just felt that the pace of technology and the pace of data probably weren't representing the values that we wanted to represent with our clients. We started it originally because we thought it was the right thing to do. Where it's gone from there has really been interesting. We're learning a heck of a lot, both in terms of our own product development but also externally about how to educate not just our clients but even our ADP associates, [and] in terms of how we evaluate where we want to do business, like biometrics or voice recognition, or even what data access rights mean. There's also now, three years later, a big regulatory push -- in the EU, from the FTC in the United States, from the EEOC [U.S. Equal Employment Opportunity Commission] -- and so we're not reactive to that. We know what to think about. We're in a great position to deal with it by thinking a little bit ahead.

SHERVIN KHODABANDEH: From a setup and accountability perspective, do you see the topic of ethics and responsible AI governed by a board or governed by a person advised by a board?

JACK BERKOWITZ: We're much more in the latter, and we want to do that for a reason. We bring external experts onto our board. We have people from the HR domain.
We'll have people from the machine learning world. We'll bring [in] ethicists. We want that board to have freedom of thought. We have very structured product release processes, whether it's for the products that we release to clients or whether it's the products that we use internally. And we have governance throughout there, and security; if you can imagine, data security is at the top of our list at the moment. Also on the board is our chief privacy officer. So we want the board to have freedom of thought. And it's an adviser. Project teams have to present to the board as part of going to market for areas.

SAM RANSBOTHAM: It's interesting, Shervin. You know, Jack, I don't want to discount how important and sensitive your data is, because clearly it's very sensitive data. But it's interesting. Shervin and I talk with people; we'll talk with people in medical and health, and everyone seems to have this moment of, like, "Oh gosh; oh, our data is really sensitive and important." And I think maybe that's ubiquitous now, that all data seems to be like that. I guess a lot of people can learn from a board set up like this.

JACK BERKOWITZ: It's not a bad thing to be protecting people's interest in their information.

SAM RANSBOTHAM: It's really pretty fascinating, because there are lots of sources of information about salary, and people self-report in lots of areas, but you've got ground truth on a lot of information about what actually hits their bank accounts. It gives great insight into really what's going on in the economy.

JACK BERKOWITZ: It creates a unique capability, both to be able to provide that information to our clients or to their employees or associates, but also to treat it properly, right? We have a great opportunity to treat it properly. And so all the levels of data security, all the levels of all the great things CDOs care about -- you know, data governance, provenance, lineage -- we have a wonderful opportunity to practice the field.
SAM RANSBOTHAM: One area that I think you probably are interested in mentioning is this idea of comparing DEI [diversity, equity, and inclusion] metrics. That's a great place that you've been able to provide benchmarking and give insight into what's really going on versus what people would like you to think is going on.

JACK BERKOWITZ: Yeah. It's a great point. The company actually started that in 2017, [when it] published its first pay equity explorer, which allowed companies to take a look at how they were doing in terms of pay equity gaps for disadvantaged groups. Now we have the benchmarking capability that allows a company to see, for their location, for their industry, for their company size, how are they doing in terms of creating a diverse environment, and then also, how are they doing not just bringing people in, but actually advancing them in their careers? By bringing all that together, by using our benchmarking capability, by solving a problem, by looking at an outcome, we've had great success.

We can run multiregression analysis down to the individual -- not just inside their company, but against the diverse population in that local geography or industry. But then we can say, "OK, here are four or five budget scenarios." Because it's one thing to say, "Hey, you have pay equity issues." But, you know, maybe the company has budgetary things, so they can make some choices about budgetary scenarios, and it tells them, "OK, if you want to close this budget, these are the people that you're able to cover." And so they basically can change that. They come out and then, boom, they can make those changes straight into people's paychecks. And it's a meaningful impact.
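A toy version of that workflow -- regress pay on legitimate factors, estimate each person's shortfall against the model, then see what different remediation budgets would cover -- might look like the sketch below. All of the data, factors, and budget figures are invented, and a real analysis would control for far more variables, including the external market population he mentions.

```python
# Toy sketch of a pay-equity remediation workflow. Data and budgets are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
tenure = rng.uniform(0, 15, n)
level = rng.integers(1, 6, n).astype(float)
pay = 40_000 + 6_000 * level + 900 * tenure + rng.normal(0, 3_000, n)
pay[:40] -= 4_000  # an underpaid subgroup, added for illustration

# Model "expected" pay from legitimate factors, then measure individual shortfalls.
X = np.column_stack([tenure, level])
expected = LinearRegression().fit(X, pay).predict(X)
shortfall = np.clip(expected - pay, 0, None)  # raises only, never cuts

def people_covered(budget: float) -> int:
    """How many individual gaps a budget closes, largest gaps first (a policy choice)."""
    spent, covered = 0.0, 0
    for gap in sorted(shortfall[shortfall > 0], reverse=True):
        if spent + gap > budget:
            break
        spent += gap
        covered += 1
    return covered

for budget in (50_000, 150_000, 300_000):
    print(f"${budget:,} budget covers {people_covered(budget)} people")
```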
SHERVIN KHODABANDEH: This is wonderful. The question I have, which would be very interesting for our audience, is, where do you get started? Because we see ... I mean, I see this in my work, [I] see this with Sam when we interview and research the topic of AI deployment, there's a question around, how much do you build capabilities before you begin to monetize or commercialize or build data products or use cases, versus how much value- and use case-driven you are. And I'm really interested in your perspective, both for ADP and any advice you have for others in the early stages of their journey.

JACK BERKOWITZ: The way I've always looked at it is, if you're building a product, whether you're a startup company or any other, build the thread from one corner of the piece of paper up to the other corner of the piece of paper. And use that thread -- in other words, a use case or two -- to help you define what you need in your situation with your company at that time. You could say those are prototypes, but in my mind, a prototype is useless unless you actually try to have an impact with it, because you don't learn about how to measure outcome. You won't learn how to measure what you actually need. The thing that's been lost in machine learning and all the buzz over the past six, seven years is that all machine learning and all AI is targeted to an outcome. To me, it's really about fielding some capability. Off of fielding that capability, you'll learn what levels of machine learning operations you need, you'll learn what levels of data you need. And I know that there's 1,400 vendors. Matt Turck has the great FirstMark Capital matrix of it, and I remember when that was only 30 vendors, by the way. I know all 1,400 vendors will tell you [that] you need to buy their stuff right away, and that's just not true. That's just not true. You've got to buy some of it, though, in order to get that initial thing fielded.
SHERVIN KHODABANDEH: Yeah, I'm so happy you say that, because that's, honestly, been the unlock we've seen both in our research, Sam, where we see that firms that just take technology first maybe get a little bit of value but there's a big piece they don't get. But also, in our work at BCG, that's been the major unlock, to be value-driven and outcome-focused. And I like how you talk about the thread, because you cannot just build these things in silos, but it doesn't have to mean that you build the full stadium before the baseball [game] begins. You could start playing.

JACK BERKOWITZ: Exactly. At a company before ADP, we used to call it a "Field of Dreams business plan." It's like, "Nobody ever invented baseball; why are we building a baseball field now?" The whole idea is, get that thread working. And, you know, maybe it's not all the way connected. Maybe you still have somebody standing up with a floppy disk and running to the other computer to make it all work, but at least you have an idea. And then you can broaden out that thread over time. That's all.

SAM RANSBOTHAM: For our listeners, floppy disks were things that people had to put into computers to store information.

JACK BERKOWITZ: Yeah, thank you, Sam. Thank you, Sam. You can see it -- you'll actually see it on icons on old Macs and old PCs, so, yeah.

SHERVIN KHODABANDEH: Jack, thank you so much. This has been so wonderful and insightful. This brings us to the next section of our show, where we ask you five questions.

JACK BERKOWITZ: Great.

SHERVIN KHODABANDEH: And we expect some quick reactions to these questions. So I'll start. What's your proudest AI moment?

JACK BERKOWITZ: My proudest AI moment is when my algorithm went into production.

SHERVIN KHODABANDEH: Very good.

SAM RANSBOTHAM: That ties in well.
SHERVIN KHODABANDEH: What worries you about AI?

JACK BERKOWITZ: I think the next recurrence of the AI winter. Having been through it the first time and seeing it fall short. Let's not overpromise.

SHERVIN KHODABANDEH: Mm-hmm. Your favorite activity that involves no technology.

JACK BERKOWITZ: Kayaking on the Chattahoochee River.

SAM RANSBOTHAM: Ah, been there, done that! I've kayaked the Chattahoochee many a time. I'm actually from Atlanta, from Smyrna.

SHERVIN KHODABANDEH: The first career you wanted: What did you want to be when you grew up?

JACK BERKOWITZ: I wanted to be an astronaut, like every other kid born in the '60s.

SHERVIN KHODABANDEH: Your greatest wish for AI in the future?

JACK BERKOWITZ: My greatest wish is that we help people live better lives.

SHERVIN KHODABANDEH: Thank you. Very insightful.

SAM RANSBOTHAM: Jack, great meeting you. I think there's a lot here that people can learn, particularly some of the details about how you're organized. I think that's something a lot of people can learn from. We really appreciate your taking the time.

JACK BERKOWITZ: Thanks, Sam. I really appreciate the conversation.

SAM RANSBOTHAM: Thanks for joining us. Next time, Shervin and I talk with Ameen Kazerouni, chief data and analytics officer at Orangetheory Fitness.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for listeners like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders.
We'll put that link in the show notes, and we hope to see you there.