SAM RANSBOTHAM: Real-time data collection means organizations can make many more informed choices based on metrics. But when do they still need humans? Find out on today's episode.

AMEEN KAZEROUNI: I'm Ameen Kazerouni from Orangetheory Fitness, and you're listening to Me, Myself, and AI.

SAM RANSBOTHAM: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of analytics at Boston College. I'm also the AI and business strategy guest editor at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today, Shervin and I are excited to be joined by Ameen Kazerouni, chief data and analytics officer for Orangetheory Fitness. Ameen, thanks for joining us. Welcome.

AMEEN KAZEROUNI: Thank you, Sam. It's great to be here. Excited for the conversation.

SAM RANSBOTHAM: Currently, you lead the data and analytics function at Orangetheory Fitness. Maybe tell us a little bit about the organization.

AMEEN KAZEROUNI: Absolutely. Orangetheory Fitness is a heart-rate-based, total-body group workout. It combines science, great coaching, and technology, and it's designed to provide what we like to think of as a more vibrant life. The workout's developed to motivate each individual member to achieve their desired results.
And, you know, whether you're starting on your wellness journey or you're a seasoned fitness enthusiast, each OTF workout creates a community of shared experience but also uses heart-rate-based training to let you experience the workout in a way that's most comfortable for you, and that's honestly where my role comes in. There's a tremendous amount of second-by-second telemetry data from the fitness equipment and from the heart rate monitors that allows us to create the most curated, personalized kind of boutique fitness experience in the world. That's Orangetheory.

SAM RANSBOTHAM: All right, what do you do with all this? You've collected all this data, you've got this telemetry, you've got heart rate information, I assume, since you're heart-rate-based. How does the process work? Take us through the steps.

AMEEN KAZEROUNI: Most Orangetheory members in the studio will be wearing what we call an OTbeat heart rate monitor, which is a proprietary piece of wearable technology. There are two purposes to that. One is, it gives you a real-time feedback loop as to how you're performing and what intensity level you're outputting in the studio, but it also allows the coach to see the intensity level you're outputting and help provide a kind of personalized fitness training experience in a studio, in a group setting. My role is focused on unlocking that telemetry data and helping personalize the experience even more. A really cool example is that we recently launched a personalized max heart rate algorithm, and members now get a much more curated experience in the studio. We use proprietary algorithms to determine what the max heart rate -- a physiological term for the maximum rate your heart can beat at -- is for an individual member. And percentages of that max heart rate tell you which heart rate zone you're training in. So anaerobic training versus aerobic training have different physiological impacts.
Time spent in different intensity zones has been proven to have varying effects on longevity and health in general. And by being able to personalize that for a member, we're able to make this experience even more curated in the studio, while most places that leverage max heart rate will rely on a generic, age-based kind of equation. And as you can imagine, every 30-year-old or 40-year-old doesn't have the same heart. So things like that are an example of how we curate the experience for our members using this data.

SHERVIN KHODABANDEH: That is a super-cool example, right? Not the average for your age and gender. And then that goes by bands of 10 anyway, right?

AMEEN KAZEROUNI: Yeah.

SHERVIN KHODABANDEH: As if all 50-year-old males have exactly the same ability.

AMEEN KAZEROUNI: Yeah. It makes the experience safer, it makes you more aware of what you're doing and what your capability is, and you see that cardiorespiratory fitness climb over your time with the program.

SHERVIN KHODABANDEH: And I like what's sort of inherent in what you're saying -- the rapid feedback, right? Within a few seconds, you get feedback. But also, in a broader, symbolic sense, what you've been proposing is more and more experimentation in general as you build AI algorithms. So it's not just the algo, but it's also experimentation, because you get feedback. The philosophy of how to build AI models, which relies on feedback -- you're also translating that into a real use case based on rapid feedback. There's some poetry there.

AMEEN KAZEROUNI: I didn't think of it that way, but absolutely there's poetry there. I think that one of the beauties of artificial intelligence algorithms is that they're so reliant on things they've seen before and constant feedback loops to get smarter. And that's exactly what the Orangetheory experience is indexed on: As the members are getting stronger, we're getting smarter and making sure that we move with them.
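For readers who want to see the arithmetic behind this part of the conversation, here is a minimal sketch contrasting the generic age-based max heart rate estimate with a personalized one, and mapping a reading to a numbered training zone. The 220-minus-age formula is the common convention Ameen is contrasting with; the zone cut-off percentages and the `zone_for` helper are illustrative assumptions, not Orangetheory's proprietary algorithm.

```python
def generic_max_hr(age: int) -> float:
    """Generic age-based estimate of max heart rate (beats per minute)."""
    return 220.0 - age


def zone_for(current_hr: float, max_hr: float) -> int:
    """Map a heart-rate reading to a training zone (1-5) by % of max HR."""
    pct = current_hr / max_hr
    cutoffs = [0.61, 0.71, 0.84, 0.92]  # illustrative upper bounds of zones 1-4
    for zone, upper in enumerate(cutoffs, start=1):
        if pct < upper:
            return zone
    return 5


if __name__ == "__main__":
    # The same 160 bpm reading lands in different zones depending on whether
    # max HR comes from the generic formula (180 for a 40-year-old) or from a
    # personalized estimate (say, 195).
    for max_hr in (generic_max_hr(40), 195.0):
        print(f"max HR {max_hr:.0f}: 160 bpm is zone {zone_for(160.0, max_hr)}")
```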
SHERVIN KHODABANDEH: Tell us a bit more about your data science development philosophy and how you balance experimentation with the more methodical "get the algorithm to be perfect before you launch it." Just give us a sense of the trade-offs and how you and your teams think about that.

AMEEN KAZEROUNI: I love that question, and it's almost like I'll take it even a step outside of developing algorithms. I'm a firm believer that companies have started indexing so heavily on collecting as much data as is humanly possible -- as much as their compute and storage costs, and their boards and their investors, will allow them to. And I think that what happens is, once you have the data, the expectation is, "Let's jump to machine learning; let's jump to AI." And I would argue that those ... How many debates have you been in on "What's AI, and what's machine learning, and what do the words mean?" My philosophy is to go and find those mundane, repetitive tasks and automate them first with your data, where possible. Go and find intuitive, gut-based decisions that your stakeholders and verticals are uncomfortable making based off of intuition and would love to make off of data, and make that data democratized, clean, and available. And after that, you start the whole machine learning journey. And I think when it comes to machine learning and AI, and developing an algorithm in particular, it really depends on the context and the domain in which you're working -- whether you're focusing on precision or whether you're focusing on recall -- and it really depends on what the implications of the prediction are. But generally speaking, I always err on the side of, if it's safe enough, experiment and learn rather than relying on training and validation sets to chase perfection. That's kind of my rule-of-thumb philosophy.
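As an aside, the precision/recall trade-off Ameen mentions can be made concrete with a tiny worked example. The labels and predictions below are made up purely for illustration and are not drawn from any Orangetheory model.

```python
# Precision = of everything the model flagged, how much was right.
# Recall    = of everything that should have been flagged, how much was caught.
def precision_recall(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall


y_true = [1, 1, 1, 0, 0, 0, 1, 0]
cautious = [1, 0, 0, 0, 0, 0, 1, 0]     # flags few positives: high precision, lower recall
aggressive = [1, 1, 1, 1, 1, 0, 1, 1]   # flags many positives: high recall, lower precision

print(precision_recall(y_true, cautious))    # (1.0, 0.5)
print(precision_recall(y_true, aggressive))  # (~0.57, 1.0)
```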
SAM RANSBOTHAM: There's something also interesting, and I know you've shifted slightly into thinking about how your organization uses data, but maybe going back to how you're using it within the studios: How do you incorporate data from outside? You know, let's say that I've been training and working and improving within the studio, but little do you know that I've hurt my foot or I've pulled a leg muscle. How does that sort of outside information come in?

AMEEN KAZEROUNI: I like that question a lot, Sam, because I've got two answers there. One is, the coach is the hero at Orangetheory. And the way we think about it is, there are 20 to 30 people in a class, but you don't want to think of it as group fitness as much as you want to think of it as -- probably, at least I think of it as -- subsidized personal training. So you've got a pretty good personal relationship with that coach, and there's no algorithm that I'm going to build that's going to be better than you telling your coach, "Hey, I hurt my foot last night. What do you think I should do?" And we've got alternative, low-impact cardio equipment in the studio so you don't get on a treadmill and start trying to run 8 miles an hour. In fact, one of the things the coach says before every class starts is, "If you've got any orthopedic issues, please let me know," and so on and so forth. So one thing that I've learned about data over the course of my career is that data is valuable, and it's really good to make decisions with, but if there's a human in the loop who can scalably provide the answer instead, it's likely going to be hard to beat that expert with an algorithm. So don't get in the way of the expert. The second part of that answer is, let's say you had a bunch of caffeine, or you ran a marathon the previous day, or you didn't sleep very well; all those external factors affect your cardiac output when you're in the studio.
So that real-time feedback loop allows you and the coach to modulate yourself. One thing that we encourage is, if you're not feeling it, take a "green day." ... The zones are broken up into colors, orange and red being the anaerobic zones -- zones 4 and 5 -- green being your zone 3, and blue being your zone 2. So we recommend, don't shoot for those 12 to 20 minutes in the orange and red zones. Look at your heart rate zones, listen to that real-time feedback loop, and take a green day if you need to. So it's a bit of a cop-out answer there, but we've got the coach, we've got the member, and we've got a real-time feedback loop. And when there's an intuitive answer like that right in front of you, I don't think you should interfere with an algorithm -- that's kind of the thought process there.

SAM RANSBOTHAM: I don't think that's a cop-out answer at all, because historically -- OK, you could have a personal trainer, or you could be in a group setting. I personally identify with this because you could either teach someone one-on-one as a tutor or you could teach someone in a giant classroom. And I'm just excited about applications like this that are letting us pull together the best of both of those worlds. I'm hungry for it in education, but I can see that you're making a lot of progress on that -- and Shervin and I have talked with Peloton as well. They're starting to think about how you get to an individual-level experience at scale, and that seems like what you're really trying to do.

AMEEN KAZEROUNI: Absolutely. I couldn't have put it better myself. I think that's exactly it.

SHERVIN KHODABANDEH: I like that a lot. This is all about rapid feedback, which is a cornerstone of building expertise in any system, whether it's chess, or whether it's running, or whether it's machine learning.
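To make the color zones concrete, here is a hypothetical sketch that tallies minutes per color from second-by-second heart-rate telemetry. The zone-to-color mapping (red and orange for zones 5 and 4, green for zone 3, blue for zone 2) follows the conversation; the percentage thresholds, the one-reading-per-second assumption, and the `minutes_per_color` helper are illustrative only.

```python
from collections import Counter

ZONE_COLORS = {1: "gray", 2: "blue", 3: "green", 4: "orange", 5: "red"}


def color_for(hr: float, max_hr: float) -> str:
    """Assign a color zone to one heart-rate reading, using placeholder cut-offs."""
    pct = hr / max_hr
    if pct < 0.61:
        return ZONE_COLORS[1]
    if pct < 0.71:
        return ZONE_COLORS[2]
    if pct < 0.84:
        return ZONE_COLORS[3]
    if pct < 0.92:
        return ZONE_COLORS[4]
    return ZONE_COLORS[5]


def minutes_per_color(samples_bpm, max_hr):
    """samples_bpm: one heart-rate reading per second across the whole class."""
    seconds = Counter(color_for(hr, max_hr) for hr in samples_bpm)
    return {color: count / 60 for color, count in seconds.items()}


# Example: a fake 60-minute class for a member with a personalized max HR of 190.
fake_class = [120] * 1200 + [150] * 1200 + [175] * 1200
print(minutes_per_color(fake_class, max_hr=190))  # {'blue': 20.0, 'green': 20.0, 'red': 20.0}
```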
SHERVIN KHODABANDEH: The more you combine experimentation and rapid feedback and the human in the loop and apply that across different industries, the more opportunity and value you're going to unleash for personalized everything -- not just fitness or shopping for pants but also education and everything else. You're on to something here.

AMEEN KAZEROUNI: Absolutely. In my prior life, I was in retail. And when you think about the role of machine learning and deep learning in retail, it's that: You're trying to re-create the in-person shopping experience on a website. Like, how do we curate this? How do we create a personal shopper? How do we create that boutique experience? How do we predict the size correctly? How do we use AR [augmented reality] so you can see what a shoe looks like on your foot? We're always trying to close that gap and arrive back at what the real thing would be like -- personalized, but at scale. And I think that's the secret sauce there: a rapid feedback loop, algorithmic support with large volumes of data, but then also not trying to circumnavigate the expert, the human in the loop.

SHERVIN KHODABANDEH: Exactly. And I think that's really critical, Sam, because I remember when we did the first few years of the [AI report](https://sloanreview.mit.edu/big-ideas/artificial-intelligence-business-strategy), back in 2018-2019.

SAM RANSBOTHAM: So long ago.

SHERVIN KHODABANDEH: There were still a lot of folks -- I mean, that's only three years ago, but a lot of folks still today are thinking of AI as that which replaces humans, and that which must automate. And the more we think that narrowly, the more outcomes we're leaving on the table, the more successful workforces and humans we're disenfranchising, and also the more opportunities we're leaving unaddressed because we think, "Well, there's no way I can fully replace a human here, so I'm not going to do it."
And there are so many of these things where it's not just AI versus human but AI and human.

AMEEN KAZEROUNI: Yep. You know, I think there's a conflation with companies where the product is AI. Like, when you think of AI, you think of Tesla, and there are different industries all of them play in, but there's a very core part of the product that's a standalone piece of intellectual property heavily rooted in artificial intelligence, and that's not how a majority of the world is going to use artificial intelligence. And I think that's one of the key differences in what you were just talking about, Shervin -- people try and use AI the way those companies use AI and see them as the gold standard, but our product is not AI. Our product is a curated, science-backed, coach-inspired fitness experience that's merely augmented in parts by AI.

SAM RANSBOTHAM: This is something that Shervin and I are pretty excited about. I don't want to foreshadow too much, but we're thinking about these mini uses of AI beyond just the kind of poster use of AI that you're talking about. I mean, yes, we're all attracted to the Boston Dynamics robots that look very cool, but there's a lot going on that isn't at that level, and there's so much value in those uses. And I think you're starting to capture some of that.

SHERVIN KHODABANDEH: I wanted to segue from this. We talked quite a lot about the importance of the human in the loop, experimentation, being hypothesis-driven -- all these things you said. Maybe tell us a bit about the operating model and the ways of working in a company that is not an AI product company but is a company like yours with a strong mission. What does it take to take a use case and bring it to life?

AMEEN KAZEROUNI: I think it honestly comes down to three things. It's having the data; you can't really get around not having the data. Investment.
I think a big mistake companies make is not investing in data engineers early, thinking that you can just sprinkle AI like some kind of magic powder on raw data sets and it's going to produce something. I think data engineers are a critical commodity that you want to invest in early so your data is at a point where you can actually use it. So that investment is really important. Wherever it's coming from, there needs to be a serious decision made to invest in your data practice if you're really going to try and build one. And then finally, it's buy-in. When the product is not AI, you're convincing domain experts -- in our case, fitness experts -- who have been doing this for a long time, who are formally educated in these fields of study, and who are always going to know more about the product than you do, that an algorithm is going to help them and make their job easier. And I think that relationship can be a beautiful partnership or it can be an extremely antagonistic one. One of the things that I've strived toward in my role at Orangetheory is to have a strong partnership with our template design team, our workout design team, because at the end of the day, they are the protectors and designers of that product, and we're, again, just a tool that's supporting them. Their buy-in is very critical, because their understanding of the algorithms is what then makes it to coaches in learning and development material. At the end of the day, we've got thousands of coaches across 24 countries explaining that max heart rate algorithm that I mentioned. And they're not explaining it like a piece of mathematics; they're explaining it like a piece of exercise fitness, like a piece of exercise physiology. And that requires that buy-in. The AI, the data team, and the fitness team need to be in lockstep; otherwise, it's destined to fail.

SHERVIN KHODABANDEH: And what does that mean in terms of the talent and the team -- the technical team that you oversee and hire and recruit?
AMEEN KAZEROUNI: That they're difficult to find, is what it means. I think there's already a scarcity of talent in this space. In a mission-driven, purpose-driven company like Orangetheory, you'd think it's harder to find, but it's actually easier, in the sense that if you find someone who's aligned with the mission, it's almost exciting to them that there's an opportunity to apply that skill set to what they considered an outside-the-job passion. But we've also focused on our data organization being a separate entity. So we've got my role, chief data and analytics officer, running a data organization reporting to our CEO. And we've got our chief digital and technology officer running a separate digital and technology organization. What's really powerful about that is that we're able to riff off of each other and have one team provide building blocks to the other team and vice versa. And what you'd imagine creates an interesting working experience actually drives a lot of velocity and a really cool partnership that's very exciting to be a part of as well. So I think it's all about that: partnerships and buy-in and collaboration across teams.

SHERVIN KHODABANDEH: Very well said. Ameen, we have a special segment here where we ask you five rapid-fire-style questions. Just give us the first thing that comes to your mind -- the key thing is intuitive, whatever comes to your mind, and short and sweet answers. So, ready for that?

AMEEN KAZEROUNI: Let's do it.

SHERVIN KHODABANDEH: All right. What's your proudest AI moment?

AMEEN KAZEROUNI: When we solved it using a linear regression.

SHERVIN KHODABANDEH: Love it. All right. What worries you about AI?

AMEEN KAZEROUNI: A lack of consultation with domain experts.

SHERVIN KHODABANDEH: Well said. Your favorite activity that involves no technology?

AMEEN KAZEROUNI: Hiking.
SHERVIN KHODABANDEH: Was that a question?

AMEEN KAZEROUNI: It's a very simple one. I was going to say Orangetheory, but there's too much technology in there.

SHERVIN KHODABANDEH: The first career you wanted -- like, what you wanted to be when you grew up.

AMEEN KAZEROUNI: Environmental biologist.

SHERVIN KHODABANDEH: Your greatest wish for AI in the future?

AMEEN KAZEROUNI: More access.

SHERVIN KHODABANDEH: Great.

SAM RANSBOTHAM: Actually, I've got to follow up: Access for who? Who needs access?

AMEEN KAZEROUNI: I think it would be great if some of the simpler portions of AI that unlock decisioning off of data that companies have collected were easier to tap into without the financial and human capital that you're required to invest as an organization. I think the general efficiency of the world would just go up, you know.

SHERVIN KHODABANDEH: More open-source kind of stuff.

AMEEN KAZEROUNI: Yeah, yeah.

SHERVIN KHODABANDEH: Ameen, this has been exceedingly insightful and a lot of fun. Thank you for making time for us.

SAM RANSBOTHAM: Yeah, thanks for coming.

AMEEN KAZEROUNI: Thanks for having me. This was a lot of fun. I truly enjoyed it.

SAM RANSBOTHAM: On our next episode, we'll speak with Khatereh Khodavirdi, senior director of data science and analytics at PayPal, where she oversees data teams at Venmo and Honey. Please join us.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for listeners like you.
It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.