SAM RANSBOTHAM: Many people have already invited artificial intelligence into their homes with voice assistants like Siri and Alexa, but how can we individually benefit from computer vision? Today we talk with Sanjay Nichani, vice president of AI and computer vision at Peloton Interactive, about a new product that incorporates AI for fitness coaching.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Idea program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today, Shervin and I are talking with Sanjay Nichani, vice president of artificial intelligence and computer vision at Peloton Interactive. Sanjay, thanks for joining us. Welcome.

SANJAY NICHANI: Thank you for having me.

SAM RANSBOTHAM: Sanjay, can you tell us about your current role at Peloton?

SANJAY NICHANI: I lead the AI and computer vision team at Peloton. Peloton's mission is to use technology and design to connect the world through fitness, empowering people to be the best version of themselves anywhere and anytime. Most people recognize this via the bikes and treadmills that we sell.
We have world-class instructors who teach some really awesome content related to cardio and strength and yoga and meditation, and this gets streamed not only to your bikes and treadmills but also to your digital apps -- whether it's iPhone or Android -- and it's also available on TV, as an app. We're here to make people happy and healthy and change their lives.

SAM RANSBOTHAM: Tell us a bit about your education and background. I know you studied at Babson, which is down the street from us at MIT in Cambridge, actually. Take us on your path to get to Peloton and what got you interested in artificial intelligence.

SANJAY NICHANI: I would say that my career is sort of divided into four phases. The first one was more around using computer vision for manufacturing and factory automation. And then my second phase was more around security and access control; that's where a lot of 3D computer vision applications are used for finding people in revolving doors and retail stores and so on. And then I did a little bit of a stint in identity verification and document forensics. So now I'm in my fourth stint, working in the fitness space, which I'm really excited about. With all the experience I've had in computer vision over the years, just bringing it into this space is exciting.

SAM RANSBOTHAM: I think we all are, obviously, as you said, very familiar with Peloton and the bike. What other things are going on that we may not be aware of? What kinds of artificial intelligence are you using there that maybe we can't see?

SANJAY NICHANI: Peloton Guide -- this is something that we recently announced. This is our first strength product, and it's also the first that uses AI technology that actually runs on a physical device in the form of a platform. We're quite excited about it. It basically connects to any TV to transform the TV into sort of an interactive personal training studio.
Peloton instructors lead a wide range of fun but quite intense classes that use dumbbells and body weight. And where we bring computer vision technology and AI in there is, we have something called the movement tracker, which allows you to track members -- to recognize their activity -- so that as you follow along [with] the instructors, [it] makes sure that you're actually completing these moves as you go through the class. And this real-time feedback and metrics-driven accountability is very appealing to our members, because now they have a goal to work toward, especially when you don't have a coach at home.

SHERVIN KHODABANDEH: And that's a device that they put in the room?

SANJAY NICHANI: Exactly. It connects to your TV. The other really nice thing about the device is that it has what is called smart-frame technology; it basically gives you the freedom to move around the room, and it automatically pans and zooms to where you are, and you can see yourself reflected on the TV so that you can check your form. We're really excited about this product because a lot of the technology, all the way from finding people to figuring out what activity they're doing, is driven by computer vision.

SHERVIN KHODABANDEH: So as I'm listening to you describe the setup here, my mind goes to just the amount of real-time data that's coming from many, many thousands of people at the same time. Tell us a bit about how you're processing all that and how much of that is really real time versus prepackaged.

SANJAY NICHANI: Yeah, definitely. Let me talk a little bit about the production aspects. The device is completely self-contained. There is really nothing that goes out of the device into the cloud, as far as image data or any other type of data is concerned. It's mostly the streamed content coming in that is displayed to the user for them to follow. That makes it a very secure system. One of the big things for AI is keeping [systems] secure and private, and so we respect that.
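To make the movement-tracker idea concrete, here is a minimal sketch of how per-frame pose keypoints could drive on-device rep counting. Everything here is hypothetical -- the joint-angle heuristic, the thresholds, and the CurlCounter class are illustrations of the general technique, not Peloton's actual implementation -- and a real system would get its keypoints from a pose-estimation model running on the device.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by points a-b-c (e.g., shoulder-elbow-wrist)."""
    ang = math.degrees(math.atan2(c[1] - b[1], c[0] - b[0]) -
                       math.atan2(a[1] - b[1], a[0] - b[0]))
    ang = abs(ang)
    return ang if ang <= 180 else 360 - ang

class CurlCounter:
    """Counts bicep-curl reps from per-frame keypoints; all state stays local."""
    def __init__(self, down_thresh=150.0, up_thresh=50.0):
        self.down_thresh = down_thresh  # elbow angle above this = arm extended
        self.up_thresh = up_thresh      # elbow angle below this = arm flexed
        self.extended = True
        self.reps = 0

    def update(self, shoulder, elbow, wrist):
        angle = joint_angle(shoulder, elbow, wrist)
        if self.extended and angle < self.up_thresh:
            self.extended = False       # reached the top of the curl
        elif not self.extended and angle > self.down_thresh:
            self.extended = True
            self.reps += 1              # returning to full extension completes a rep
        return self.reps

# Synthetic keypoints standing in for a pose model's per-frame output.
counter = CurlCounter()
frames = [((0, 0), (0, 1), (0, 2)),      # arm extended (~180 degrees)
          ((0, 0), (0, 1), (0.3, 0.3)),  # arm flexed (small elbow angle)
          ((0, 0), (0, 1), (0, 2))]      # extended again -> one full rep
for shoulder, elbow, wrist in frames:
    reps = counter.update(shoulder, elbow, wrist)
print(reps)  # 1
```

The same per-frame person detection could, in principle, also drive the smart-frame pan and zoom, since the bounding box of the detected person tells the camera where to crop.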
From a training perspective, you have to bootstrap your AI, and that's where you need the data. As we're aware, you need a fair amount of data to fuel the AI. This is where we have spent a fair amount of time sourcing data and annotating data. We'll probably talk a little more about some of the other aspects, like having diversity in this data, which is very vital for bootstrapping your system. Once you have the data and label the data, then you build your AI systems from that. There is a separation of what happens during training and what happens during production. During training, there's a bootstrapping process that we use. Basically, the feedback [to] the person is, "Are you following along to that particular exercise or not?" And you can imagine how powerful that is, right? So, for example, if you get feedback saying, "Hey, last week you did x number of moves, but now you did x [plus] y number of moves," that's very motivating for the user. You might be good with bicep curls but not as good with planks or pushups. We provide that feedback so you can work on it. Or it could also look at another dimension. For example, you could say, "Hey, you worked on these muscle groups; how about focusing on some other muscle groups, and how about taking these classes to focus on them?" So it's really about having that feedback come back to the user and guiding the user in their fitness journey. That really is the purpose of Guide.
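Sanjay's separation between training and production can be sketched in code. The pattern below uses PyTorch and TorchScript purely as stand-ins -- the transcript doesn't say what stack Guide uses, and the tiny classifier, the shapes, and the file name are all hypothetical. The point is the shape of the workflow: the model is trained and frozen off-device, then shipped as a self-contained artifact that runs locally, so no image data has to leave the device.

```python
import torch
import torch.nn as nn

# Training side (off-device): fit a small activity classifier on annotated
# pose data, then freeze it for deployment. Shapes here are illustrative.
class ActivityClassifier(nn.Module):
    def __init__(self, n_keypoints=17, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                          # (batch, 17, 2) -> (batch, 34)
            nn.Linear(n_keypoints * 2, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                          # x: (batch, n_keypoints, 2)
        return self.net(x)

model = ActivityClassifier()
# ... training loop over sourced, annotated, diverse data would go here ...
model.eval()

# Export: trace the frozen model into a self-contained artifact.
example = torch.randn(1, 17, 2)
scripted = torch.jit.trace(model, example)
scripted.save("activity_classifier.pt")

# Production side (on device): load the artifact and run it frame by frame,
# entirely locally.
deployed = torch.jit.load("activity_classifier.pt")
with torch.no_grad():
    probs = deployed(example).softmax(dim=-1)      # per-class activity scores
```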
SAM RANSBOTHAM: That seems pretty fascinating, because you talked about recommending classes, but some of the appeal here might be that a class doesn't have to be packaged anymore. It could be, "Well, Sam, he's a slacker on planks, so he needs lots of ab work or core work, whereas he's awesome at doing pushups," or something, because of my massive build. Fortunately, this is on audio, so no one can verify that that's not true. [Laughs.] But I can see that you could actually somehow generate these classes, maybe in real time, so they don't necessarily have to be packaged; they could be adaptive. Is that some of the goal, or is that thinking too far ahead?

SANJAY NICHANI: Yes, absolutely. To your point, since we do know exactly what the class plans are, what we're working on is recommending classes that might be more appropriate or personalized to a person's fitness journey. Absolutely.

SAM RANSBOTHAM: That's kind of interesting, because we're kind of shifting from talking about a fitness class. We're almost talking about more of an education product here, in that it's adapting to what you need, and so much of what I hear about education in general talks about individualized training. But the production part you're mentioning, and the packaging and the overall scope, is still important too.

SANJAY NICHANI: Absolutely, yeah. And to me, it's a combination of things. Education is great, because you're providing insights and metrics that help you improve your performance. I think one of the big things about Guide is the accountability. You have nobody looking over your shoulder; [you just] have a machine looking at you, and it holds you accountable, and it brings out the competitiveness in you. It's that accountability that's important. I think there are other things that we are striving for in terms of making the experience gamified, in the sense that you want your whole workout routine to be fun and engaging, right? You don't want to keep looking at your watch and saying, "Ugh, are 30 minutes done yet?" And that is what I think is fantastic about all Peloton products, but particularly this Guide product: We really strive to make it a fun process, in addition to all the other advantages I mentioned.
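The muscle-group recommendation Sanjay describes a few exchanges back can be illustrated with a toy example. The class catalog, muscle groups, and scoring rule below are all hypothetical; the idea is simply to rank classes by how well they cover the groups a member has worked least.

```python
from collections import Counter

# Hypothetical catalog mapping classes to the muscle groups they emphasize.
CLASSES = {
    "Arms & Shoulders": {"biceps", "triceps", "shoulders"},
    "Core Strength":    {"abs", "obliques", "lower_back"},
    "Glutes & Legs":    {"quads", "hamstrings", "glutes"},
    "Full Body":        {"biceps", "abs", "quads", "shoulders"},
}

def recommend(history, k=2):
    """Suggest classes that emphasize the member's least-worked muscle groups."""
    worked = Counter(group for name in history for group in CLASSES[name])

    def neglect_score(groups):
        # A class scores higher the more it hits groups the member has
        # rarely worked; 1/(1 + count) decays with repeated coverage.
        return sum(1.0 / (1 + worked[g]) for g in groups)

    return sorted(CLASSES, key=lambda c: neglect_score(CLASSES[c]), reverse=True)[:k]

history = ["Arms & Shoulders", "Arms & Shoulders", "Full Body"]
print(recommend(history))  # ['Core Strength', 'Glutes & Legs']
```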
SHERVIN KHODABANDEH: Sanjay, it is a great example of AI enabling an experience in a different setting, in the privacy of your own home, and it's a great example of AI creating something that isn't possible without it. What are some other uses of AI for Peloton as a company?

SANJAY NICHANI: There are already AI initiatives going on, and we're making them better. Another area is voice -- the convenience of using voice for hands-free operation. Especially for a product like Guide, where if you're holding dumbbells, or if you're on the floor, prone and trying to do exercises, you can't hold the remote. So we have an AI team that's focused on voice. It is also going to be making its debut in the Guide. There's a fair amount of AI in that too, beyond computer vision.

SAM RANSBOTHAM: Sanjay, this is a big deal too: You're putting a consumer product with artificial intelligence in lots of people's homes. There's just not a lot of that going on.

SHERVIN KHODABANDEH: Sanjay, one question I wanted to ask is if you could help peel back the onion for our audience in terms of what it actually takes to design and scale a solution like the ones you were talking about -- either Guide or voice -- beyond the algorithms and the technical aspects of the product. Where my mind is going is the user experience itself. Talk a little bit about the process of the product design and how you bring that aspect into it.

SANJAY NICHANI: Let me start off answering the question with what really kicks things off at Peloton. One of the real core values of Peloton is "put members first." We are obsessive about customer experience, and everything is centered around that. We listen to our members, get feedback from our members, and a lot of product work really starts with that.
We have a cross-functional team that looks into many of those things, but it really starts from the member experience and how we make our members happy and healthy. There's also a fair amount of work done by user research teams, maybe building prototypes and putting them in front of users, sometimes experts. In this particular case, we've interviewed coaches. What I really like about AI -- or machine learning development generally -- is that it's fundamentally iterative, and it intersects with the whole agile philosophy of software development. You basically say, "All right, we have a hypothesis right now, so we build a prototype." And the way ML works is, what you need to do first is deploy a minimal system and see where your errors are, and that decides: "Oh, do you need more data? Do you need to improve your models? Is the problem the quality of the data that you already had?" Then, when you actually put it out, it really does give you the intended benefit. I feel that ML forces you to be agile. That's how things get started. It's more of an iterative process. That's really something that we as an organization -- and all of the people developing AI products -- have to realize: It is something that gets better over time. As people use it, it gets more and more accurate, and that's fundamentally because you are always looking at errors, looking at feedback, and that drives the whole process of continuous improvement.
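The "deploy a minimal system, see where your errors are" loop Sanjay describes can be made concrete. Here is a sketch of the triage step that decides what to do next; the error slices, the 50% threshold, and the triage function are hypothetical illustrations of the technique, not a description of Peloton's process.

```python
from collections import Counter

def triage(eval_results):
    """Decide the next iteration step from per-example evaluation results.

    eval_results: list of dicts like {"correct": bool, "slice": str}, where
    "slice" names the condition the example came from (lighting, pose, etc.).
    """
    errors = Counter(r["slice"] for r in eval_results if not r["correct"])
    total_errors = sum(errors.values())
    if total_errors == 0:
        return "no errors observed: expand the test set before shipping"
    worst_slice, count = errors.most_common(1)[0]
    if count / total_errors > 0.5:
        # Errors concentrated in one slice usually point to underrepresented
        # data: source and annotate more examples of that condition.
        return f"collect more data for slice: {worst_slice}"
    # Diffuse errors point elsewhere: model capacity or label quality.
    return "errors are diffuse: revisit model capacity or label quality"

results = [
    {"correct": True,  "slice": "well_lit"},
    {"correct": False, "slice": "low_light"},
    {"correct": False, "slice": "low_light"},
    {"correct": False, "slice": "occluded"},
]
print(triage(results))  # collect more data for slice: low_light
```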
SAM RANSBOTHAM: But something seems different about this to me, because you're talking about the culture within Peloton that may understand this need to iterate and improve and be agile, but when you're talking about delivering this as a product to consumers, I feel like they might have a different expectation of how ... well, first, I'm not even sure if they're going to know that there's artificial intelligence or machine learning involved in the product. Maybe they will or maybe they won't, and you could comment on that. But how do you get that culture about iteratively improving and "don't expect it to be perfect"? I think consumers expect things to be perfect initially.

SANJAY NICHANI: Yeah, that's a great point. And I wasn't trying to say that we should be deploying stuff that's not perfect or close to perfect. The point I was making was that, for example, when we are going to be launching the Guide, we have gone through a lot of trials, field trials, and we're really trying to identify, what are the operating conditions for Guide? What makes it perfect? There are some things that we absolutely cannot compromise on: safety, reliability, making sure it works for everyone.

SAM RANSBOTHAM: Those are areas you can improve over time.

SANJAY NICHANI: Those are areas we can improve over time. So it's a question of, we start with the operating parameters we are very confident in. And that's what good companies do: figure out, "OK, this is a space that's big enough to provide value to the customer, but at the same time, we have to be absolutely certain that it does very, very well in that space" -- or near perfection, as you're talking about. And then you kind of expand on it from there, maybe adding more features or perhaps being able to handle more occlusion of body parts -- things like that.

SHERVIN KHODABANDEH: I wanted to switch gears a bit and talk about talent. You're doing really cool stuff with AI at the core of the product and the customer experience. What are your thoughts on the kind of talent that you, and companies aspiring to do similar things, need? There is a talent war out there. What do you think it takes for the right talent to join and want to stay?

SANJAY NICHANI: I'll answer this question in two parts; I have strong opinions about it. The first part is more around the talent itself. I feel like that is one of the biggest challenges.
It's not just the challenge of the scarcity of talent but also the type of talent. There is a lot of research and there are a lot of researchers in AI, and a lot of work being done in AI, but the focus is more on competitions and papers -- topics such as architectures or optimization techniques. There are clearly not enough people focused on the practical aspects of deployment. I just talked about what it takes to build an AI product, right? To me, having people focus more on deployment and on production is very, very vital. And this requires people all the way from "How do I source the right type of data? Do I have the right data quality? How would I mix it with synthetic data? How do I build data pipelines, and how do I version them?" All of that becomes important. Also, finding people with experience around just deploying these models.

SHERVIN KHODABANDEH: Yup.

SANJAY NICHANI: Finding people in that area is where I feel the biggest scarcities are.

SAM RANSBOTHAM: That's something that Shervin and I come back to a lot. So much emphasis is on the algorithms, and what we frame as consumption is really where a lot of the bottleneck is.

SANJAY NICHANI: Coming back to what really keeps talent at Peloton, I think it's the mission. Peloton's mission is to empower people to be the best versions of themselves and have them feel good about themselves -- be happy, be healthy. It is a very noble mission. Every time I ask people, "Why do you want to join Peloton?" that's the first thing that comes out: "I have a bike. I know someone who has a bike," and, you know, how it's changed their life. You have to be in line with that mission. That, to me, is the primary driver. There are other things, from a culture perspective, like the way we operate as teams. One more thing that is fairly important is the impact that you can make.
All of the people working on Guide are going to be making an impact on millions of members, and this is just one of the products. So I think that having that sort of impact also drives people a lot. I would say those are the three reasons, really.

SHERVIN KHODABANDEH: Great.

SAM RANSBOTHAM: One of the things you described was a very edge-oriented approach to ML -- that it's within the box, it's within the home, the data doesn't leave. But some of the things you're describing now seem like they would benefit from aggregation. It would really help collectively if we understood better how to get fit or how to improve our health. Health information is something that we tend to keep private, and we tend to want it to be private, but I can't help but wonder: Would we in aggregate benefit if we were a bit more open with that data? What are your thoughts on that?

SANJAY NICHANI: I think that's about trust, right? From an industry standpoint, AI needs to get there. Once you get to that point, maybe it's possible. We've all seen what has happened with facial recognition systems and other examples. So definitely, there are trade-offs. There is a move toward trying to anonymize the data in some way. Can you achieve both objectives?

SAM RANSBOTHAM: Sanjay, it's been great talking with you. I think most people are familiar with the physical Peloton bike, but this is a product where you're talking about putting artificial intelligence in people's lives in real time. And there just aren't a lot of examples of that going on that people are used to. We've really enjoyed talking with you. Thank you.

SHERVIN KHODABANDEH: Yeah, thank you so much.

SANJAY NICHANI: Thank you; I really appreciate your having me.

SAM RANSBOTHAM: Thanks for listening. Tune in next time, when we talk with Katia Walsh, Levi Strauss & Co.'s chief global strategy and AI officer.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI.
We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn, specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.