SAM RANSBOTHAM: How can AI help bring new ideas and products to market in industries where risk aversion is rampant? Find out today when we talk with Kartik Hosanagar, professor at the Wharton School and founder of Jumpcut, a startup helping previously undiscovered talent produce movies and TV.

Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I'm Sam Ransbotham, professor of information systems at Boston College. I'm also the guest editor for the AI and Business Strategy Big Ideas program at MIT Sloan Management Review.

SHERVIN KHODABANDEH: And I'm Shervin Khodabandeh, senior partner with BCG, and I colead BCG's AI practice in North America. Together, MIT SMR and BCG have been researching AI for five years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

SAM RANSBOTHAM: Today, we're talking with Kartik Hosanagar. He's a professor at The Wharton School and founder and CEO of Jumpcut. Kartik, thanks for joining us. Welcome.

KARTIK HOSANAGAR: Thanks for having me, Sam and Shervin.

SAM RANSBOTHAM: So this is a bit of a different interview for me, because I've known Kartik for years in academic circles. But Kartik, your latest venture is Jumpcut. I think the idea basically is to help surface new and fresh stories for Hollywood. Can you tell us a little bit about that?

KARTIK HOSANAGAR: My new startup is called Jumpcut, as you mentioned. What we are doing is essentially trying to create a new data-driven studio that's reimagining the way films and TV shows are developed, with the specific goal of elevating fresh new voices.
The context here that got me interested in this is, Hollywood has historically been an old boys' club -- a few execs making decisions on what movies get made, who's in those movies, [and] at what budgets. All of these [are] just pretty much based on gut and relationships -- who knows who. There are costs [to] this kind of decision-making. There's the economic cost; Hollywood has historically had a very poor batting average. There's the social cost; by just about any measure, Hollywood has not been a particularly inclusive industry. And then there's the cost to audiences. So, what we're trying to do is break that mold and use data to make more objective, better decisions. But ultimately, by doing that, we can assess storytellers and stories on their merit as opposed to who is connected to whom or just [the] gut feelings of a few people. So that's how we're trying to democratize Hollywood using data.

SAM RANSBOTHAM: I mean, maybe your world is slightly different than mine, but no movie execs are in my classes. So how did you end up with this idea, and how did you end up with Jumpcut?

KARTIK HOSANAGAR: It's interesting -- no movie execs in my classes either, Sam.

SAM RANSBOTHAM: Do you know that I went to high school with Julia Roberts, though? I mean, there's a connection there. But anyway, sorry. Keep going.

KARTIK HOSANAGAR: Well, I hope you are still connected to your high school --

SAM RANSBOTHAM: No, unfortunately; this is pre-social media, and I made lots of tragic mistakes in that era, but anyway. Sad tale for another time.

KARTIK HOSANAGAR: Right, right. I look forward to catching up on that story sometime. Coming back to your question: What got me interested in this? I've always had an interest in content and storytelling and films. I'm actually an amateur filmmaker.
So back when I was a newly tenured professor, and also I didn't have kids, it was an interesting combination where I had my weekends to myself, so I would go make short films and just put them out on YouTube. It was really just a fun hobby of mine. In fact, during my first sabbatical from Wharton, I wrote a screenplay. I flew into Mumbai, met with a bunch of film producers there, and pitched my screenplay. A number of them liked it. But the common response from many of them was, "We really like this, but how do we take a bet on a completely new writer/director?" One of them said, "I'll buy your script, but I can't have you direct it. You may be good, but I just can't have you direct it, because I can't go to financiers and to actors and ask them to take a bet on a new voice." So my response back then was, "Fair enough; makes sense. I realize why you can't do it." But at the same time, I didn't want to give my script away. So I came back. Then my sabbatical was over, so it was back to Wharton -- back to teaching and all of that. But over the years, I've met with so many writers and directors who have shared similar perspectives. There's a friend of mine who told me it took him 15 years to break into the industry. And he's a successful writer/director. Fifteen years for his first opportunity. And I hear this so many times. In recent years, we see really successful movies or shows like Get Out or Stranger Things come from very unexpected places, so I've always been fascinated by this space. Initially it was a hobby, but one fine day I felt that the problem I was seeing was one my skills and data might be relevant to solving.

SHERVIN KHODABANDEH: It's actually quite interesting, because there are several angles to the story here. One is the equity angle and the social angle, and giving those who deserve and have the merit the voice that they might otherwise not get.
And then there is the more business, data-driven angle, which takes my mind to Moneyball in baseball, or underwriting -- giving credit to folks who don't have a credit history because they immigrated to this country and, but for the fact that they've just arrived here, they would be perfect in terms of the profile and the background and the education -- and being able to actually assess talent based on a bunch of attributes and features and sort of de-risk the very executives that say, "How do I take a bet on you, because I don't know anything about you?"

KARTIK HOSANAGAR: Yeah, absolutely. And you used a very interesting word here. You talked about de-risking. You talked about credit applications, and there are people who are applying, and how do we de-risk them? A lot of this is really about risk. If you look at Hollywood, one of the big things I've realized is, Hollywood would like to take chances. They understand the problems of inequity. They understand the industry has not been inclusive. But everyone's worried about the risk issue. What was interesting for me was, a lot of execs in Hollywood told me, "Well, the reason everyone's so risk averse is, if you do a movie with Brad Pitt in it and it doesn't work out, you won't lose your job for making that bet. But if you take a bet on something new [and] unexpected, and that doesn't work out, you'll have some explaining to do: 'Why did you make this bet? This is not the kind of bet we've made in the past.'" And I remember at some point I even read a statistic on movies that said something like 74% of the highest-grossing movies are sequels or adaptations of existing IP [intellectual property]. So there's no one willing to take a chance on something new. Everything is a sequel. As an outsider, I was wondering, why does it have to be a bestselling book or a comic book for something to get made? Why can't it be an original screenplay or an original idea? And it came back to, "Oh, yes, because [the] IP has been de-risked."
But the insight that I had was that IP isn't the only way of de-risking. There are other ways to de-risk as well. So that's essentially how we approached it. And there are really three pieces; I'll mention each at a high level, and then you'll tell me if there's any of them that's worth going deeper into. One is, first of all, how do we discover stories and storytellers? The classic Hollywood approach is that studios get submissions of scripts from agents, and agents are representing talent; people who are outside of the system don't have a godfather who'd connect them, or don't already have the connections. So the first thing we had to solve for is, how do we find stories and storytellers that are outside of the system? For that, one of the things we do is, our algorithms are assessing content as well as creators on various platforms like YouTube or Reddit, or even storytelling platforms like Wattpad and others. And what we're doing on those platforms -- I'll use YouTube as an example. If a screenwriter or a director has created short films and has posted short films that are already resonating with audiences, then we try to discover them. So the idea is to find people where they are -- whatever country, whatever platform you're on -- and go discover them there. And essentially, the algorithms and analytics are trying to analyze the content to look for high production value, which you can infer from the frames and images in the videos, for example. You're looking for strong storytelling and strong audience response, which you can infer from the kinds of comments that are elicited in response to these videos or even text stories. You can short-list a set of stories or storytellers for our creative team to look at.
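To make that discovery step concrete, here is a minimal sketch of what such a short-listing pass might look like. Jumpcut's actual models are not public, so the signals, weights, and names below are illustrative assumptions, not its implementation: score each short film on a crude story-focused engagement proxy plus a reach proxy, then hand only the top candidates to a human creative team.

```python
# Illustrative sketch only; all names, signals, and weights are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ShortFilm:
    creator: str
    views: int
    likes: int
    comments: list = field(default_factory=list)

# Crude stand-in for "strong audience response": comments that praise the
# story itself rather than, say, the music or the thumbnail.
STORY_WORDS = {"story", "plot", "twist", "characters", "writing"}

def story_engagement(film: ShortFilm) -> float:
    # Fraction of comments that mention story craft.
    if not film.comments:
        return 0.0
    hits = sum(any(w in c.lower() for w in STORY_WORDS) for c in film.comments)
    return hits / len(film.comments)

def score(film: ShortFilm) -> float:
    # Blend raw reach (likes per view) with story-focused engagement.
    reach = film.likes / max(film.views, 1)
    return 0.4 * reach + 0.6 * story_engagement(film)

def short_list(films, top_n=10):
    # The algorithm narrows the field; humans review the survivors.
    return sorted(films, key=score, reverse=True)[:top_n]

films = [
    ShortFilm("newcomer_a", views=12_000, likes=900,
              comments=["That plot twist!", "Great writing", "nice music"]),
    ShortFilm("newcomer_b", views=50_000, likes=1_200,
              comments=["cool thumbnail", "what's the song?"]),
]
for f in short_list(films, top_n=1):
    print(f.creator, round(score(f), 3))
```

In a real pipeline, the production-value signal would presumably come from analyzing the frames and images themselves, as Kartik notes, rather than from engagement counts alone.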
So it's not AI making the decision, but AI making humans efficient, because if I had a creative team that had to scour through YouTube to find the needle in the haystack, I would need a massive team and several years to comb through 250,000 short films on YouTube.

SHERVIN KHODABANDEH: Or just my two kids.

KARTIK HOSANAGAR: [Laughs.] Well, that's true. That works, too. Except I think they will just send me Minecraft videos or something like that.

SHERVIN KHODABANDEH: Given how many hours of YouTube they watch. But I digress. Sorry, go ahead.

SAM RANSBOTHAM: Let's interrupt there. I mean, that's the opposite of the Moneyball problem that Shervin mentioned. Because with Moneyball, you had the people, and it's about figuring out which one was good. You're more about expanding that search space.

KARTIK HOSANAGAR: Before we can say, "Among the stories, which is the one to bet on?" -- before we can do that, we want to know, "Is the pool of people and stories we're looking at the right pool? Or is that the complete pool?" If we don't expand the pool, we are not solving the inclusion problem. We work with agents. We work with established writers as well. But in addition, we're actively seeking out new people and not waiting for them to get discovered by an agent who then forwards them to us, but instead [we're] finding them where they are.

SAM RANSBOTHAM: So that was one. I'm sorry. You were going on with two and three here.

KARTIK HOSANAGAR: Yeah, yeah. And the second is, again, how do we de-risk them? Once we find somebody, we ask them, "Hey, do you have a story for us?" And they don't have one -- they have 15. And then we hear them or we read them, and we get excited about a few. We want to figure out, how do we bring in some objectivity to this? And that's where data comes in.
And some of it is backward looking, which is a classic machine learning kind of approach: "Let's look at data on what's been doing well." That could be what movies are doing well but also what kinds of stories are trending. It could even be search queries -- looking at where the cultural zeitgeist is, where people are going. And [we try to] understand which of the stories that we have are stories that we think audiences will respond to. But that's not everything, because I think as long as we are backward looking and we're looking at what's worked, it is a fundamentally conservative approach, because we'll do more and more of what's worked in the past.

SAM RANSBOTHAM: It's the sequel problem.

KARTIK HOSANAGAR: The sequel problem. We'll be stuck there. So the other question that we are trying to solve for is, "How do we take a bet on something that's never been done before, where there is no historical success?" That's where we bring in ideas from A/B testing and experimentation. We interact with lots of audiences online, where we pitch stories, and we have multiple stories compete with each other and see which ones people are gravitating toward. We pair the classic machine learning based on past data with digital experimentation. And we're running these experiments and then figuring out not just which stories we like but also, sometimes, answers to specific questions -- this is all very hypothesis driven. And it actually also creates very interesting breakthroughs. Like, one of the shows we're working on -- it's with a very senior writer, in this instance -- he had an interesting sci-fi story, and it's a very high-budget show. He came to us saying, "If I have to sell this, it has to be the case that this is a four-quadrant show [one that appeals to men and women, both under and over 25]. Because of the budget, it can't be that this has a niche audience. Can your data prove that this is a four-quadrant show?" And we said, "OK, let's test it out."
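A hypothetical sketch of what "testing it out" could look like under the hood. Jumpcut's methodology isn't public, and the numbers and helper below are illustrative assumptions; the idea is simply the classic two-proportion test behind many online A/B experiments: show two pitches to comparable audiences and ask whether the difference in response rates is larger than chance.

```python
# Hypothetical sketch of the "digital experimentation" step: two story
# pitches shown to matched online audiences, compared on response rate.
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: both pitches convert at the same rate."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p = (success_a + success_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))     # standard error
    return (p_a - p_b) / se

# Illustrative numbers: Pitch A drew 480 responses from 5,000 viewers;
# Pitch B drew 395 from 5,000.
z = two_proportion_z(480, 5000, 395, 5000)
print(f"z = {z:.2f}")   # |z| > 1.96 -> significant at the 5% level
if abs(z) > 1.96:
    print("Audiences are gravitating toward one pitch.")
```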
And we tested it, and it tested off the charts. I went back to him and I said, "It's almost a four-quadrant show. It's testing off the charts -- Gen Z, Gen X, millennials. Doesn't matter what the age group; they're responding." Sci-fi, drama, all these different audiences: People who are into sci-fi are responding, but also people who are not into sci-fi or [are] into drama -- they're responding. People in the U.S. are responding. People outside, in international markets, are responding. So all that's great, but women aren't responding. And I said, "Well, this is what's going on." And his first reaction was, "Oh, we can't go with that kind of data to the buyers, because that hurts the show." And that was our initial reaction, which was to be defensive about it: Can we hide that data? Can we not show it, and so on? And then, as we were talking, it was like, "Why are women not responding? What's going wrong?" We have this show; there are three main characters in it. There's a woman in there. There are two male characters. To simplify it, I'll just say there's the good guy, there's the bad guy, and then there's the woman in the show. As we talked about it, we realized, "Look, the woman in the show doesn't seem to have enough agency. She has no motivation. The female character is serving the motivations of the male characters." And he realized that he had been approaching it with this mindset and hadn't thought hard about what is driving her. And then he reimagined the female character. I'm getting into the weeds, but now the female character is genetically modified -- a CRISPR kind of character -- and she's got superpowers or special powers and so on. And suddenly, when we tested the new version, women were responding to it. Now the idea improved. This isn't like soulless data telling you that you need to insert a chase sequence on page 13 to increase the audience. This is hypotheses -- asking, "What is my story missing?" -- and really improving creative decisions [that are] very much human led but data informed.
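The segment-level read-out Kartik describes could be as simple as the following sketch -- the numbers, segments, and function name are hypothetical assumptions, not Jumpcut's code: given response counts per audience segment from a pitch test, compute each segment's response rate and flag any segment lagging well behind the overall rate.

```python
# Hypothetical sketch of the segment read-out that surfaced the
# "women aren't responding" finding; all figures are illustrative.
def flag_weak_segments(results, gap=0.25):
    """results: {segment: (responders, shown)}. Flag segments whose
    response rate falls more than `gap` (relative) below the overall rate."""
    shown = sum(n for _, n in results.values())
    responders = sum(r for r, _ in results.values())
    overall = responders / shown
    return [seg for seg, (r, n) in results.items()
            if r / n < overall * (1 - gap)]

results = {
    "gen_z": (310, 2000), "gen_x": (295, 2000),
    "sci_fi_fans": (330, 2000), "non_sci_fi": (280, 2000),
    "women": (140, 2000),
}
print(flag_weak_segments(results))   # -> ['women']
```

On numbers like these, only the lagging segment is flagged; the output is a prompt for the creative question -- "Why are women not responding?" -- not an automated fix.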
SHERVIN KHODABANDEH: This is quite exciting. What I particularly like about it, Kartik, is you're breaking into the last standing bastion of judgment-driven decision-making, right? I mean, if you think about 20 years ago, [to] the loan officer making a judgment on who's going to get the loan, and then fast-forward to 10 years ago, where it's the retailer [who] decides what price he or she should charge, or how he should stock up the shelves -- all of those industries have been completely revolutionized with data and analytics, and they're actually making data-driven decisions, and they're doing test-and-learn, and they course-correct. And we're seeing a lot of that in entertainment, but still a lot of the studios are very much judgment driven. And I think this is very interesting, because it's the beginning of the beginning for this industry. So hats off to you for doing that.

KARTIK HOSANAGAR: Well, yeah, thanks for saying that. I mean, I'm super excited to apply data to a setting where people are the most skeptical of "Should data be used here?" While we are finding value in this, I will also say that Hollywood has more than enough skeptics with regard to whether data should play a role. And sometimes there is [even this] misconception of what data is. Sometimes it is, "Oh, there's an AI system making all the decisions, and I no longer have a role to play." I think there's a lot of room for subjectivity, but "How can data be a useful tool?" is how we're looking at it. And hopefully in five, 10 years, we can establish a track record of what data can do. And that might change the mindset here in the whole industry. I did see a lot of that reaction, saying, "Oh, data in creative has no role." But I saw another kind of reaction, and that was interesting.
I had some people in the industry saying, "This is inevitable, and Netflix is already starting some of this stuff. We are forced by Netflix to get into this. So we have to start doing it, and we may as well partner with people like you to understand this game." In fact, one of them told me -- and this is quite simple, intuitive -- this is a producer who told me that when his show came out on Netflix, within days -- first, within a day or two -- Netflix was able to give him feedback on how they thought his entire first season would play out. By the end of Month 1, they were already in conversations about Season 2, because Netflix was able to project out what things would look like. Apparently, he was also given information about which characters audiences were most interested in. They wanted to kill one of the characters early in Season 2, and they were told, "No, you cannot do this. That's your main character." So this producer was telling me, "Yes, I mean, Netflix is already pushing us to do some of this stuff." The difference is, Netflix's data comes in after the show is released, and Jumpcut is trying to bring in data before the show is created. But I think some of them see this as being inevitable because of companies like Netflix and are happy to embrace our approach.

SAM RANSBOTHAM: I was a bit surprised, because obviously your book from a couple of years ago was about how algorithms, I think in your words, are shaping our lives. But this seems to be one area where algorithms are really not shaping our lives. So how are you getting this constituency of people to pay attention to data and algorithms when they historically have not? You're not just selling a story; you're also having to sell an approach.

KARTIK HOSANAGAR: So in the book, which is called A Human's Guide to Machine Intelligence, I documented several ways in which AI is shaping our lives, as Sam was just mentioning, and included some examples from the movie world.
So, for example, I talk about how, on Netflix, algorithms are driving a lot of our consumption. There was a paper by data scientists at Netflix that said something like 80% of viewing hours on Netflix originated from algorithmic recommendations. So it's pretty clear it is shaping our lives, even in a setting like the movies. But while it's shaping our lives -- certainly in terms of how we consume content -- how content is created, that side of the business, hasn't really evolved. So the supply side still looks the same, even though the demand side has been completely reshaped by the market.

SHERVIN KHODABANDEH: To me, there [are] a lot of analogs in what you're doing here to the evolution of data-driven decision-making -- not replacing [the] human, but basically creating a smarter, more effective, or more efficient human. And there is exactly the same thing in retail and in personalization and in marketing. Like I said, it feels to me that entertainment and media has been sort of the last bastion of it. I think, Kartik, as you said, a lot of it is risk rather than anything else. And I think in another book of yours, you alluded to the chicken-and-egg problem: If you don't have data on the talent, how do you make those decisions without some level of experimentation and some level of data? And then, of course, there are disruptors like Netflix who are forcing everybody to become more data driven as a way of surviving. I wonder whether studios could survive 10 years from now without making a major step that way.

KARTIK HOSANAGAR: Right. I would think that the time has come. In fact, maybe the time was yesterday for something like this, which is of course why I went and started Jumpcut. I think it's inevitable, because we know that human judgment has its big share of problems. Obviously, human judgment is also great in many ways, but we have our biases; we're colored by them in ways we don't realize.
Having a tool that can free us from those biases -- I think it's a no-brainer that we should be embracing it. And yeah, the time has come to do this in fields that were unexpected. I think sports, again, is not a place where I would have guessed there would be early adopters, but clearly there have been, and they've shown that the Moneyball kind of approach works. What you really need is to integrate data into decision-making along the way. You have a deep understanding of data, to know when to lean on it but also when to question it; you have a strong creative point of view; and you bring the two together. So you've got to almost create it [from the] ground up, which is why we said we're not a data insights vendor. We're a company that's creating really a new kind of business that brings data and creative together.

SHERVIN KHODABANDEH: You have to really close the loop, because otherwise you have the problem that I think a lot of early adopters of data-driven decision-making fell into, which is they said, "Well, data is our future competitive advantage. Let's acquire as much of it as possible and put it in some Hadoop cluster somewhere," only to find they can't do anything with it because they haven't thought it through --

SAM RANSBOTHAM: But they've got it, they've got it.

SHERVIN KHODABANDEH: They have the data, right? And there are any number of companies that have done just that -- 10 years ago, 15 years ago, even doing it now. They're like, "We'll get all the data. Once we have it all, we'll figure out what to do with it." But as you're saying, sometimes it's about doing so much more with the data you already have by connecting it to the business process or the creative process and connecting the endpoints.

SAM RANSBOTHAM: Kartik, we really enjoyed both your background and how you're applying it to your perspective on algorithms, and the difference that algorithms can make in terms of exploring our search space I think is fascinating.
But also those parts two and three were pretty fascinating too, about how to integrate that with creative. We appreciate you taking the time to talk with us.

KARTIK HOSANAGAR: My pleasure. Thank you for having me.

SAM RANSBOTHAM: In our next episode, we speak with Sarah Karthigan about how she's helping ExxonMobil use AI for self-healing process improvement across business units. Please join us.

ALLISON RYDER: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn't start and stop with this podcast. That's why we've created a group on LinkedIn specifically for leaders like you. It's called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting mitsmr.com/AIforLeaders. We'll put that link in the show notes, and we hope to see you there.